
Algorithms at work: what we can learn from the A-level results fiasco

Andrew Pakes · 21 August 2020

The fiasco around the calculation of A-level results earlier this month clearly has huge implications for the young people involved, but it also marks an important moment where the public have got their first real taste of what ‘rage against the algorithm’ might look like.

For those young people now contemplating what their future holds, the bad news is that whatever happens, these algorithms are set to play a role in their working lives for years to come.

It matters for all of us. The future of work is one in which data will determine how we are managed, what types of jobs there will be and how we are rewarded, either through pay, benefits or promotion. The question is whether we can start to tame these beasts and avoid the kinds of shambles we have seen over exam results.

Algorithms are hardly a new phenomenon, but as they become more mainstream and are hardwired into the fabric of our economy and society it is essential that we all become more literate in their use. The world of work was already changing before the pandemic. But the pace of that change has accelerated during COVID-19, with growing interest in work-based surveillance technology and monitoring software to help manage remote working.

For those of us with an interest in economic justice, data is becoming a fault line in inequality, separating those who have power over technology from the rest of us. It means that the same trust and accountability that have been demanded in relation to the exams fiasco and government use of algorithms must also apply when employers deploy new technology or automated tools.

For trade unionists in particular, questions of algorithms and bias will be our meat and drink over the decades to come – almost equivalent to the role that health and safety played in the previous century.

Algorithms are already being used in recruitment, in performance reviews and elsewhere in the workplace, too often with disastrous consequences.

There is already evidence that workers are either unaware of, or excluded from, information and decision-making about workplace surveillance. Our own research at Prospect shows that most workers are unsure what data their employers currently collect about them. We have to start educating ourselves, and fast.

The essential issue with any algorithm is that it reflects the biases of the assumptions and data it is built on. The A-level algorithm, for example, penalised students in larger class sizes; it didn't take a genius to work out that the result would be state school students being marked down while private school students were marked up.
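To make the mechanism concrete, here is a minimal sketch in Python of the kind of logic that was widely reported, not Ofqual's actual model: small cohorts keep their teacher-assessed grades, while larger cohorts are forced to fit their school's historical grade distribution, whatever the individual students achieved. All names, thresholds and numbers below are invented for illustration.

```python
# A deliberately simplified sketch of the reported standardisation logic.
# Not Ofqual's model: function names, thresholds and data are invented.

def standardise(teacher_grades, historical_distribution, small_cohort=5):
    """Assign grades to a cohort ranked best student first.

    teacher_grades: teacher-assessed grades, best student first.
    historical_distribution: (grade, share) pairs for the school's past
        results, highest grade first, e.g. [("A", 0.2), ("B", 0.8)].
    """
    n = len(teacher_grades)
    if n <= small_cohort:
        # Small class: individual teacher judgement survives.
        return list(teacher_grades)

    # Large class: grades are dealt out from the school's past results,
    # so this year's students inherit last year's distribution.
    grades = []
    for grade, share in historical_distribution:
        grades.extend([grade] * round(share * n))
    lowest = historical_distribution[-1][0]
    grades += [lowest] * (n - len(grades))  # pad any rounding shortfall
    return grades[:n]

# A historically weak school: 10% A, 30% B, 60% C.
weak_school = [("A", 0.1), ("B", 0.3), ("C", 0.6)]
teacher_grades = ["A"] * 10 + ["B"] * 10 + ["C"] * 10
print(standardise(teacher_grades, weak_school))
# Only 3 of the 10 predicted As survive, however good this cohort really is.
```

Run over a class of thirty, the individual disappears: the school's past, not the student's work, sets the grade.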

Algorithms at work

In a workplace context, the obvious example is recruitment algorithms that reflect the racial biases of those who design them and therefore only offer interviews to white applicants. Algorithms are also only as good as the data that is fed into them; if you put garbage in, you get garbage out.
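The same point can be made in a few lines of code. In the toy example below (data and logic entirely invented), "training" a screening model on past decisions amounts to memorising the historical interview rate for each group, which is effectively what a real model will do if it can see the group or a proxy for it, such as a postcode or a school name.

```python
# Garbage in, garbage out: a screener trained on biased history
# reproduces the bias. All data and field names are invented.

from collections import defaultdict

# Historical decisions: identical qualifications, different outcomes.
history = [
    {"group": "A", "qualified": True, "interviewed": True},
    {"group": "A", "qualified": True, "interviewed": True},
    {"group": "B", "qualified": True, "interviewed": False},
    {"group": "B", "qualified": True, "interviewed": False},
]

# "Training" = memorising the past interview rate per group.
rates = defaultdict(list)
for record in history:
    rates[record["group"]].append(record["interviewed"])

def screen(candidate):
    past = rates[candidate["group"]]
    return sum(past) / len(past) >= 0.5  # interview if past rate >= 50%

print(screen({"group": "A", "qualified": True}))  # True
print(screen({"group": "B", "qualified": True}))  # False: bias reproduced
```

No one had to write a discriminatory rule; the model inferred it from the data it was given.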

We need to start thinking of data as part of our civil and economic rights. At its worst, as the exams fiasco showed, it is about unaccountable power leading to discrimination and injustice. But too often, especially at work, data is assessed in terms of business risks, not people.

One challenge is that the foundation block for GDPR and our current data rights is individual privacy. Whilst this is important, it is insufficient on its own to tackle systemic bias or problems such as those highlighted by the A-level results. We need to develop a collective approach to our data rights alongside individual privacy. This is even more important given the contractual relationship at the heart of employment. Inequality and discrimination are about structures as well as individuals.

So, what do unions need to do to prepare ourselves for the age of the algorithm?

  1. We need to make better use of existing legal tools to test and scrutinise surveillance technologies. GDPR says that all new uses of our data should be subject to scrutiny by workers/unions. But how often does that consultation happen? We need to use GDPR and tools such as Data Protection Impact Assessments (DPIAs) to let the sun shine in on how employers are using our data.
  2. We need to build our knowledge on what new data, technology and automated processes are coming down the track, and to equip union reps to engage on the issue.
  3. We need to make data part of our bargaining agenda, ensuring transparency and making algorithmic decisions and data collection everyday matters for collective bargaining.
  4. We need to go further and faster on exploring ways that workers can collect and utilise their own data, individually and collectively, so they can argue back against unjust algorithms.
  5. We need to campaign for improved workplace rights around transparency over the collection and use of employee data, and to establish new rules, such as on the Right to Disconnect and challenging the always-on work culture.

Prospect is working on these issues and helping to bring union activists together in this debate.

For example, we are working with UNI Global Union and a consortium of experts led by Christina Colclough at the Why Not Lab to test and develop new approaches that give workers more access to, and control over, their own data. And we are learning from partner unions across the world about best practice when it comes to building data rights into collective agreements.

The rage against the algorithm we have seen over the summer only highlights the necessity of this work. This generation of students is the workforce of tomorrow; together we can make sure that their first experience of the injustice of algorithms does not become the norm as they enter the world of work.

Andrew Pakes is Prospect’s research director
