Algorithms are undermining efforts to eradicate racism
One and a half years ago, the #BlackLivesMatter protests over the terrible killing of George Floyd shone a spotlight on racism. Last year, in a parliamentary debate to mark Black History Month, I spoke of the possibility of eradicating racism.
“I look forward to a day when parents will explain racism to their children in the same way that they now explain hanging, drawing and quartering: as a barbaric practice of our past... We want a world in which success is open to all, and Black History Month can help to achieve that by remembering all our history in colour and making racism history.”
A year on, I am not nearly so optimistic. Racial justice seems once again to be a niche interest. And rather than being eradicated, racism is being entrenched – by algorithms.
An algorithm, like the ones used to calculate grades in last year’s A-level results, is just a set of instructions that acts on data entered in a particular format to make a decision. Increasingly, algorithms control our lives: telling us what to buy and when to go to sleep; deciding who to hire and fire, who to show political messages to, and who to grant a visa.
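As a sketch of how simple such a decision-maker can be, consider a toy grading rule in the spirit of last year’s A-level algorithm, which notoriously weighted a school’s historical results. The weights, cutoffs and function name below are invented for illustration, not the actual formula used:

```python
# A hypothetical, deliberately simplistic grading algorithm:
# structured data in, a decision out. Weights and cutoffs are invented.
def predict_grade(teacher_estimate: float, school_historic_avg: float) -> str:
    score = 0.4 * teacher_estimate + 0.6 * school_historic_avg
    if score >= 80:
        return "A"
    if score >= 60:
        return "B"
    return "C"

# The same strong student gets different grades depending on data
# that describes the school, not the student.
print(predict_grade(teacher_estimate=90, school_historic_avg=85))  # A
print(predict_grade(teacher_estimate=90, school_historic_avg=35))  # C
```

A few lines of arithmetic, yet the second student’s result is driven by where they studied rather than how they performed.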
Critically, algorithms are only as good as their design and the data on which they are trained. Software engineers tend to come from a very narrow demographic – few are women, from ethnic minorities, or from working-class backgrounds. Algorithms reflect the limitations of their designers.
And the data algorithms are trained on too often has limitations of its own. Six years ago, Google’s photo-recognition algorithm identified Black people as gorillas because it had been trained almost exclusively on images of white people, while Facebook’s ad-delivery system has been shown to be biased against Black people and women in both recruitment and real estate ads.
Design rules, oversight and accountability can help protect against such biased outcomes, but right now there are neither regulatory requirements nor business incentives to put them in place.
Racism is not an equation; it is a lived reality. But the Sewell report commissioned by the government proposes that the solution to algorithmic bias is to “define fairness mathematically”. Fairness is a social and political problem first and foremost. Treating it as a mathematical problem that can be coded out of existence means the investment in people, resources and regulation necessary to eradicate it will never be forthcoming.
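To see why “defining fairness mathematically” is narrower than it sounds, take one common mathematical definition: demographic parity, which compares only the rate of positive decisions across groups. The sketch below (data and names invented for illustration) shows how little that single number captures:

```python
# Demographic parity: compare the rate of positive decisions
# (e.g. job interviews offered) across two groups. Data is invented.
def selection_rate(decisions):
    return sum(decisions) / len(decisions)

group_a = [1, 0, 1, 0]  # decisions for applicants in group A
group_b = [0, 1, 0, 1]  # decisions for applicants in group B

parity_gap = abs(selection_rate(group_a) - selection_rate(group_b))
print(parity_gap)  # 0.0 -- "fair" by this one metric
```

A gap of zero satisfies this definition, yet it says nothing about whether any individual decision was right, and other mathematical definitions of fairness (such as equalised error rates) can be violated at the same time – indeed, several such definitions are known to be mutually incompatible in general. The maths cannot tell you which trade-off a just society should make.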
That lack of understanding extends to the Online Safety Bill, currently before Parliament in draft form. The previous secretary of state for digital often said he wanted what is illegal offline to also be illegal online. There are at least four problems with that.
First, you cannot separate the world into online and offline as simply as Oliver Dowden believed. Second, online drives offline – public acts of racism may have their roots in online material promoted by algorithms, and racist far-right “pile-ons”, like those experienced by many of England’s Black footballers, are promoted to other racists by platform algorithms.
Third, some things are not illegal offline but their harm is industrialised and therefore amplified online. And finally, why limit ourselves to the failings of the offline world rather than seek to create a better world online, which can drive a better one offline?
The Online Safety Bill’s definition of harm is vague, mostly left to the platforms, and does not consider algorithms at all. The bill will be out of date before it is even law.
Tech, I once believed, was neutral. That is one of the reasons I wanted to become an engineer and spent more than two decades in tech. People could call me names and deny my qualifications, but the code I wrote either worked or it didn’t; the signal processing I designed did the job or it didn’t. It was bias-free, unlike the people around me. Unfortunately, it appears that is no longer the case. And this government hasn’t even begun to deal with it.
Chi Onwurah is the Labour MP for Newcastle upon Tyne Central and shadow digital, science and technology minister.