We must end the use of biased algorithms to process visa applications with no transparency


Cost-cutting algorithms can industrialise prejudice – their use for governmental decision making without any regulation should concern us all, writes Chi Onwurah


Last month, it was revealed that the Home Office has been secretly using algorithms to process visa applications. This is yet another example of the government using technology not to improve services and empower people, but to cut costs and exclude them.

According to the Home Office, the use of algorithms in visa processing is part of an efficiency drive. Algorithms are not being used to improve the quality of decision making, but to make up for a lack of resources within the department – a lack which has been compounded by the pressures of increased Brexit bureaucracy.

On 16 July, the APPG for Africa, together with the APPG for Malawi and the APPG for Diaspora, Development & Migration, will publish our report into the visa problems faced by African visitors to the UK, who are refused visas at twice the rate of applicants from other parts of the world. We have found that the quality and fairness of decision making are directly affected by the persistent under-resourcing of entry clearance sections, where entry clearance officers are set individual targets of up to 60 case decisions a day.

While I am a champion of the many ways in which technology can improve our lives, the Home Office visa processing system is broken, and cannot be fixed by technology on its own. These flaws are compounded by the use of algorithmic decision making, especially at such an early stage in its development.

Algorithms are not neutral decision makers; they reflect their design and the data they are trained on. Those limitations often mirror the narrow demographic from which many software engineers are drawn – as when an image-recognition algorithm, trained overwhelmingly on white faces, labelled black people as gorillas.
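
The point is easy to demonstrate. The toy sketch below, in which every number is invented for illustration, builds a model from one group’s data: it is near-perfect on that group and no better than a coin flip on a group it has never seen.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training" population: a single narrow demographic, clustered near 0,
# where membership of class 1 happens to track the first feature.
train = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))
labels = (train[:, 0] > 0).astype(int)

# A deliberately simple learned model: per-class centroids of that data.
centroids = np.array([train[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x: np.ndarray) -> np.ndarray:
    """Label each row by its nearest training centroid."""
    dists = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

print((predict(train) == labels).mean())  # ~1.0 on the group it was built from

# A group the model never saw, clustered near 5, where the relevant
# pattern is different: accuracy collapses to roughly a coin flip.
unseen = rng.normal(loc=5.0, scale=1.0, size=(1000, 2))
unseen_labels = (unseen[:, 1] > 5).astype(int)
print((predict(unseen) == unseen_labels).mean())  # ~0.5
```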

By automating decision making, algorithms industrialise bias. In the Home Office, visa processing algorithms stream applicants according to their supposed level of risk. The government has claimed that the algorithms do not discriminate on the basis of race, but Home Office guidance for assessing ‘genuine visitors’ allows consideration of nationality and of immigration compliance statistics from the applicant’s geographical region – often proxies for race.
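
The Home Office has disclosed none of its streaming logic, so the sketch below is purely illustrative: every name, weight and threshold is hypothetical. It shows how easily a streaming rule built on regional compliance statistics turns nationality into the deciding factor.

```python
# Purely illustrative, not the Home Office system. The point: once regional
# compliance statistics feed the score, region of origin alone can move an
# applicant between streams.

# Hypothetical historical refusal rates by region (a proxy for nationality)
REGIONAL_RISK = {
    "region_a": 0.05,
    "region_b": 0.40,  # a region with historically high refusal rates
}

def stream_applicant(region: str, documents_complete: bool) -> str:
    """Assign an applicant to a red/amber/green risk stream."""
    score = REGIONAL_RISK.get(region, 0.20)
    if not documents_complete:
        score += 0.10
    if score >= 0.35:
        return "red"    # intensive scrutiny, refusal far more likely
    if score >= 0.15:
        return "amber"
    return "green"      # light-touch processing

# Two otherwise identical applications diverge purely on region of origin:
print(stream_applicant("region_a", documents_complete=True))  # green
print(stream_applicant("region_b", documents_complete=True))  # red
```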

This government has consistently refused to put in place a regulatory framework that reflects technology’s potential for harm as well as good. Now, the Home Office has refused to give any details of the visa processing algorithm, the streaming process, or how risk levels are determined, making it impossible to scrutinise their role in decision making.

It is already known that automated decision making in the Home Office is biased in other areas. Residency checks in the EU settlement scheme use a person’s Department for Work and Pensions footprint to establish residency, but fail to consider working tax credit, child tax credit or child benefit. These are all benefits more likely to be received by women, so the checks are likely to discriminate against women – particularly the most vulnerable, who may have no physical documents.
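
A hypothetical sketch shows the mechanism; the record names below are invented, not the scheme’s actual data model. If only some record types count as evidence, a person whose entire footprint consists of the omitted benefits appears to have no residency at all.

```python
# A hypothetical illustration of the gap described above.

COUNTED_RECORDS = {"paye_employment", "jobseekers_allowance", "state_pension"}
# Omitted from the check: "working_tax_credit", "child_tax_credit",
# "child_benefit" - benefits disproportionately received by women.

def years_of_evidence(records: list[tuple[str, int]]) -> int:
    """Count the distinct years backed by a recognised record type."""
    return len({year for kind, year in records if kind in COUNTED_RECORDS})

# Five genuine years of residence, evidenced only by child benefit:
history = [("child_benefit", year) for year in range(2014, 2019)]
print(years_of_evidence(history))  # 0 - the check finds no residency at all
```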

Algorithms already determine which adverts we are shown and what content we see. Without transparency, due process, or public consultation, the government has decided that they should also determine who is and isn’t allowed into the country. Given their ubiquity, the lack of a regulatory framework to protect us from algorithmic bias is highly concerning. The opaque use of biased algorithms by a government department responsible for life-changing decisions is even more so.

Labour would work with industry, local authorities, businesses, citizen groups and other stakeholders to introduce a digital Bill of Rights. This would give people ownership and control over their data and how it is used, and enable us to hold companies and government to account. Labour’s plans would protect us from the cost-cutting measures of a government that does not care to understand the consequences of its technological choices, and would allow proper transparency, scrutiny and regulation of its decisions.
 

The APPG report will be available on the APPG for Africa website from 16 July 2019. Chi Onwurah is Labour MP for Newcastle upon Tyne Central and chair of the APPG for Africa.
