Sat, 4 December 2021


We urgently need to tighten data protection laws to protect children from facial recognition in schools


A little over two weeks ago, the news broke that facial recognition software in cashless payment systems – piloted in a Gateshead school last summer – had been adopted in nine Ayrshire schools. It is now clear that this software is being widely adopted on both sides of the border, with 27 schools already using it in England and another 35 or so in the pipeline.

Its use has been “temporarily paused” by North Ayrshire council after objections from privacy campaigners and an intervention from the Information Commissioner’s Office. But it is extraordinary to use children’s biometric data for this purpose when so many other alternatives to cashless payment are available.


It is clear from the surveys and evidence gathered by the Ada Lovelace Institute, whose Ryder Review of the governance of biometric data is ongoing, that the public already has strong concerns about the use of this technology. But we seem to be conditioning society to accept biometric and surveillance technologies in areas that have nothing to do with national security or crime prevention and detection.

The Department for Education (DfE) issued guidance in 2018 on the Protection of Freedoms Act 2012, which makes provision for the protection of children’s biometric information in schools and for the rights of parents and children as regards participation. Yet the DfE appears to hold no data on the use of biometrics in schools, and there appear to be no compliance mechanisms to ensure that schools observe the Act.

There is also the broader question, under the General Data Protection Regulation (GDPR), as to whether biometrics can be used at all, given the age groups involved. The digital rights group Defend Digital Me contends that “no consent can be freely given when the power imbalance with the authority is such that it makes it hard to refuse”. It seems that children as young as 14 may have been asked for their consent.

The Scottish First Minister, despite saying that “facial recognition technologies in schools don’t appear to me to be proportionate or necessary”, went on to say only that schools should carry out a privacy impact assessment and consult pupils and parents.

But this does not go far enough. We should be firmly drawing a line against this use: it is totally disproportionate and unnecessary. In some jurisdictions, including New York, France and Sweden, its use in schools has already been banned or severely limited.

This is, however, a particularly worrying example of the way public authorities are combining the use of biometric data with AI systems without proper regard for ethical principles.

Despite the R (Bridges) v Chief Constable of South Wales Police & Information Commissioner case (2020), the Home Office and the police have driven ahead with the adoption of live facial recognition technology. But as the Ada Lovelace Institute and Big Brother Watch have urged – and the Commons Science and Technology Committee in 2019 recommended – there should be a voluntary pause on the sale and use of live facial recognition technology to allow public engagement and consultation to take place.

In its response to the Select Committee’s call, the government insisted that there is already a comprehensive legal framework in place, which it is taking measures to improve. Given the increasing danger of damage to public trust, the government should rethink this complacent response.

The capture of biometric data and the use of live facial recognition (LFR) in schools is a highly sensitive area. We should not be using children as guinea pigs.

I hope the Information Commissioner’s Office (ICO) will complete its report as a matter of urgency. But we also need to tighten our data protection laws to ensure that children under the age of 18 are much more comprehensively protected from the use of facial recognition technology than they are at present.


Lord Clement-Jones is a Liberal Democrat peer and former chair of the Lords Artificial Intelligence Select Committee.

