Thu, 18 April 2024

Education Revolution – why aren't we teaching AI in schools?

7 min read

It’s becoming ever clearer that AI will shape every part of our future much sooner than we could have predicted, from how we work and live to how we fight disease and wage war. So why aren’t we teaching it in schools? Justine Smith reports. Illustrations by Tracy Worrall

"AI is not just a teaching tool or a subject for future scientists,” says Conservative MP Stephen Metcalfe, co-chair of the All-Party Parliamentary Group on Artificial Intelligence. “This is the biggest disruption since the Industrial Revolution. In the same way that [it] changed every aspect of the way we live, AI will have the same impact from the moment we’re born until the moment we die. This is happening now, there is no time to lose.”

While higher education offers a proliferation of courses teaching, and committing to research in, machine learning, artificial intelligence, algorithms, neural networks, data mining and other digital skills, nothing currently in the school curriculum even scratches the surface.

In May, Labour peer Lord Knight of Weymouth, chief education adviser at digital education company TES Global Ltd, told peers that an A-level computer science student had told him that weekend he was learning about long-obsolete floppy disks but not AI. A group of 60 headteachers also wrote an open letter to the government saying schools were bewildered by the fast-moving technology, and did not trust big tech or the government to lead them out of the darkness.

“The United Kingdom is the third country after the US and China in terms of the development and deployment of AI,” Metcalfe added. “But we haven’t really looked at what we are doing in education compared to other countries and we really need to do that. There’s no point in children learning the kings and queens of England by rote because they have the history of the world at their fingertips in their smartphones, where they are in contact with algorithms and AI on a minute-by-minute basis. They need to learn to interrogate, analyse, and validate that data.

“We don’t just need to change what we teach them, we also need to fundamentally change how we teach them.”

Last month Education Secretary Gillian Keegan announced a Digital and Computing Skills Education Taskforce to address skills gaps, alongside new A-level equivalents in STEM subjects, including digital business services, which will be available from the new academic year in September.

Unveiling the plans at the 10th London Tech Week, she said: “Digital skills matter. As tech accelerates, they’re likely to become as important to a person’s employability as English and maths, eventually being on a level pegging with those two core subjects.”

However, experts say this does not go nearly far enough in building digital skills into all core subjects from a much earlier age – nor does the government say how it will find the expertise to teach them.

Philip Colligan, CEO of the Raspberry Pi Foundation, a Cambridge-based charity which provides free resources and programmes to teachers and children, says it takes 200 hours to prepare a teacher properly.

“I don’t think any education system anywhere in the world is really getting ahead of the challenge of how do we equip young people for the AI future,” he said. “Pretty much every computer science lesson and lesson about AI will for many years be taught by a teacher that doesn’t have a qualification in that subject. It’s not part of teacher training and very few tech graduates go into teaching. This is a massive challenge. How do we support non-qualified teachers in what is a massive and rapidly-changing subject?”

In 2013, Oxford University economics professor Carl Benedikt Frey co-wrote a seminal study on the future of work, using machine learning to predict that 47 per cent of today's jobs were at risk of being displaced by automation.

A decade later he says we are still on course for that but doing very little to equip our future workforce or economy for the new jobs it will create.

“I have seen no evidence that the UK government has a clear plan to prepare young people for the near future,” he warned.

“The first part of the work that needs doing, and urgently, is understanding what skills are needed for the future; the second part is how those skills are provided.”

So far the Department for Education has focused on teaching younger children coding which is relatively straightforward – but has this been a waste of valuable time and resources? After all, only a few jobs will require it and OpenAI’s large language model ChatGPT is quickly perfecting the art.

“This illustrates the problem with jumping on trendy bandwagons and failing to see the bigger picture,” says Frey. “We have a tendency to teach very tangible, very practical skills and we should teach more foundational and generic skills.

“Our study found three areas where machines cannot do the job of a human. One relates to creativity, the ability to come up with novel ideas and artefacts; the second relates to complex social interactions such as managing teams, motivating people, negotiating deals, persuading people; the third was the perception and manipulation of irregular objects and the ability to navigate complex environments. We don’t have any complex robots that can clean our homes. They would not be able to, for example, identify a piece of waste paper from an important document.”

He said children should be honing essay writing and debating skills to learn to distil and crystallise vast amounts of information, and called for recognition of the value of creativity throughout the curriculum.

“We now live in a data-driven world so having skills for statistics and econometrics and machine learning is also going to be crucial,” he added. “Not everybody is going to be an expert in those domains but it’s good to know what AI can or cannot reliably do because we are all going to work with it in one way or another.”

More importantly, perhaps, how do we prepare the next generation to take on the burden of moral and practical responsibility for this terrifying and unpredictable technological trajectory?

In recent months, we have been warned that we are sleepwalking into every dystopian sci-fi film plot: mass unemployment, machines outsmarting us and asserting their will, terrorists using the technology to create new viruses and weapons, and catastrophic malfunctions.

Sam Altman, CEO of OpenAI, has called for a new discipline, "the science of human guidance", to help us co-pilot the technology safely.

Anil Seth, professor of cognitive and computational neuroscience at Sussex University, said: “I worry that the focus on this click-bait existential AI risk is diverting our attention away from the clear and present dangers: misinformation, disinformation, eroding our trust in everything and anything. A working society depends on us all having some sort of shared reality based on the same information.

“There’s already unintentional AI bias because of how existing systems have been designed and trained, leading to all sorts of things that are problematic right here and right now. This highlights a need for ethical education for AI developers to mitigate these problems in the future.”

But he would not join Tesla entrepreneur Elon Musk and others in calling for a global pause on development.

“The splitting of the atom brought nuclear power which I think is a good thing, but it also brought the atom bomb,” he said. “There is always a dual use to new technologies. We can use AI and machine learning to mitigate the real existential crises we face: another pandemic, climate change, nuclear accidents. We just need to ensure we don’t create a new one in the process. This is where we should be guiding and inspiring our children.”

Over at Raspberry Pi, Colligan is already seeing young people apply their newly acquired AI skills to safeguard their own future.

“We run lots of open competitions and challenges for kids and you get a sense of what really matters to them,” he said. “Climate change and biodiversity universally come out on top. They know that’s a problem they are going to have to solve as adults. You see time and again kids using technology to sort these problems out.”

“We can help kids from all backgrounds learn that this isn’t magic done by wizards in Silicon Valley, this is something you can use to solve problems in your life or to get brilliant, high-paid jobs, to build businesses that will change the world.”
