Protecting children from online sexual exploitation and abuse

Posted On: 
3rd September 2018

Social media has given incredible freedom to millions of people worldwide. But with this freedom comes danger, especially for young and vulnerable people.

Recent news stories have highlighted some of the perils of chatting online, particularly the risk of child sexual exploitation and abuse.

That’s why PA has been working with members of the WePROTECT Global Alliance and industry partners to develop new ways to protect children online. Leading a group of companies with a range of expertise, we explored new ways of educating children to spot online dangers.

In the same way that banks and credit card companies alert customers when they suspect fraudulent transactions, we found we can use artificial intelligence (AI) to send children automated alerts when dangers are detected in online chats.

One of the big questions in using technology this way has been how to make alerts consistent and effective without causing undue alarm or being ignored altogether.

To overcome this, we devised five principles that ensure chat providers offer real-time, relevant risk information.

Our principles focus on educating children about the dangers. That means they won’t just protect vulnerable people online, they’ll help a whole generation develop digital resilience.

1. Instant

An alert is only useful if you see it, and in today's busy digital environment, every screen element is vying for your attention.

To combat this, alerts should be shown in situ as soon as the risk is detected. In practice, this could be a chat bubble appearing in the conversation, or a confirmation button appearing before an image is sent.
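To make the idea concrete, here is a minimal sketch of what "in situ, as soon as the risk is detected" could look like in code. The classifier, threshold and alert wording below are all illustrative assumptions, not any provider's real implementation:

```python
from typing import Optional

RISK_THRESHOLD = 0.8  # assumed cut-off for showing an in-situ alert


def score_message(text: str) -> float:
    """Stand-in for an AI risk classifier; returns a risk score in [0, 1].

    A real system would call a trained model here; this toy version just
    flags phrases commonly associated with grooming behaviour.
    """
    red_flags = ("keep this secret", "don't tell", "send a photo")
    return 1.0 if any(flag in text.lower() for flag in red_flags) else 0.0


def check_message(text: str) -> Optional[str]:
    """Return an in-situ chat-bubble alert if the message looks risky."""
    if score_message(text) >= RISK_THRESHOLD:
        return ("This message asks you to keep things secret. "
                "You don't have to reply. Tap here to learn why we flagged it.")
    return None  # no alert: the chat continues uninterrupted
```

Run on each incoming message, a check like this lets the alert appear in the conversation at the moment the risk arises, rather than in a settings page the child may never visit.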

2. Specific

People quickly become immune to general warnings, alerts and notifications. That can be more harmful than no warning at all, because users come to believe they have more protection than they do. So, alerts must be specific to the user, their situation and their current chat.

This means all alerts should briefly outline the risk and only be shown to the child or vulnerable person.

3. Relevant

Cybersecurity awareness has taught us all to be wary of anything that doesn’t fit into the look and feel of a page, so alerts must be as engaging as the chat itself to avoid being ignored.

They should be in the same style as the chat so the severity of the alert is interpreted correctly and there’s no suggestion that someone else is watching. This also avoids alert fatigue, which could make children ignore the service.

4. Private

Technology that keeps people safe must be trusted. To build that trust, alerts must trust children to do the right thing by educating them and giving them the option to make their own decision.

The alerts also need to be kept separate from reports to law enforcement. Alerts don’t excuse a chat provider from the legal reporting requirements in its country; they add a layer of online safety on top.

5. Supportive

Alerts are an educational tool that should help change behaviour. They should, therefore, give the information needed to make an informed decision about what to do next.

However, online risk isn’t simple and it’s unlikely any alert will offer enough information, so links should be included where a child can find out more or connect with people who can help.

If a child is left with questions, or unsure why the alert appeared, one of two things happens: the person who triggered the warning fills that gap, or the child ignores the alert completely.

Our principles in practice

We've tested these principles in our Concept Demonstrator, honing the guidance we give to chat providers so we can keep as many children and vulnerable people safe as possible.

When companies follow our alert principles, children will be able to associate specific actions with specific risks, rather than a general concern about being online, helping them learn.

Children will also have access to an impactful, real-time online safety tool. It’s not enough for companies to hide online safety information in terms and conditions or a corporate website.

And children won’t be influenced by the person putting them in danger, empowering them to make their own decisions.