Four ways we ignore children when discussing digital inclusion

Two teenage girls against a wall with multiple surveillance cameras pointing at them; the girls look back at the cameras. Image by StockSnap from Pixabay

(5 min read)

Children are an afterthought in our digital inclusion plans.

We talk about the importance of embedding diversity, inclusion, and ethics in technology as a prerequisite for a digital future that works for everybody. The conversation is framed in the context of identities – gender, ethnicity, sexual preferences, culture. However, we have forgotten children. I’m talking about children’s data privacy and their vulnerability to tech tools, especially those powered by artificial intelligence (AI).

In this article, I share four areas where we’re letting children down and how framing data as money can help us proactively include them.

Children’s digital discrimination

Key areas where technology fosters children’s disenfranchisement and manipulation:

1.- Databases are missing children’s data – I recently attended a meeting hosted by the Office for National Statistics (UK) summarizing the key takeaways from an Online Inclusive Data Consultation. For background, I had represented We and I – an NGO focused on the benefits and challenges of artificial intelligence for the UK public – in that consultation.

A key insight from that meeting was that children are among the groups missing from the data, and that these absences are considered critical data gaps:

“Children are another group that many identified as missing from the data. Where we do have data for them, this is often collected from people other than children themselves and therefore children’s own voices may not be heard. The Nuffield Foundation has identified a number of critical gaps in the data on children. This includes a lack of information on all areas of life for looked-after children as well as under-representation of children who have experienced abuse or neglect in early childhood and a lack of information on their outcomes.”

2.- Children’s perception of data – Hattusia – an organization helping to embed ethics into tech – is exploring why and how public conversations about personal data don’t work. Their research on metaphors is thought-provoking: they focus on how the way we talk about data shapes how we understand it. And children have a special place in their executive summary:

“Looking at children’s policy papers and discussions about data in Parliament since 2010, we identified three metaphor groups most commonly used to describe data and its properties:

  • Liquid/fluid: data can both flow and leak, just like a liquid or lakes.
  • Resource/fuel: data can be mined; can be raw; data is ‘the new oil’ or ‘fuel for the economy’.
  • Body/residue: data leaves a trace, like footprints. Our data is something that needs protecting.”

“When looking into how children are framed in data policy we found that they are most commonly represented as criminals or victims, or simply missing in the discussion. […] The language […] is alienating and dehumanises children into data points for the purpose of predicting criminal behaviour or to attempt to protect them from online harm. The voices of the children themselves are left out of the conversation entirely.”

3.- Tech as an enabler of children’s disempowerment – Children’s data and behaviour are increasingly at tech’s mercy.

(a) Children are constantly courted by social media companies that downplay, ignore, or simply declare themselves powerless to cope with the unintended consequences of the content they promote and that fosters user addiction (Facebook whistleblower Frances Haugen’s testimony to Congress, the pausing of the Instagram Kids app, eating-disorder videos reaching teens on TikTok).

(b) Children’s photos and videos are shared on social media by their loved ones, who may not be aware that those images are treated as “content” that can be reused and sold by third parties. GDPR and other privacy provisions are meant to regulate the exchange of personal data between the owner of that data and those who receive it. What happens to privacy rights when the person sharing the data has custody of the little human to whom the data belongs?

4.- Digital policing of children – Recently, some schools in the UK started using facial recognition to speed up lunch queues. We were told that 97% of children or their parents authorized the measure. How well were they informed about the implications of their consent? Moreover, the biometrics company refused to disclose who else children’s personal information could be shared with.

Notably, the scheme was rolled out in one of the most deprived council areas in Scotland, following the pattern seen worldwide of deploying this kind of AI surveillance tool in disadvantaged areas and using those communities as involuntary early adopters.

The rollout was suspended following concerns from parents and data ethics experts. However, children are increasingly targeted by artificial intelligence in school. They are monitored during exams via webcams that infer from their movements whether they are cheating, and their facial expressions and drawings are entered into mathematical models that claim to assess their mood. Anybody with a dataset and a piece of code can now become a teacher, a neuroscientist, or a psychologist.

Money as a proxy

“Data is the new oil”

Clive Humby, UK mathematician and architect of Tesco’s Clubcard, 2006

What to do? Isolate children from the internet whilst they see their parents addicted to their smartphones? Give them access and use parental controls to lower the risk? Limit the number of hours they spend on social media? And what about the digital choices adults make for them about their personal data?

When I was about 5, my parents started to give me a very small amount of money each Sunday. It was nice to have “my money” to spend on small treats. In the long term, what was more important was its value as an educational tool. Through that weekly ritual and the conversations and experiences it prompted, I learned whom I could trust in money matters, when to save money and when to spend it, and – very importantly – that just because somebody asked for my money didn’t mean I should give it to them. It also allowed me to exercise decision-making, responsibility, and agency.

The process was neither painless nor fast. It involved a lot of friction for my parents, teachers, and carers, who had to cope with my never-ending list of “why”, “why not”, and “how” questions. Of course, they also had to learn how to articulate the responses themselves.

This got me thinking: if we truly believe that data is the new oil, there are useful lessons we can transfer from how we teach children about money to the digital space.

For example, we don’t expect children to learn about money (social media) on their own; we know it’s paramount that they understand how money (the internet) works as members of a capitalist economy; and we don’t assume that, because we are their guardians, we can break their piggy bank and give away their money (private data) with impunity.

What if we accepted responsibility for assessing whether we’re entitled to share our child’s photo on Instagram, to authorize their biometric data to be captured by the school, or to allow an algorithm to diagnose their mood based on their drawings? What if we proactively discussed with children their data privacy rights, the role of social media in fostering divisions, and how the internet can be simultaneously a force for good and for evil?

I’ll finish with a provocation: caring about children’s digital protection is by no means an act of charity but rather self-preservation. Children are experiencing now what will likely be the digital future for most of us.

How are you looking after your digital future?

