Children are an afterthought in our digital inclusion plans.
We talk about the importance of embedding diversity, inclusion, and ethics in technology as a prerequisite for a digital future that works for everybody. The conversation is framed in the context of identities – gender, ethnicity, sexual orientation, culture. However, we have forgotten children. I’m talking about children’s data privacy and their vulnerability to tech tools, especially those powered by artificial intelligence (AI).
In this article, I share four areas where we’re letting children down and how framing data as money can help us proactively include them.
Interacting with tech products that reject me as a user or provide a subpar experience elicits two very different responses in me.
As a Head of Customer Service with 25+ years’ experience in scientific and engineering software, I’m well aware of the constraints imposed by a finite R&D team and an ever-growing list of customer enhancement requests and bugs to fix. It’s teams like mine that build those lists and provide feedback to the product team on their prioritization. Which features and fixes make it into code depends on a multitude of factors: the difficulty of implementing them, their alignment with the vision for the product, and their potential impact on the user experience and expectations. This last criterion is assessed using fictional user personas created by the product team as a representation of the ideal customer. The closer the requester of the feature is to one of the user personas, the higher the chances of implementation into the product. However, if the issue is considered an edge case – not representative of a substantial customer base – it will usually be rejected or postponed indefinitely. Every new feature and fix must demonstrate its ROI.
As a woman who accumulates several out-group identities – e.g. non-native English speaker, poor vision – I’m used to the frustrating feedback that my mediocre user experience is deceptively cataloged as an edge case. Why deceptively? The average tech …
Before I knew the term diversity and inclusion advocacy, I had already identified the need for it. I’m a woman, I studied STEM, I work in tech, and I’ve been an immigrant all my life. This intersection of out-group identities has often resulted in my being seen as the other. It has also prompted me to consciously endeavour to listen to and empower members of other out-groups.
However, a little more than a year ago, I realized that, unconsciously, I was silencing those other voices.