
Tech, A.I & Kids: A Dangerous Mix?

by Adrien Book, June 9th, 2018



Algorithms are now at the center of an online realm that’s changing the way experts think about human development

A few months ago, early Facebook investor Sean Parker dubbed himself a ‘conscientious objector’ to technology, arguing that “God only knows what it’s doing to our children’s brains”. Despite the hypocrisy of his statement, this is indeed a rarely touched-upon matter: how does modern tech affect kids in the long term? How do the two interact? Is it healthy? Science will need decades to produce even the beginning of a definitive answer, but in the meantime, some insights are possible. But first, a little game: it’s 7 P.M. Do you know what apps your children are using?

Is it YouTube? If so, read on.


Because YouTube is the ultimate 21st-century nanny, many creators on the platform have sought to monetise the fact that millions of parents leave their children in front of kid-friendly cartoons every day. This, logically, led to the creation of A.I.-generated videos, designed to scale and industrialise cartoon revenues. One example of this type of video just sort of… happened, free of any human input. Though it is far from threatening, it has frightening implications: an A.I. has no goal, no ethos, no pathos, and no sense of right and wrong; we have no idea how this odd mix could affect the young ones in the long run. Add to this uncertainty the knowledge that Google is creating AI that can build AI, and we reach Dan Simmons levels of weird.

Some may then argue that YouTube Kids is the solution to the issues discussed above. Yet it, too, is partly automated, and as such is not a guaranteed refuge from inappropriate videos despite its kid-friendly design. One troubling example of the type of video that can be found on the platform features a surreal mix of content: Captain America dancing in a Candyland-style environment, the Incredible Hulk catching a crashing plane while a nursery rhyme plays, and Spider-Man and Frozen‘s Elsa engaged in a shoot-out with The Joker. All of it created with the aim of making the video more attractive from an algorithmic perspective. Case in point: some of the videos on YouTube Kids have titles that don’t reflect their content. More often than not, they are simply lists of common search keywords such as “learn colors” and “nursery rhymes”.

Even without taking the potentially disturbing content into account, we must realise that algorithms are now at the center of a digital world that’s changing the way experts think about human development. There isn’t a human handpicking the best videos for toddlers to watch, and, much like Facebook’s, YouTube’s algorithm aims to make viewers obsessed (think Pavlov). Could this alter children’s cognitive development one way or another? The jury is still out, but it’s worth a thought given how much time the wee ones spend on the phone.

Some worrying trends are also appearing on Snapchat, as the app’s privacy makes creeps tough to track. Ill-intentioned adults know that they can reach kids on Snapchat, since it is the popular app for that age group (though it won’t be for long) and since few parents monitor it nowadays. They use it both to contact kids and to send inappropriate images to each other, as these are much harder for authorities to trace than they would be via email or chatrooms. This also applies, to a much lesser extent, to Twitter and Facebook.

Because of these recent developments, researchers are busy tracking and analysing every aspect of the web’s effects on kids’ social behaviours, mental health and even physiological development. This has led to some great publications on the effects of cyber-bullying, revenge porn and trolling, while law enforcement and various governmental bodies study the rise of cyber-savvy pedophiles and criminals, as well as the depression epidemic. Everywhere one turns, grown-ups are volubly voicing their anxieties about the smartphone generation (better late than never, right?). Thanks to various breakthroughs in this field, parents now know that they have to work incredibly hard to protect their kids’ imaginations from predatory, addictive websites that want to sell things to them — or sell them to advertisers.

Yet, there is a limit to how much parenting is needed in these situations. The role of a parent is not that of a security drone, assessing every move (drone parenting = helicopter parenting but with more collateral damage). Parents have to lay the groundwork, starting conversations about what’s real and what’s not, what’s dangerous and what’s safe. The youths will have to do the rest. Walking home from school or in their rooms at night, when they have escaped the teacher and aren’t under their parents’ control, is when they go onto digital platforms to try things out with friends, hang out, experiment, and be independent, free to create their own selves in a limitless world.

The logic behind paranoid parenting is both simple and understandable: the latest generations must be exceptionally protected because we live in exceptional times. This has led to outrageous rules and headlines, and a loss of freedom for an entire generation. Children are rarely allowed to use tools, are often told they can’t play outside without the presence of an adult, and they certainly can’t be expected to use the internet unsupervised.

Yet, not allowing a certain level of freedom will lead (and has led) to the creation of a fragile generation. This is why we have “safe spaces” in schools and millennials missing adult milestones today. An entire generation of kids was told that they could never be too safe — and they believed it. By trying to keep children safe from all risks, obstacles, hurt feelings, and fears, our culture has taken away the opportunities they need to become successful adults. In treating them as fragile — emotionally, socially, and physically — society actually makes them so, despite regular proof that children are capable of immense courage and strength. If kids don’t learn to wobble, they never learn to walk; they end up standing still.


This article was originally written for The Pourquoi Pas, an online magazine providing in-depth analyses of today’s technological challenges. Click here to access it.