Global Comment

Worldwide voices on arts and culture

Kiddle and other kid-safe internet tools don’t protect children

A screenshot of a Kiddle search for rape.

Online safety for children is a hot topic among parents who are negotiating a world where their kids will never know a life without the internet. How do we protect children from dangerous or unsuitable ideas, provide them with age-appropriate information, and ensure they can use the web in a way that empowers them?

These are questions that web developers attempted to address by creating the search engine Kiddle.

While not an official Google product, Kiddle uses Google search technology, customised to deliver results suitable for children. Using Google's Safe Search, Kiddle aims to provide "family friendly" results, filtering out sites with "explicit or deceptive content". The first three results for any search are generally sites written specifically for children, results 4-7 are written simply enough for kids to understand, and results 8 onwards are "safe, famous sites" that may be harder for children to understand but have passed through the Safe Search filter.

Kiddle also makes sure that the thumbnails and font are large, to cater for children, and the site does not retain any personally identifiable information.

All of this sounds wonderful, right? A cute, robot design and some well-thought-out criteria for search results that are suitable for children and young people can only be a good thing in the deep, dark, dangerous world of the web.

But what if the site is “protecting” children from information they need?

Like many gay people who grew up pre-internet, I envy young people for the amount of information on sexuality that they seem to have at their fingertips. That is not to say that today’s young LGBT people have an easy time – far from it – but as information is made available online, the answers to some of the questions I longed to ask someone can now often be taken for granted.

But young LGBT people are in trouble. 40% have considered suicide, 50% have self-harmed, and 20% have suffered physical assaults at school. In this context, services aimed at children and young people need to be proactive in making good, reliable information and support easy to access.

Kiddle was quickly put to this test, and it initially failed.

As Andrew Griffin discovered, and many more people went on to test, searches on Kiddle for LGBT-related keywords were met with the somewhat patronising response, “Oops, looks like your query contained some bad words. Please try again!”.

Dr Jill McDevitt went on to test a range of keywords related to reporting rape, understanding intimate partner violence, and questions about menstruation, each of which returned the same ‘bad words’ response.

These are search queries that need to deliver vitally important information for children and young people, and there are carefully written resources from sites like Scarleteen and Brook that take into account the young age of their readers. Children searching for information on what to do if their boyfriend hits them, or how to prevent STIs, really, really need the answers to their questions.

Stigmatising children who need help is the worst possible form of support

Telling them, instead, that “gay” is a “bad word” does nothing to resolve their problems or reduce the stigma that LGBT people face.

Kiddle, to its credit, responded to the furore with a change. New searches for LGBT-related search terms began to elicit the following response:

“You have entered an LGBT related search query. Please realize that while Kiddle has nothing against the LGBT community, it’s hard to guarantee the safety of all the search results for such queries. We recommend that you talk to your parent or guardian about such topics.”

While this change was probably well-intentioned, it also missed the point. Many young people who are wondering about issues of sexuality or sexual health go to the internet precisely because they can’t talk to their parents or guardians about their questions or concerns.

A similar problem has been seen with the automatic blocking of ‘mature’ websites by mobile phone providers in the UK. Any adult with a new phone or contract has to prove that they are over 18 to gain full access to the internet on their device, and a reported 90% of adults do so. This is not because we are a nation of porn addicts (though that may also be true); it is because the range of blocked websites goes far beyond those depicting explicit sex.

Feminist and LGBT sites are amongst those most commonly blocked by phone companies. Other categories that must be blocked, by law, from under-18s (and therefore from everyone until they verify their age) include:

  • Suicide, self-harm, pro-anorexia and eating disorder content
  • Discriminatory language
  • Encouragement of drug use
  • Repeated or aggressive use of “cunt”
  • Pornography
  • Violence and gore

With businesses and educational institutions blocking access to these kinds of websites, too, and public WiFi services often doing the same, the Scunthorpe problem comes up repeatedly, highlighting the ridiculousness of some of the filters we are faced with. As ISPs come under increasing pressure by the government to block illegal content, they also block what some would consider to be simply unsavoury.
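To see why the Scunthorpe problem keeps recurring, consider what the crudest kind of filter actually does: it blocks any text that merely *contains* a banned substring, which snares innocent place names along with the words it is aimed at. The sketch below is purely illustrative — the blocklist and both functions are hypothetical, not the code any real filter uses:

```python
import re

# Illustrative blocklist (not Kiddle's or any ISP's actual list).
BLOCKLIST = {"cunt", "sex"}


def substring_filter(text: str) -> bool:
    """Naive filter: block if any banned term appears anywhere as a substring."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)


def word_filter(text: str) -> bool:
    """Slightly better: block only when a banned term appears as a whole word."""
    words = re.findall(r"[a-z]+", text.lower())
    return any(bad in words for bad in BLOCKLIST)


# The naive filter wrongly blocks the town of Scunthorpe and the
# county of Essex; the word-level filter lets both through.
print(substring_filter("Scunthorpe council"))  # True  (over-blocked)
print(word_filter("Scunthorpe council"))       # False (allowed)
```

Even the word-level version cannot tell harmful content apart from a help-seeking query like “how to report rape” — which is precisely the failure this article describes: keyword matching has no notion of context or intent.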

Should taste be an arbiter of what we are allowed to see?

And should it be an arbiter of what children are allowed to see?

Kiddle has responded again to calls by LGBT activists and has apparently relaxed its rules on LGBT-related content.

Search results for queries relating to trans, gay, lesbian and bisexual terminology now display relevant, age-appropriate results on the first page, and the world has not yet crumbled.

Other sensitive searches, such as “how to report rape”, still return no results, but the ‘bad word’ error message has been replaced by a simple “Oops, try again!”.

This is not adequate, but it reduces some of the stigma of the previous cutesy retort.

As long as children are being abused, search engines aimed at them that refuse to answer any questions about abuse are not doing their job. The internet defies borders and often defies the law, but it needs to be relied upon to provide relevant information, where possible, to the children and adults who need it.

I still ask Google for help on matters from health to politics, and children should be able to ask Kiddle for the same. And while Kiddle has improved its LGBT-related results considerably, it still needs to weigh the dangers of abuse-related content against its potential benefits with equal care.
