Who is using algorithms for child protection?

Several councils in England are using algorithms that draw on data from many sources within the council and, in some cases, from other agencies. This has caused controversy over privacy and the misuse of data.

Why are they doing it?

These algorithms are designed to identify children at risk of abuse. Some councils use another algorithm to identify children whom criminal gangs are likely to try to recruit. The systems can examine more data in minutes than humans could in days, enabling social workers to concentrate on high-risk cases with all the data they need to make sensible decisions.

What are the risks?
  • If councils pool so much data from diverse sources, a hacker could have a feast.
  • The data councils hold tends to concern poorer families, so there is a risk of overlooking at-risk children in better-off families. [That is a risk at present; the use of algorithms doesn’t affect it.]
  • There are questions about the legality and morality of using data for purposes other than those for which it was collected, although any data can be accessed in order to prevent or detect crime.

What are the two big risks?
  1. A council could take a child into care just because an algorithm said they should, perhaps wrongly stigmatising the parents.
  2. Social workers could become so reliant on the system that they ceased using their knowledge, training, experience or common sense.
Do those risks not outweigh the advantages?

Not necessarily.

  • The risk that exists at the moment is that a child can slip through the net through lack of resources, human error or failure to bring together information held by different departments or other bodies. All too often a child has died despite warning signs, because no one person saw all of them. Sometimes a doctor, a policeman and a teacher all had their suspicions, but knew nothing of each other’s concerns. Nobody told the social worker.
  • Some people rely on ticking boxes without thinking, even with manual systems. A human needs to interpret data, whether you are safeguarding children, investigating a crime or assessing an insurance claim. Used rightly, an algorithm should free social workers from repetitive tasks and allow them to think about the information produced. People, not computers, should take decisions.
What’s this got to do with your business?

Most of us use statistics, and perhaps the rest of us should. We can mislead, or be misled, unless we learn how to use them properly. If you haven’t time to go on a course, try reading my little book, aimed at ordinary people, not mathematicians.

How to avoid being misled by statistics