Our research into predictive algorithms reveals a hidden revolution taking place in the UK’s public services.

Councils and police forces are beginning to exploit the vast seams of data at their disposal to target their services with computer models.

There is very little oversight of this new field, with neither central nor local government providing lists of where these systems are being introduced, how they are being used and how citizen data is being shared.

As a result, the finding that 14 police forces and 53 councils are using predictive algorithms advances our knowledge on this topic a great deal.

Still, it leaves crucial questions unanswered.

What exactly are councils and police forces predicting? And, when they do make predictions, what does that look like on the ground?

The answer to the first question is simple: public services are predicting anything they can.

Our research, conducted with Cardiff University’s Data Justice Lab, found predictive algorithms being used for applications as various as traffic management, housing benefit and identifying children at risk of neglect or abuse.

The second question is much more complicated. Aside from the men and women who run these systems – and who usually operate under a veil of commercial confidentiality – few people have seen them in the wild.

Very often, the people affected don’t even know their life has been touched by an algorithm.

So, for the last two-and-a-half months, my producer Valerie Hamill and I have been criss-crossing the country, trying to find an answer.

As we went, we encountered one all-pervasive myth and one largely hidden truth.

First, the myth: predictive algorithms are like Minority Report.

Despite having been released well over a decade ago, this single Tom Cruise film still totally shapes popular understanding of predictive algorithms.

I don’t want to say that every single person we met mentioned it… but it was pretty close.

Happily (or sadly, depending on your point of view), the reality is a long way from pre-crime.

The algorithms being employed in councils and police forces don’t, for the most part, use artificial intelligence, and they aren’t followed up automatically. They flag suggestions to council workers, who then choose whether or not to take the recommendation.

That doesn’t mean they’re not extremely powerful, as personal experience will show.

Ever bought something on Amazon, based on its recommendation about what you might like? Or friended someone on Facebook after it suggested a person you might know? Or clicked on a recommended video on YouTube?

In that case, you’ve been influenced by a predictive algorithm.

This often invisible power is why it’s so important to build extensive safeguards into automated systems, to ensure that there are humans available to spot and clear up issues and talk anyone affected through their problems.

That's doubly true when the data is derived from the incomplete record of the past, with all its flaws and false assumptions.

The council officials and police officers we spoke to were extremely conscious of this danger. Without fail, they were friendly, open and deeply concerned with citizens’ well-being.

I felt, in a way I almost never do when speaking to big tech companies, that they were alive to the risks of their own technology.

Yet, unlike big tech companies, public services have no money, and equally little room for manoeuvre. At a time when they need to be most precise and careful, they are stretched close to – and in some cases beyond – breaking point.

They are introducing programmes on the promise that they will cut costs, when in fact they should be spending more to smooth the transition to the new regime.

Which brings us to the hidden truth about predictive algorithms: they are less a sign of new technology, more a signal of the next phase of austerity.

This came through loud and clear at every place we visited.

Time and again, council officials and police officers explained that their most urgent need was to stretch shrinking resources. With cuts of more than 40% since 2010, they are down to the bone. The potential of algorithms to cut costs is just too alluring to resist.

An advocate of these technologies might say that austerity is giving public services the push they need to move into the twenty-first century. And there are definitely benefits to be gained from using council data more effectively.

But, without proper safeguards, these systems risk exacerbating existing hardship, by subjecting the poorest and most vulnerable to a system of discipline without transparency, accountability, or, most importantly, humanity.

In his report on poverty in the UK, Philip Alston, the UN’s rapporteur on extreme poverty and human rights, warned that, unnoticed, the government was creating a digital welfare state.

If my experience in the last two months is anything to go by, it would be more accurate to say it is automating austerity.

(c) Sky News 2019: Predictive algorithms: Hidden revolution taking place in UK’s councils and police forces