Hackney Council pays £360k to data firm whose software profiles troubled families
Hackney Council is refusing to release details about a profiling system which flags at-risk families in the borough to social workers.
The Town Hall has also confirmed that people whose data is captured through its services are not told it is being used for this purpose.
The council has paid a total of £361,400 to private data analytics company Xantura since 2015, and says it is trialling the firm’s Early Help Profiling System (EHPS).
It is not yet clear whether all of the payments made to the company over the past three years, listed on the council’s website as either “consultancy fees” or “computer licences”, relate solely to this system.
The EHPS is designed to aid social workers by alerting them to families in the borough who may need extra support, with the aim of preventing tragedies before they occur.
According to the council, as the EHPS is still in its trial phase, alerts have not prompted action by children’s services, and the system is “running alongside its usual work”.
A spokesperson said the predictive indicators used to profile families are being kept secret at the request of Xantura, which argues the information is commercially sensitive.
But Alexandra Runswick, director of campaign group Unlock Democracy, said: “Commercial sensitivity should not override democratic accountability.
“Private companies like Xantura – that knowingly get into the business of local democracy – should expect to be open about how software is being used for targeting.
“We know that human biases can be baked into profiling software design. This is a challenge that some of the largest companies in the world haven’t been able to iron out.
“Just last week Amazon scrapped its AI recruitment tool because it had an in-built bias towards male applicants. That gives all the more reason for software to be scrutinised by the public.
“Hackney Council’s failure to consult on the project, and now its refusal to share information on targeting, could undermine public trust.
“If people’s data is being used then they have a right to know how and why. What we are seeing in Hackney is symptomatic of a wider trend across the country of local democracy being whittled away.
“Councils are too cash-strapped to run consultations, and are cutting down to bare-bones services after a decade of hard-hitting austerity, all at the expense of a vibrant and inclusive local democracy.”
According to a 2015 response to a freedom of information request about the service, the council does not tell people whose details are processed by the system that their data is being used in this way, on the grounds that doing so could prejudice the process.
The Guardian reported in September on the piloting of the model by Hackney and Thurrock Councils, obtaining details of Thurrock’s system, which included indicators such as domestic abuse, youth offending and truancy.
According to the Guardian’s coverage, the software has already flagged 350 families in Hackney as potentially in need of attention.
Wajid Shafiq, Xantura’s chief executive, was quoted as saying the system aims to strike a balance between protecting the vulnerable and the rights of the many, to avoid “letting down people who are vulnerable”.
Details of the methodology behind Xantura’s system can also be found on the company’s website, with a number of blogposts detailing the benefits, scope and basic operation of the software.
According to policy network Apolitical, which lists Hackney’s interim assistant director of children and young people’s services Steve Liddicott as one of its users, 80 per cent of the alerts generated by Xantura’s system have been deemed valid.
Liddicott is quoted as saying: “You actually don’t have to prevent that many children from going into care to make quite a significant saving, given that the costs per child per annum of a child coming into care are in the order of 50-odd thousand pounds depending on the nature of the placement that they are in.”
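On that figure of roughly £50,000 per child per year, preventing just three year-long care placements would save about £150,000, roughly two-fifths of the £361,400 the council has paid Xantura since 2015.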
The system was originally designed to flag the twenty most at-risk families with whom social workers were not currently working, assigning each a risk score and summarising for children’s services why they had been flagged for action.
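Xantura has not published its methodology, but the top-twenty design described above can be illustrated with a minimal sketch. Everything in it, from the indicator names to the weights and the scoring function, is a hypothetical assumption for illustration rather than the company’s actual model:

    # Illustrative sketch only: the indicator names, weights and scoring
    # below are hypothetical assumptions, not Xantura's published method.
    from dataclasses import dataclass

    # Assumed weights for indicators of the kind reported in Thurrock's model.
    HYPOTHETICAL_WEIGHTS = {
        "domestic_abuse": 0.5,
        "youth_offending": 0.3,
        "school_absence": 0.2,
    }

    @dataclass
    class Family:
        family_id: str
        indicators: dict          # indicator name -> severity in [0, 1]
        open_case: bool = False   # already being worked with by social workers?

    def risk_score(family: Family) -> float:
        # Weighted sum of indicator severities; a real system would use a
        # trained statistical model rather than fixed weights.
        return sum(HYPOTHETICAL_WEIGHTS.get(name, 0.0) * value
                   for name, value in family.indicators.items())

    def flag_top_families(families, n=20):
        # Mirror the reported design: rank only families with no current
        # social-work involvement and return the n highest-scoring, with
        # the indicators that drove each score as the "reasons" summary.
        candidates = [f for f in families if not f.open_case]
        ranked = sorted(candidates, key=risk_score, reverse=True)[:n]
        return [{
            "family_id": f.family_id,
            "score": round(risk_score(f), 3),
            "reasons": sorted(f.indicators,
                              key=lambda name: HYPOTHETICAL_WEIGHTS.get(name, 0.0)
                                               * f.indicators[name],
                              reverse=True),
        } for f in ranked]

    if __name__ == "__main__":
        sample = [
            Family("A1", {"domestic_abuse": 0.8, "school_absence": 0.6}),
            Family("B2", {"youth_offending": 0.9}, open_case=True),  # excluded
            Family("C3", {"school_absence": 0.3}),
        ]
        for alert in flag_top_families(sample, n=2):
            print(alert)  # A1 then C3; B2 is skipped as an open case

Run on the sample data, this returns, for each flagged family, an identifier, a score and a ranked list of reasons: the structure of the alert the article describes.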
In a response to the Guardian’s coverage, Shafiq subsequently stated: “The challenge is how to use these technologies to drive data sharing that is open, proportionate and compliant with data protection legislation.
“In response to these challenges, we are working with our clients to develop an approach that only shares data about families if they are showing signs of distress and where there is already an agency or professional working with the family.”
Housing, police and school records have previously been reported as potential sources of information for the alerts, though it is not known which, if any, of these are used in Hackney Council’s model.
The software is also a potential source of revenue for the Town Hall, with Xantura advertising on its website that its tools can maximise “payments by results” under the government’s Troubled Families programme, which offers £1,000 to local authorities for every family referred.
A spokesperson for Hackney Council said: “We have been trialling this software and want to complete that trial and analyse results before sharing details and making a decision about its potential uses.
“Alerts do not prompt any action – we’re running it alongside our usual work, because we’re interested to see the opportunities it could provide.
“The reason for not sharing information about the data sets is due to a request from the provider relating to commercial sensitivity.
“No decisions are being made by this software. It suggests which children may be vulnerable and in need of additional support.
“The social worker receiving that information decides whether or not any additional support should be offered.
“We have been trialling the software to allow us to understand whether or not there are benefits which we’d wish to further explore.
“We are extremely mindful and alert to the potential for systems such as the one that we are trialling to be subject to bias and it is one of the things that we are considering in our reflections on the trial.”