Family associations in France have called on the government to put an end to ‘scoring algorithms’ that assess families for benefits and other state aid, in an open letter to Prime Minister Gabriel Attal.
The letter, reported by Le Monde newspaper, calls for “humanness” to be brought back into the increasingly digital system used by the CAF (caisses d’allocations familiales) to run checks on families.
It was signed by associations including the collective Changer de cap and La Quadrature du Net, an association for the defence of digital freedoms.
The letter claims the algorithmic processes used are discriminatory, as they result in more checks on vulnerable groups, such as single parents, recipients of the disabled adult allowance, and low-income households.
The writers said that the system contributed to “institutional abuse” against these vulnerable parties, and caused “multiple material and psychological consequences”.
Instead, they called on the government to stop using such algorithms, to impose stricter control on government IT tools, and to provide greater public transparency about their use and workings.
The letter called for change in the CAFs and in other similar public bodies, such as the unemployment agency France Travail (formerly Pôle emploi).
‘Neither afraid nor ashamed’
Yet, in response, the CNAF’s (Caisse nationale des allocations familiales) director, Nicolas Grivel, said in an internal message in December: “We have nothing to be ashamed of in what we do [and I am] neither afraid nor ashamed of debate.”
In a hearing before the Senate on January 25, he denied that the algorithm specifically targeted certain groups of people. He said: “We do not target single-parent families, not at all.”
He said families in more fragile social positions had more inspections simply because they were more highly represented among benefit claimants.
However, Mr Grivel did not comment on suspicions that the algorithm targets some areas of the country more than others, and, according to Le Monde, did not reply to the newspaper’s requests in December last year that the CNAF reveal its methods for deciding who will be checked.
The CNAF has also been accused of neither testing for possible bias in the algorithm nor evaluating its results, despite having used the system for the past 10 years.