Statistics chief tells MPs that public bodies show a ‘lack of confidence’ in dealing with businesses such as Google
Public bodies such as NHS trusts should be more assertive when dealing with businesses seeking data with which to train decision-making AI systems, MPs have been told.
Hetan Shah, executive director of the Royal Statistical Society, told a House of Commons Science and Technology Committee session on algorithms last week that custodians of public sector data often show a “lack of confidence” when dealing with companies seeking access to their datasets, despite the “extraordinary value” of the asset.
“The public sector needs to recognise that it has the power in these negotiations and should exercise it,” Shah said.
As an example, he suggested that the Royal Free NHS Foundation Trust had been “seduced by the algorithm” when it was asked to provide details of 1.6 million patients for a project with Google’s DeepMind subsidiary. The resulting deal earned the trust a rap over the knuckles from the Information Commissioner’s Office last July.
The committee was collecting evidence on decision-making algorithms amid increasing concern that people’s lives can be fundamentally affected by decisions made by “black box” processes. It is likely to recommend new regulatory mechanisms to ensure that systems are transparent and decisions are open to challenge.
The General Data Protection Regulation, to be implemented in UK law under the Data Protection Bill currently going through the House of Lords, will give individuals the right to opt out of having “significant” automated decisions made about them. But Dr Sandra Wachter of the Oxford Internet Institute told the committee that this right will be impossible to enforce unless people also have a right to know how a decision was reached.
Bad actors
One option would be to require public agencies to publish the source code of algorithms used to make decisions about citizens. Shah told the committee that New York City is legislating for such a requirement, and Dr Adrian Weller of the Alan Turing Institute added that Bulgaria already has such a law.
However, in its evidence to the committee, Google strongly opposed the idea, saying it would help “bad actors” such as hackers and people attempting to game the system.
Another controversial area is whether algorithms should be regulated by existing watchdogs or by a new over-arching regulator. Professor Nick Jennings of Imperial College London said the job should be handled by existing specialist regulatory bodies, while Professor Louise Amoore of Durham University favoured a “sector-agnostic” approach.
Witnesses agreed on the need to maintain public trust, and that citizens have what Wachter described as “a right to explanation”. Shah likened the potential row to that over genetically modified foods.
“There is a risk that… if we get the trust wrong, the democratic accountability wrong, we will lose this licence to operate,” he said.