Science and Technology Committee calls for Centre for Data Ethics and Innovation to play an active role in providing safeguards for algorithms used in decision making
A group of MPs have expressed disquiet about the dangers of bias in algorithms that could be used by the public sector for decision making.
Parliament’s Science and Technology Committee has published a report on the issue that highlights the concerns, but also says the newly implemented General Data Protection Regulation (GDPR) should provide some protections for individuals.
It points to areas in which algorithms can make a positive contribution by drawing on a much wider range of data than people can absorb, such as in the NHS and policing. But it also reiterates the widely expressed concern that they can reproduce prevailing biases against some groups.
Algorithms give different weights to different pieces of information, and this can reflect the bias of the people and institutions that create the programs. There have already been protests that this has worked unfairly against black people in the criminal justice systems of some US states.
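The mechanism is easy to illustrate in general terms. The sketch below (a hypothetical, simplified scoring model in Python, invented for illustration and not drawn from the report or any real system) shows how the weights a developer chooses, or learns from historically skewed data, carry straight through to the decisions the algorithm makes.

```python
# Hypothetical illustration: a simple weighted scoring model.
# The features and weights below are invented for demonstration;
# they do not come from the committee's report or any real system.

# Feature weights chosen (or learned) by the model's developers.
WEIGHTS = {
    "prior_arrests": 0.6,   # heavily weighted
    "employment": -0.3,     # reduces the risk score
    "postcode_risk": 0.4,   # a proxy that can correlate with ethnicity
}

def risk_score(person: dict) -> float:
    """Combine a person's features into one score using fixed weights."""
    return sum(WEIGHTS[k] * person.get(k, 0.0) for k in WEIGHTS)

# Two otherwise identical people receive different scores purely
# because of the postcode proxy feature.
a = {"prior_arrests": 1, "employment": 1, "postcode_risk": 0.9}
b = {"prior_arrests": 1, "employment": 1, "postcode_risk": 0.1}

print(risk_score(a))  # 0.66 - higher score driven by the postcode proxy
print(risk_score(b))  # 0.34
```

In a model like this, a proxy feature such as postcode can reintroduce a bias the developers never intended, which is precisely the kind of unjustified correlation the committee wants the new centre to examine.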
The report says the Centre for Data Ethics and Innovation should examine biases in algorithms, looking at how to prevent the use of unjustified correlations and how to improve the ‘training data’ used. It adds that algorithm development teams should include a sufficiently wide cross-section of society, or of the groups likely to be affected by an algorithm.
The new body should also evaluate accountability tools, such as audits and certification of algorithm developers, and advise on how they could be embedded in the public and private sectors.
Optimistic note
The report also calls for transparency in the accountability of algorithms, and strikes a note of optimism in saying the GDPR will provide helpful protections for people whose data is used. These will come largely through more explicit requirements for consent from individuals and data protection impact assessments by organisations.
However, there is still some uncertainty about how the regulation will be interpreted, and there will be a need for a framework that encourages innovation in the area while maintaining public trust.
Other recommendations include making public sector datasets available to big data and algorithm developers through new ‘data trusts’, which could make better use of the data to improve service delivery.
The committee adds that the Government should publish and maintain a list of where algorithms are being used in central government, identify a ministerial champion with oversight of their use, and commission a review from the Crown Commercial Service to set out a model for public and private sector involvement in developing algorithms.
Opportunities and flaws
Committee chair Norman Lamb MP said: “Algorithms present the Government with a huge opportunity to improve public services and outcomes, particularly in the NHS. They also provide commercial opportunities to the private sector in industries such as insurance, banking and advertising. But they can also make flawed decisions which may disproportionately affect some people and groups.
“The Centre for Data Ethics and Innovation should review the operation of the GDPR, but more immediately learn lessons from the Cambridge Analytica case about the way algorithms are governed when used commercially.
“The Government must urgently produce a model that demonstrates how public data can be responsibly used by the private sector, to benefit public services such as the NHS. Only then will we benefit from the enormous value of our health data. Deals are already being struck without the required partnership models we need.”
Image from Defense Advanced Research Projects Agency, public domain via Wikimedia