Socitm has urged local authorities to take a forensic approach to the use of algorithms in their services and warned against their use when they are not fully understood.
The public sector IT association has highlighted its position following the controversy over the Department for Education's about-turn on the use of an algorithm to revise grades from England's A level exams.
Its vice president Mark Lumley said there is a need for three steps, approached forensically rather than as a 'tick the box' exercise, to assess the use of algorithms.
The first step is to ask if the proposed approach is proportionate and ethical, which needs forensic questioning.
Second comes planning for rigorous tests in an operational setting with the relevant data. Third is to underpin questioning and testing with robust checks and balances built into accountable governance.
Socitm president Sam Smith, who is also strategic IT lead for Cambridgeshire County and Peterborough City Councils, said: “There can be positive benefits from appropriate and well-structured use of algorithms and smart information systems. These benefits include streamlining services and enabling better outcomes for individuals and places.
“But there are also serious questions to be considered regarding transparency and public trust. GDPR gives people the right to challenge any decisions made by AI and be shown how the conclusion was reached. This further emphasises that the application and impact of algorithms must be fully tested and understood before they are pressed into use.”
Questions and warning
Lumley, who is also director of digital and IT at the London Borough of Hounslow, has gone into further detail in a blogpost written with William Barker, associate director for digital ethics and cyber resilience at Socitm. It highlights the importance of asking questions, testing and reviewing the use of algorithms, and sounds a warning against using them without a full understanding.
“The stark conclusion emerging is that if you don’t fully understand how a program is working and what it will do if you use it with your data in your social, operational and political context, you are running a huge risk,” it says.
“If it will affect decisions about real people’s lives, you will want to think long and hard about whether using it is a good idea.”
Socitm's warning has come days after BCS, the Chartered Institute for IT, reacted to the exams controversy with a call for standards to prove the ethics and competence of the use of algorithms and data science in public policy.
It published a report saying that information systems relying on algorithms are often a force for good, but that it is hard to make them work as intended in high-stakes situations.