Police forces need a code of practice for their use of data analytics and algorithms, according to a new briefing by the Royal United Services Institute (RUSI) thinktank.
The paper, Data Analytics and Algorithmic Bias in Policing, says the need arises from a lack of organisational guidelines and clear processes for scrutiny, regulation and enforcement.
The report was commissioned by the Centre for Data Ethics and Innovation in response to rising concerns about the risk of bias in algorithmic systems used by public services.
It says there are plenty of potential benefits in the use of analytics and algorithms in policing, but there are also significant risks, such as unfair discrimination on the grounds of protected characteristics, and real or apparent skewing of the decision-making process against particular groups.
The risk is not all down to data bias, but also relates to organisational, operational and legal issues. This is particularly so for policing, which is often open to accusations of practices such as racial profiling.
The paper steers clear of referring to artificial intelligence in policing, saying the term is poorly defined, and instead focuses on the use of machine learning algorithms.
Creeping risk
There is potential for bias to creep in at three stages: in the identification of problems and solutions, for example in how a 'gang' is conceived around a particular section of the public; in design and testing, where using existing police data can reproduce existing biases (a dynamic sketched below); and in deployment, where a police officer could adhere to or deviate from the algorithm's guidance depending on preconceptions.
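The second of those stages is the easiest to make concrete. The toy model below is a hypothetical illustration, not drawn from the RUSI paper: two areas have identical underlying offending, but one starts with more recorded crime, and a naive tool that allocates patrols in proportion to past records simply carries that imbalance forward.

```python
# A hypothetical toy model (not from the RUSI paper): two areas with the
# same true offence rate, but area "A" starts with more recorded crime.
true_offences = {"A": 100, "B": 100}  # identical underlying offending
recorded = {"A": 60.0, "B": 40.0}     # biased historical records

for period in range(10):
    total = recorded["A"] + recorded["B"]
    for area in ("A", "B"):
        # A naive predictive tool sends patrols in proportion to past
        # recorded crime; offences are only recorded where police patrol.
        patrol_share = recorded[area] / total
        recorded[area] += patrol_share * true_offences[area]

share_a = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"Area A's share of recorded crime after 10 periods: {share_a:.0%}")
# Prints 60%: the historical imbalance is reproduced indefinitely, even
# though both areas offend at exactly the same rate.
```

Because the records the tool acts on are themselves a product of past patrol decisions, the 60/40 split never corrects itself, however long the loop runs.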
RUSI says that any code of practice should establish a standard process for model design, development, trialling and deployment, along with ongoing monitoring and evaluation. It should also provide operationally relevant guidelines and complement existing authorised practice in a tech-agnostic way.
Legal and ethical requirements should also be taken into account, along with clear roles and responsibilities for scrutiny, regulation and enforcement. This is likely to involve roles for the National Police Chiefs' Council and HM Inspectorate of Constabulary, among other organisations.
Among the areas in which the technology is expected to be beneficial is predictive mapping, which can identify hot spots on which to concentrate policing.
But RUSI says that academic experts interviewed for the paper expressed reservations about the ability of algorithms to predict future crime, especially in forecasting rare or unpredictable incidents. The more infrequent the crime, the less accurate the tool is likely to be.
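The statistical intuition behind that caveat can be sketched with a back-of-envelope calculation. The crime categories and counts below are illustrative assumptions, not figures from the paper: a rate estimated from randomly arriving incidents carries relative noise of roughly one over the square root of the number of incidents observed, so rarer crimes yield proportionally noisier forecasts.

```python
import math

# Illustrative figures, not from the RUSI paper: if incidents arrive
# roughly at random (Poisson), a rate estimated from one year of data
# carries relative noise of about 1/sqrt(count), so rarer crimes give
# proportionally less reliable forecasts.
for crime, per_year in [("burglary", 400),
                        ("vehicle theft", 100),
                        ("homicide", 4)]:
    rel_noise = 1 / math.sqrt(per_year)
    print(f"{crime:>14}: ~{per_year} incidents/year, "
          f"forecast noise ≈ ±{rel_noise:.0%}")

# Approximate output:
#       burglary: ~400 incidents/year, forecast noise ≈ ±5%
#  vehicle theft: ~100 incidents/year, forecast noise ≈ ±10%
#       homicide: ~4 incidents/year, forecast noise ≈ ±50%
```

With only a handful of incidents a year, the noise in the estimate can swamp any genuine hot spot signal, which is consistent with the experts' caveat.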
Other organisations have been addressing the issue of algorithmic bias. It was among the issues raised in a report by the Committee on Standards in Public Life, and the Information Commissioner's Office has published draft guidelines to help organisations prevent bias.
Image by Markus Spiske, public domain