Skip to the content

Public services AI: grunt work, killer apps and data literacy

10/11/23

Mark Say Managing Editor

Get UKAuthority News

Share

Nigel Shadbolt
Nigel Shadbolt
Image source: AI Fringe

Interview: Nigel Shadbolt, executive chair of the Open Data Institute, says the public sector can contribute a lot to best practice in the safe use of artificial intelligence

Is there a potentially killer AI application for the public sector? Nigel Shadbolt suggests that if so it would be one that helps it better manage huge volumes of data that are currently difficult to re-use.

“Authorities have large amounts of content they have struggled to bring into the 21st century, and in terms of extracting it, summarising it, integrating it, these tools could help,” he says.

“One of the real problems for the public sector is that tons of its really important data was locked into PDF files and non-machine readable content. These (AI) systems have the ability to unearth that content then thread it together.

“We could use these tools to do a lot of the grunt work in data integration and identification, and once you’ve done that you can get onto the problems around issues like the environment and education.”

The executive chair and co-founder of the Open Data Institute (ODI) is speaking to UKAuthority on the day of the ODI Summit, and the week after the UK Government’s AI Safety Summit had raised the profile of the debate about the promise and risks of the technology.

Critical thinking

It also occurs soon after the ODI shared his thoughts on the need for data literacy, with the core message that it needs to be more widely spread, enabling more people to think critically about data in different contexts and examine the impact of how it is used.

Which leads to the question of whether he sees any specific requirements for the public sector in this respect?

“It’s where you try to interface the particular demands in the public sector with these tools,” he says. “If you think about a large amount of the workload, it has often to do with extensive summarisation of existing content, bringing various views on legislation or regulation to bear on the delivery of services, things that often demand quite a lot of professional support.

“You hope the tools could be helpful. I think you would have more of an orientation towards the requirements for safety and public service. It’s not driven by a profit motive but issues around equity and if this is actually delivering an inclusive result.”

He adds: “And as we use these systems, security issues arise, they may be prompted to disclose sensitive information. So you need stress testing on these models to ensure they operate in the way required.”

An important aspect of this is that the tools are no good without the appropriate skills. Some would be highly specialised in areas that are currently on the fringe.

“Fundamentally you’re aware there are whole classes that didn’t exist 20 years ago. One is the prompt engineer, who has the knack of prompting and refining the system to give answers that are way more relevant than you might have expected.”

Wider need

But much of it relates back to the need for wider data literacy, and it will have to run deep into organisations, with a culture similar to that for cyber security. Everybody from the highest level of decision makers to staff in front and back offices will need some familiarity with the tools and how they need to manage data.

The specifics of this are evolving and are likely to continue to change, so people will have to be ready to learn then refresh their skills as needed. Shadbolt points out that there are already books and crib sheets available, and advice from the industry, but adds that organisations such as the ODI could help to develop the training and potential career lines for new job roles in public and private sectors.

He points to availability of the International Certification of Digital Literacy (ICDL) – also known as the European Computer Driving Licence – and says there is scope for this to evolve or a new qualification relevant to AI to be established.

“The whole idea was getting people comfortable with and capable around the tools that were changing the workplace,” he says. “These were spreadsheets, databases, PowerPoint presentations, Word documents and file management.

“We’re now going to be living in a world where CoPilot (Micorosft’s AI assistant) is generally available and these tools are all going to be suffused with generative AI. It’s quite important in the same way we taught people not just to use the tools but their general nature, how they operate and their strengths and weaknesses.

“I think that on the AI side, you’re not going to teach everybody the details of the internals and algorithms, but convey critical thinking about how you use them.”

Into the mainstream

He also emphasises the broad range of capabilities that AI is beginning to bring. On a basic level, it is in AI elements being added to mainstream productivity tools from companies such as Microsoft and Google.

“That is where some of these products will be trialed and deployed in the first instance,” he says. “There is already an infrastructure in which these tools become embedded and used. And there will be a whole range of specific apps on your phone that use these services.”

In addition, the ODI is among the organisations looking at how it might develop relevant supporting products, dealing with issues such as how data is collected, represented, maintained and ethically licensed when it is used in AI.

There is also a great potential to refine and tune AI models to specific and highly demanding services, using reinforcement learning through human feedback you carve out a particular set of skills. This could include using it to identify and fill the gaps in the data literacy of a workforce.

“You could use the tool to carve out a course around the questions such as ‘What are the challenges around formatting data, interoperability or sharing data, and using it effectively?’

“Each of those questions, if you put it to a modern large language model like GPT4, will evince an interesting set of responses. You will then refine that in detail and say ‘Give me some material to illustrate the problems of data bias. How do I protect against it?’ The system generates the content, which has to be verified and signed off by human experts, but a lot of the heavy lifting could be done.”

Personalised education

Shadbolt suggests it could go as far as generating courses on specialist subjects such as data ethics or data engineering, and support personalised education in which an AI assistant is tuned to the user’s interests, routine operations and challenges in their jobs.

This is where organisations in specific fields, such as local government, healthcare and education, could contribute to developing the products and promoting data literacy, as they are well placed to bring up the questions that are routinely asked and specify what is needed in their contexts.

“You can imagine quite customised content for these different groups, but you have to start thinking now about the institutional frameworks so you have good representation and a degree of consensus around bringing people into that conversation,” Shadbolt says. “This isn’t something that should be handed down by a bunch of tech firms that think they know what service you need.”

The ODI has also highlighted a need for common practices to support a data infrastructure, aimed at filling a gap between technical aspects and the strategic business environment of an open and trustworthy data system. There are nine of them, covering accountability, privacy, security, standardisation, resourcing, capability, engagement, ethics and permissions.

Licensing and IP

The last one covers the permissions, or licensing, under which data is consumed and shared, and is coming to the fore as the deployment of AI is raising concerns about the source of data and intellectual property. Shadbolt says public authorities need to take the seriously.

“I think local authorities need to be thoughtful about under what permissions will they open up and share the data. It has to be something that serves the public as well as the private interest.”

But it also leads to a positive perspective: that the public sector has been dealing with these issues for some time, as many authorities have taken a lead in opening up much of their data for sharing and reuse. This has provided lessons that could be highly relevant in the permissions for consuming and sharing data.

It also reflects the intent and expertise within the sector to use data for the public good. This could make it a leader in establishing best practice in the application of AI.

“The public sector has been a leader in the past and, guess what, it’s back to the future in some respects on these issues,” Shadbolt says. “We’ve found fantastic expertise, commitment and enthusiasm among central and local government ninjas who just get this stuff, adopt it and get it out there.”

Register For Alerts

Keep informed - Get the latest news about the use of technology, digital & data for the public good in your inbox from UKAuthority.