UKAuthority talks with Russell Macdonald, chief technologist at Hewlett Packard Enterprise, about how a hybrid data strategy can be a catalyst for modernisation
It is possible to think of the UK public sector as a House of Data, containing immense stores of data from a myriad of sources used in millions of transactions, says Russell Macdonald, chief technologist at Hewlett Packard Enterprise (HPE). This holds huge potential – especially when the data is shared between organisations – to create social value, providing better services, improving people’s lives and making the country a better place. But it comes with complex challenges.
“There are huge silos of data trapped in legacy technology, applications and the cloud,” he says. “It’s like the data being locked in different rooms of the house.
“These have existed for a credible reason – to facilitate specific processes – but issues with legacy systems can make them difficult for other teams inside and outside the organisation to access, even when they have a legitimate reason.”
This increases the complexity of the challenge, demanding a flexible and technologically sophisticated approach to collect, sort, interrogate and share the data for a myriad of purposes.
Changing characteristics
There is also a challenge in the basic characteristics of the data. It is constantly changing to reflect new types of transactions, a need for new attributes in understanding people and places, and the requirements of the organisations that hold it. There are also variations: two or more organisations can hold data that is essentially the same but labelled and stored differently to reflect how they need to use it.
All this requires a sustained effort to keep up with the changes and to understand the data and where to find it. As conveyed in a series of documentaries on the House of Data, a number of important factors have to be taken into account.
A big one is data retention. The public sector holds an ever-growing amount of data, which brings an environmental cost in the energy used to store it, and raises questions about how long it is legitimate to retain information on individuals.
Macdonald points out that much of the data is based on transactions, some of which could have long-term relevance to future activities, but by no means all. This raises important questions.
“How long do you need to keep the data after the transaction is implemented?” he asks. “There might be a realistic reason to keep it for auditing, but is there a point in keeping it for ever?”
Organisations need to work out and clearly define how long they should keep specific types of information, but this is not easy, especially as it is impossible to foresee every future use case for a dataset. It needs a realistic assessment of what data will be needed in the future, and how long after a transaction it should be retained.
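As a simple illustration of how such a defined retention schedule might be expressed in practice, here is a minimal Python sketch. The record types, periods and function names are hypothetical assumptions for illustration, not recommendations for any particular organisation or statutory requirements.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> how long the data is
# kept after the transaction completes. Periods are illustrative
# assumptions, not statutory requirements.
RETENTION_PERIODS = {
    "payment_transaction": timedelta(days=7 * 365),  # kept longer for auditing
    "service_request": timedelta(days=2 * 365),
    "session_log": timedelta(days=90),
}

def is_due_for_deletion(record_type: str, completed_on: date, today: date) -> bool:
    """Return True once a record has passed its defined retention period."""
    period = RETENTION_PERIODS.get(record_type)
    if period is None:
        # No defined policy: flag for review rather than keeping it for ever.
        raise ValueError(f"no retention policy defined for {record_type!r}")
    return completed_on + period < today

# A session log completed in 2021 is well past its 90-day retention period.
print(is_due_for_deletion("session_log", date(2021, 5, 1), date(2024, 5, 1)))  # True
```

The key design point is that every record type must have an explicit policy: anything without one is flagged for review rather than retained indefinitely by default.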
Ethical issues
Issues around privacy and ethics also have to be taken into account. There has been a high-profile debate on what data on individuals it is legitimate to collect, where the line lies between a valid purpose and intrusion, and where and how data should be shared. The UK General Data Protection Regulation provides a legal framework, but there is scope for interpretation of its details, and this will be tested further as new types of data emerge.
Another long-running challenge is the need for data to be structured to support the interoperability of digital systems – often a problem when different sources have developed their data with different priorities. Achieving interoperability requires a granular understanding of the data and a wider use of common features – such as a shared taxonomy – within the public sector.
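To illustrate what a shared taxonomy can do for interoperability, here is a minimal Python sketch showing how two organisations’ differently labelled records might be relabelled into one canonical schema. Every field name and mapping here is a hypothetical example.

```python
# Shared taxonomy: the canonical labels both organisations agree on.
CANONICAL_FIELDS = {"person_id", "postcode", "date_of_birth"}

# Each organisation's own labels mapped onto the shared taxonomy.
SOURCE_MAPPINGS = {
    "council_a": {"resident_ref": "person_id", "post_code": "postcode", "dob": "date_of_birth"},
    "health_trust_b": {"nhs_number": "person_id", "postcode": "postcode", "birth_date": "date_of_birth"},
}

def to_canonical(source: str, record: dict) -> dict:
    """Relabel a source record's fields using the shared taxonomy."""
    mapping = SOURCE_MAPPINGS[source]
    return {mapping[key]: value for key, value in record.items() if key in mapping}

a = to_canonical("council_a", {"resident_ref": "R123", "post_code": "SW1A 1AA", "dob": "1980-01-01"})
b = to_canonical("health_trust_b", {"nhs_number": "N987", "postcode": "SW1A 1AA", "birth_date": "1980-01-01"})
assert set(a) == set(b) == CANONICAL_FIELDS  # both records now share one schema
```

Once both records carry the same labels, systems on either side can exchange and compare them without needing to understand each other’s internal conventions.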
A move towards collaboration between public and private sectors is adding to the complexity. Public sector bodies see the value in opening up some of their data for app developers, infrastructure companies and other service providers to use; and that data can feed into operations in areas such as traffic management, public health and environmental management. But this needs interoperability and has to stay within the privacy and ethical frameworks.
The emergence of AI is also significant. It comes with a huge potential for the delivery of new services, but to do so it also needs data with a reliable provenance and the right characteristics. Organisations will need to invest time and effort to ensure their data is fit for the purpose of AI; but the technology will also help them to sift and categorise the huge volumes of data that they hold.
All of these factors are already at work within the public sector House of Data, which needs a clear view of its contents, easy access to its rooms and the ability to move data between them.
A new strategy
Macdonald says part of the solution is in adopting a hybrid data strategy that identifies and draws from all these sources, joining them up wherever possible to develop better services and provide new streams of social value.
“We’ve seen more customers consider hybrid cloud as a choice, not just a model for when you couldn’t migrate everything to the cloud,” he says, adding: “Now the message is to have a hybrid data strategy.”
HPE provides the tools to make this possible: software and technology products that all operate on a multi-platform basis. This makes them effective on premises and in public and private cloud environments, providing strong support for a hybrid data strategy.
“It helps to join up old and new worlds without centralising data into one platform,” Macdonald says. “And it buys you some time to decide where the data should sit.”
The company also challenges the idea that the only way to modernise the use of data is to move it to the public cloud, believing that end-to-end public services are now delivered most effectively through an ICT estate balanced across public cloud, intelligent edge computing and on-premises systems. A wholesale move to the public cloud is not practical in every instance, and many services draw on data held both on premises and in the cloud but delivered through local systems.
Scope for modernisation
HPE’s solutions make it possible to modernise systems and services using data wherever it is held, from edge to cloud. They facilitate a widespread joining up of data and strengthen an organisation’s capacity to identify a desired outcome and then develop a solution that delivers it from the data.
Macdonald concludes with the assertion that this can make it possible to fully harness the potential for social value from the House of Data.
“If we are able to look at data as something that exists everywhere and formulate a strategy on that basis, it will provide a catalyst for modernisation.”
HPE has investigated these issues in more depth through a series of interviews with experts on the public sector’s use of data. You can learn more through three documentaries on the subject, covering: the value of data and citizen trust; the challenges of data hoarding, regulation and interoperability; and the future of AI and supercomputing. The documentaries are available here.