Headlines around penalties for nuisance calls and a boost for innovators and business accompanied last week’s announcement of proposals for a new data regime in the UK.
But as the Department for Digital, Culture, Media and Sport (DCMS) published its consultation on post-Brexit regulations on the use of data, it also raised the possibility of changes in how the public sector can approach the issue – hopefully providing more clarity in existing grey areas.
The official announcement focused on reducing burdens, economic growth, innovation and strengthening public trust, with warnings of tougher penalties for nuisance callers and – in an implied criticism of the EU General Data Protection Regulation (GDPR) – a future regime “based on common sense, not box ticking”.
Digital Secretary Oliver Dowden banged the drum for the proposals, saying: “Now that we have left the EU, we have the freedom to create a new world leading data regime that unleashes the power of data across the economy and society.
“These reforms will keep people’s data safe and secure, while ushering in a new golden age of growth and innovation right across the UK, as we build back better from the pandemic.”
Clarity and confidence
The announcement’s relevance to public services was largely in references to what more can be done to mitigate potential bias in algorithms. But the consultation document itself contains a chapter on the regulations around using personal data in public services, with a series of recommendations to provide more clarity and confidence in data sharing.
One derives from the experience of the Covid-19 pandemic, in which data sharing between public authorities did a lot to support the people most badly affected by the lockdown, but which at times forced data controllers and policy makers to spend significant time ensuring that new processes were within the law.
There have also been issues when private companies process personal data to deliver public service tasks.
In response, the document proposes a clarification that companies, organisations and individuals asked to process personal data for a public authority can rely on that body’s lawful ground for doing so under the UK GDPR, and need not identify a separate lawful ground.
This is related to the issue of processing health data in an emergency, which has required a controller to identify a ground under Article 9 of UK GDPR. This has sometimes proved problematic during the pandemic, as non-healthcare bodies can struggle to meet the criteria.
Accordingly, there is a proposal to clarify that public and private bodies may lawfully process health data when necessary for reasons of substantial public interest in relation to public health or other emergencies.
Sensitivities and challenges
Another awkward issue is in processing sensitive personal data – such as on health, race, ethnicity or sexual orientation and including genetic and biometric data – that generally needs explicit consent from the subject, and is subject to provisions under UK GDPR and the Data Protection Act (DPA).
It comes with two key challenges. The first is finding the right balance between making the provisions sufficiently flexible to allow the processing and ensuring they are specific enough to provide transparency and to give the data controller certainty that they are within the law. In response, the Government is considering whether to add new situations to those in Schedule 1 of the DPA or amend existing ones to provide more specificity.
The second is to ensure each provision has the safeguards to prevent misuse, while still providing transparency and certainty. There are rules requiring a ‘substantial public interest’ before some categories of sensitive data can be processed, but there have been complaints that this is not sufficiently defined and that there is no case law to support its interpretation.
This raises two possible courses of action: to include a definition of ‘substantial public interest’ in the legislation; or to add or amend the list of specific situations in Schedule 1 of the DPA always deemed to be in the public interest.
Another controversial area has been the use of biometric data by police forces: while there is a legal framework in place, the document says it is complex for both the police and the public to understand. This has prompted the Government to aim to streamline and clarify the relevant rules, making them more transparent and flexible.
Safety and security
On the issue of processing data for public safety and national security, the document points towards clarifying the legislation to improve cross-sector working, particularly in joint operational activity between law enforcement and national security bodies.
The issue of trust in the use of algorithms is addressed by an intention to introduce compulsory transparency reporting on their use in decision making – not just by departments and public authorities, but also by government contractors using public data.
There is also a proposal for an extension of the Digital Economy Act, which since 2017 has provided umbrella legislation to enable public authorities to share personal data for specific purposes, so data can be used to help improve outcomes for businesses as well as households.
Operational focus
Overall, there seems to be little that will be controversial. The proposals focus on questions of detail that arise in a minority of cases but often take a long time to resolve, and are directed towards operational matters rather than changing the balance of ethical judgements.
They are unlikely to resolve everything permanently: the legal landscape for using personal data is always going to change as technology opens up new ways of doing things and public perceptions change.
But they could well provide a framework to reduce the friction in some processes and make a difference to cases in which the agonising over regulations can get in the way of using data in a way that supports people in need and raises the overall standards of services.
There is a good chance that much of this will go through pretty much as laid out in the consultation document.