
Data protection: are UK data laws changing again?


Data protection laws in the UK may be facing changes once more, with GDPR potentially being thrown out after only three years. What’s motivating this review, and why might the proposed changes not be in the best interests of consumers?

The UK Government is currently in the midst of a 10-week consultation on proposed changes to its data protection laws (part of the wider National Data Strategy), aiming to create “an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.”

This consultation is the first step in delivering on the new strategy’s second mission to secure a “pro-growth and trusted data regime” that could see the enhanced levels of data sharing during the COVID-19 pandemic become the real “new normal.”

In their most extreme form, these new rules could see GDPR rolled back a mere three years after it was enshrined in UK law, in a major departure from the EU framework, which treats data privacy as a fundamental human right.

In August, Oliver Dowden, then-Secretary of State for Digital, Culture, Media & Sport (DCMS), said that UK data laws must be “based on common sense, not box-ticking.”

The Government also claims that the new data regime would simplify “data use by researchers and developers of AI and other cutting-edge technologies” to “cement the UK position as a science and tech superpower.”

Though the promise of fewer “irritating cookie pop-ups” and “tougher penalties and fines for nuisance calls and text messages” may appeal to those suffering from “consent fatigue,” this deregulatory bonfire may instead backfire, opening up users’ data to commercial exploitation with little regard for their privacy.

The plan is to excise much of the administrative red tape from the current GDPR rules in favour of a more flexible process, particularly for SMEs, which are beholden to the same rules as larger firms that are better equipped to deal with the admin.

Rules about record-keeping and reporting would be replaced with more lax requirements, such as no longer needing to carry out Data Protection Impact Assessments (DPIAs) or consult the Information Commissioner’s Office (ICO) regarding high-risk personal data processing.

Proposals also seek to tackle “over-reporting” of data breaches by raising the threshold at which businesses must notify the ICO of an incident – provided the risk to individuals is “not material.”

How businesses use data would also change, with firms permitted to make use of any personal data they may hold for research purposes without giving subjects any prior notice.

The prohibition on AI making judgments with significant effects without human oversight could also be removed, potentially allowing, for example, an automated system alone to determine whether an applicant is eligible for insurance.

The new rules may also throw open the door for transfers of data to other countries with lower data protection standards; such transfers were ruled illegal under GDPR where the data laws of the third country in question were not up to scratch, per the Court of Justice of the European Union ruling dubbed “Schrems II.”

COVID has also shone a light on the current state of data privacy in the UK, and the sluggishness that keeps legislation several steps behind the technology: England’s notorious Test and Trace scheme was found to be in breach of GDPR, and contact tracers were caught sharing confidential personal data of suspected COVID cases over WhatsApp.

Even supposedly above-board deals have come under scrutiny, including the many contracts between the NHS and private companies such as iProov, whose automated facial verification software is used to verify the identity of the 16 million people signing up for the NHS app.

Meanwhile, data collection through NHS Digital’s proposed General Practice Data for Planning and Research (GPDPR) scheme, making confidential patient data available to private companies, was met with a frosty reception, with over one million people opting out of the plans.

This continues a trend that started before the pandemic, with increasing numbers of private organisations being granted access to patient data – from Amazon to the controversial US-based big data analytics firm Palantir, whose work tracking undocumented immigrants in the US has raised fears about just what the company might do with the data it has been entrusted with.

Three years after it came into force, GDPR’s impact remains limited, mainly due to patchy enforcement across the EU’s member states and the insufficient resources at some authorities’ disposal; many of the largest tech companies have established their European headquarters in Ireland, leaving the country’s Data Protection Commission (DPC) scrambling to keep up with investigations.

Rolling back consumer protections under the guise of slashing red tape will likely be welcomed by some, especially given the potential savings of £1.45 billion over ten years, according to the government’s own research. However, these changes could go too far, diluting data subjects’ rights to a worse state than pre-GDPR, while threatening the free flow of data between the UK and the rest of the continent.

Giving official bodies the means to properly enforce data laws and increasing trust in the process would likely do more good for consumers and businesses than moving the goalposts to favour large, data-hungry corporations with questionable track records on private data. Any reforms to data laws should benefit consumers first, not large corporations.

This public consultation on reforms to the UK’s data protection regime closes at 11:45pm on 19 November 2021.

About the author

Adam Hughes

Cerillion
