When Europe’s General Data Protection Regulation (GDPR) took effect in May 2018, it established a foundation for a new generation of data privacy laws that afford greater protection to consumers. Core principles such as unambiguous consent, data minimization, purpose limitation, and the right to object effectively wrote established data-handling best practices into law.
Since then, GDPR-style privacy legislation has been adopted around the world. California’s CCPA set the ball rolling in the US, with many other states following (Colorado, Connecticut, Utah, and Virginia) or in the process of following (Michigan, New Jersey, Ohio, and Pennsylvania). Around the world, we’ve also seen the introduction of LGPD in Brazil and PIPL in China, to name just two.
A challenge now faced by data controllers and data processors is ambiguity. That is, what do the key clauses in these new pieces of legislation actually mean? Often, they need to be tested in courts of law to clarify their true intention and establish legal precedent. This is now happening in Europe, and practitioners elsewhere can learn from these cases and apply the findings before they run afoul of similar provisions in their own countries.
European regulators have definitely been baring their teeth in 2022.
Clearview AI, a facial recognition firm, has been fined €20m by Italy’s data protection agency and a further €9m by the UK’s Information Commissioner’s Office (ICO) for illegal processing of biometric and geolocation personal data.
The Irish regulator imposed a €17m fine on Meta (Facebook) for failing to have appropriate technical and organizational measures in place.
In Spain, Google was fined €10m for forcing users to accept the transfer of content removal requests to a third party.
Most recently, TikTok could face a £27m fine for a potential breach of UK data protection laws after allegedly failing to protect the privacy of children using the platform.
A common theme running through these cases is the core principle of “lawfulness, fairness, and transparency”: businesses must be clear with individuals about how their personal data will be processed, and an appropriate legal basis must be established for doing so.
In the United Kingdom, enforcement action in 2022 has focused largely on unauthorized sending of marketing messages. New data privacy laws like GDPR require a legal basis—typically consent or legitimate interest—for the processing of personal data, which includes marketing activity.
Recent cases* show this requirement is still not clearly understood (or is willfully ignored!):
An important theme running through all these cases (and others) is that they were originally brought to light by consumer complaints. Consumers now have a greater understanding of their data privacy rights and are prepared to exercise those rights if they believe their personal data is being misused.
When handling consumer data, it’s important to remember:
- Following a migration to a new CRM system, Reed Online inadvertently scheduled marketing emails to customers who had previously been unsubscribed/suppressed.
- Tuckers Solicitors experienced a ransomware attack, resulting in a personal data breach. The ICO ruled that the company’s failure to implement appropriate technical and organizational measures had made them vulnerable to attack.
- The UK government’s Cabinet Office disclosed postal addresses of the 2020 New Year Honours recipients online—a failure to prevent unauthorized disclosure of people’s information.
While the high-profile breaches make the headlines, many incidents are far more mundane.
The ICO publishes a quarterly data security report, with the most recent “non-cyber” (i.e., self-inflicted) issues including:
These trends point largely to human error and/or inadequate training, and present a compelling argument in favor of implementing “privacy by design” practices where robust processes minimize opportunities for non-compliance.
We’re still not really seeing the “four percent of global revenue” fines that can theoretically be levied, although that’s not to say it won’t happen. The British Airways (BA) fine—as proposed—came close before being reduced for a range of mitigating factors, including the impact of the Covid-19 crisis on BA’s finances. While no business wants to deal with a privacy breach, there are mitigating factors that will be considered if it happens, including:
Regulators will generally be more lenient with businesses that are transparent about what went wrong, cooperate with the investigation, and move quickly to put measures in place that will prevent a recurrence.
There’s so much more to be said on this topic. Want to learn more about data privacy legislation around the world? Check out our Guide to Global Privacy Laws and Compliance.