
EU Data Protection Update Q3-2023

Quarterly news update (Q3/2023) highlighting the major developments in data privacy legislation, enforcement, administrative initiatives, and cybersecurity, with a special focus on Software as a Service and cloud services in general.

The data protection landscape has never been boring in recent years. Yet the third quarter of 2023 was a particularly interesting one, as some clear themes emerged more visibly than in previous quarters. We look at some of those developments in this report.

The big themes seem to be four:

  1. The EU-US Data Privacy Framework (TADPF), while in force, remains a mess. It seems almost impossible to bridge the gap in the understanding of privacy between the EU and the US. The TADPF offers a patch, a kind of band-aid, but whether that will be enough remains to be seen. We at Engity are mildly skeptical.
  2. Consumers and regulators – on both sides of the Atlantic – distrust the tech behemoths in social media and AI more and more. Too many unethical or outright illegal business practices have emerged, and it seems that if one such behavior is being stopped, three appear elsewhere, just like the heads of Hydra.
  3. Data Protection originally emerged out of a distrust of the government. While big tech seems to have taken over the role of the whipping boy, the government still has the monopoly on the use of force – at least in the developed world. It is therefore necessary to also scrutinize data handling in the public sector, in particular in law enforcement and policing. There has been a string of interesting cases in Q3.
  4. Cyberattacks have become an integral part of politics, discourse, and warfare. The many conflicts around the world show how dangerous the cyber domain has become.

The EU-US Data Privacy Framework Remains a Swamp

After the failure of Safe Harbor and the Privacy Shield and the resulting uncertainty regarding EU-US data flows, a new transfer tool has been found in the form of the Transatlantic Data Privacy Framework. The idea is to re-connect two very large trade blocs digitally as well, to further economic activity.
Yet the task is gargantuan, and the principal understanding of privacy and data protection seems to be fundamentally different on the two sides of the Atlantic. The question of whether the TADPF manages to bridge that gap remains astonishingly open.

Adequacy Decision following the TADPF adopted

Both the EU and the US congratulated themselves and each other on the finalization of the mechanism when the EU adopted the respective adequacy decision on July 10, 2023. This should allow for a free data flow between the trade blocs, basically treating the US "as if" it were part of the EU in terms of data protection.

MEP of the EU parliament challenges EU-US-DPF in court

Not everybody agrees that the TADPF can deliver on its promises. The European Parliament, for example, asked the EU Commission to reject the adequacy decision because it felt that there simply is no "equivalence" between the two data processing regimes and that the legal redress available to EU data subjects in the US is wildly insufficient.

Little wonder, therefore, that the first legal challenge to the EU-US Data Privacy Framework was filed with the European Union's General Court by an MEP, asking to suspend the TADPF.

The move was immediately supported by members of the German Bundestag (Germany's federal parliament) who do not see any substantial progress in the TADPF over the Privacy Shield, its predecessor.

Not everybody is on board with this gloomy assessment, though. Many legal scholars, especially in the US, do see a substantial alignment of European and American thinking on, and application of, privacy principles. While there is agreement that the two jurisdictions are not the same, the idea is that they are "holistically" on the same level – adequate, in other words.

We at Engity have some reservations and tend to agree with the European Parliament in finding the TADPF lacking.

Businesses stick with SCC over the TADPF

With all those developments, it is unclear whether the TADPF can withstand scrutiny in court or will be as short-lived as its predecessors. This uncertainty results in a cautious approach by many businesses to using the framework as a transfer tool, as US media report. Many companies that used the EU Standard Contractual Clauses (SCC, an alternative way to transfer data that is based on a contract between data exporter and importer) seem to stick with that tried-and-tested mechanism. At least for the moment.

Legislative Action - Enforcement of Digital Services Act Starts

The EU Digital Services Act (DSA), on which we reported extensively, entered its enforcement period on 25 August 2023. While the act is not yet in full effect – that stage only begins in February 2024 – it now requires large data processors to provide risk assessments to the EU Commission and imposes stricter rules on such organizations. Affected in detail are, in DSA lingo, "VLOPs and VLOSEs – very large online platforms and very large online search engines."

Administrative Data Protection Initiatives

Data protection authorities continued to issue guides and other work tools to further a better understanding of data protection and privacy in the third quarter.

Helpful Data Protection Impact Assessment Guide by Swiss DPA

The Eidgenössische Datenschutz- und Öffentlichkeitsbeauftragte (EDÖB), Switzerland's federal DPA, published a helpful guide on how to conduct a Data Protection Impact Assessment.

Such an assessment is required if an intended data processing activity involves a high risk to the rights of the data subjects concerned.

The guide offers help on how to structure the impact assessment formally and which criteria to apply. To make things easier, a template is offered (in German).

Guidance on employee monitoring by Norway's DPA

Datatilsynet, Norway's data protection authority, offers guidance on the monitoring of employees' use of electronic equipment – an evergreen in the practice of many DPOs.

While employers do have very good reasons to wish to monitor such use, in particular for reasons of cybersecurity, employees need to know how and to what extent they are being surveilled. This is especially true as today's work tools record a large amount of behavioral data.

While the guide is targeted at Norway, where employers' rights to monitor employees are very restricted, the considerations offered can be applied to other jurisdictions as well, if only in the manner of "if this is legal in restrictive Norway, it will be legal here as well".

CNIL recommends on best practices for codes of conduct

France's CNIL published a document detailing eight best practices for codes of conduct – a set of rules outlining the proper practices of an organization, including its management and employees.

That, of course, integrates data compliance as well.

CNIL indeed offers some insights that are often neglected. For example, it stresses the point that a code of conduct must not only exist but also be governed and enforced. This may sound trivial but, as experience shows, is often not the case in practice.

Codes of conduct become ever more important as more and more large businesses demand that their suppliers have such codes – and adhere to them.

Enforcement

Dark Patterns under scrutiny

Dark Patterns, practices that nudge or trick users into giving consent or revealing personal information, have been a nuisance – or threat – for some time. DPAs and regulators increasingly try to curb such bad behavior. This is true for the US, as well as the UK and the EU.

The GDPR demands that consent be unambiguous, informed, specific, and freely given – all of which dark patterns sabotage. And such dark patterns are not only used by malicious players but also, for example, by Amazon when asking for consent to sign up for Amazon Prime (as the American FTC finds in a lawsuit it filed against Amazon).

If even the big players use such shady tactics, it is high time for regulators to act indeed.

Sweden's DPA hands out fines and injunctions against use of Google Analytics

IMY, the Swedish Authority for Privacy Protection, ordered three businesses to stop using Google Analytics and also handed out administrative fines of 12 million SEK and twice 300,000 SEK, respectively.

The interesting aspect of the case is that the data were transferred to the US using the EU Standard Contractual Clauses (SCC) as a transfer tool. Those clauses, however, are not sufficient per se; they may need to be supplemented by additional safeguards. Such additional safeguards were lacking here.

IMY thinks that its decisions could – and should – be applicable to more organizations. A good reason for anybody using Google Analytics to look into this matter.

A Special Look into Data Protection in Law Enforcement

The third quarter of 2023 saw a flurry of activity by courts, data protection authorities, and other players centered on data processing and cybersecurity in police and law enforcement. We, the people, citizens, and data subjects, are well advised to follow those developments, as the police are the main way in which the government exercises its monopoly on force. A transparent and compliant use of our data is therefore necessary.

ECJ on data separation in law enforcement

The Court of Justice of the European Union (ECJ) looked into certain types of use of personal data in law enforcement. In particular, the court precluded the use of data that were collected for the purpose of combating high-level crime in the context of a corruption case.

The underlying idea here is that of separation of data: data that were collected for one purpose should not be used for a different purpose. This is not only true for business but for governments and law enforcement as well.

This of course prevents the collection of data under the pretext of fighting serious crime and then using every last bit of the same data as by-catch in other contexts. Such bad habits can be seen all too often, as they are obviously tempting for enforcement agencies.

A very welcome and necessary decision.

Czech DPA looks into facial recognition by the police

Úřad pro ochranu osobních údajů, the Czech DPA, is investigating the use of facial recognition technology by law enforcement. The police have been using face scans for more than a year already, but data protection advocates say that a proper legal authorization is lacking.

For that reason, the DPA has requested technical documents and explanations to assess the legality of the undertaking. This includes information on how pictures of suspects were collected and how they were mapped to IDs.

The result of the investigation may be to stop the police's activity – but it could also be an amendment to existing laws to give better legal grounds for such use of data.

EDPS issues recommendation for data protection at Europol

The European Data Protection Supervisor (EDPS) has completed its audit of Europol. In its report published on 6 September 2023, it found the overall level of data protection compliance to be satisfactory. Nevertheless, it gave ten recommendations on how to improve, six of which are critical.

Data Breach at Manchester Police investigated by ICO

On the flip side of the coin, the police do not only process data of suspected criminals but can themselves be affected by data leaks and breaches. Such a thing happened to the Manchester Police force, where a ransomware attack compromised names – and pictures – of police officers.

Surely no information that should be on sale in the dark web.

Police of Northern Ireland leaks… itself

On a much grander and more devastating scale, the police force of Northern Ireland managed to leak the names and locations of 10,000 police officers when responding to a request under the Freedom of Information Act.

This was, of course, a mistake – and a potentially fatal one, as Northern Ireland's police has frequently been the target of terrorist or paramilitary action. Officers, therefore, are concerned about their own – and their families' – security.

Observers call this the worst data breach in the history of Northern Ireland's police force.

Social Media, AI, and Big Tech

The implications of the rise of social media, Artificial Intelligence, and Big Tech in general for privacy issues seem to be ever growing. The same is true for the awareness of this in both the general public and politics. The process of finding a good set of governance rules and oversight is ongoing but messy. At the same time, technical developments seem to move so fast that public discourse can barely catch up.

Elon Musk's X or Twitter may collect biometric data

Elon Musk's social media platform (and soon-to-be everything-app) X, formerly known as Twitter, updated its privacy policy and may henceforth collect and process more data than before. This could include biometric data, government IDs, and employment and education history.

The reasons given by X for wanting to do so are the recommendation of employment opportunities and the verification of premium accounts.

The idea may not earn too much love, though, from either regulators or consumers, both of whom seem to see Mr. Musk as the whipping boy for everything that is wrong with social media.

Meta's appeal against ban on behavioral advertising in Norway falls flat

The district court of Oslo, Norway, upheld an injunction by the Norwegian DPA, Datatilsynet, that ordered Meta, formerly known as Facebook, to stop behavioral marketing.

The main focus of the decision was on procedural matters. Meta argued that Datatilsynet had no proper authority to issue an injunction and that Meta was not properly heard in the process.

The proceeding is also a slap in the face of the Irish Data Protection Commission, which sat on its hands, failing to ensure Meta's compliance with the GDPR.

The injunction itself, which came into force on 4 August 2023, does not affect users who gave valid consent to being targets of behavioral advertising. Nevertheless, Datatilsynet considers such practices to be one of the biggest risks to privacy.

Privacy Researcher complains about GDPR violations by OpenAI

Poland's DPA, Urząd Ochrony Danych Osobowych, is investigating the data processing practices of OpenAI, maker of the AI tool ChatGPT. This follows a complaint filed by a Polish privacy researcher. The main concerns are violations of several principles of the GDPR: there may be no legal basis for the processing of personal data, there is no privacy by design, and, of course, the processing activity is rather nontransparent.

Lukasz Olejnik, the privacy researcher, asked ChatGPT about himself, got wrong answers, and took the obvious step of asking OpenAI to correct this. Which, he was told, could not be done.

The probe has to be seen in the context of how Large Language Models ("LLMs") work. They often get their training data from the wide and free internet – which is, as we know, rife with misinformation to begin with. The LLMs then work on these data in ways that not even the creators of the models fully understand. This makes the LLMs prone to hallucinations: they spit out information that is incorrect, and the models themselves do not seem to know it.

LLMs, it seems, cannot be trusted – and neither can their creators when it comes to compliance.

Fitbit's Fitness Tracker violates GDPR, says NOYB

Google, among many other things, also owns Fitbit, maker of fitness trackers and owner of the respective infrastructure. Max Schrems' NOYB thinks that Fitbit obtains user consent to transfer data to the US illegally. Users are, the argument goes, forced to consent to the export of their data and cannot withdraw that consent without deleting their Fitbit account completely.

Considering the fact that health and fitness data are among the most personal and sensitive data a person can share, looking into the data processing practices may be a splendid idea.

Google's YouTube might have facilitated tracking of children

Research by Adalytics seems to show that it is possible to track minors using third-party ads on Google's YouTube. This happens when targeted ads are shown to users; when clicked, the ads allow tracking by the business that placed them. All the infrastructure (targeting software, users, tracking system) is supplied by Google.

The case at hand – a Canadian bank looking for (adult) customers – looks rather like a mistake: Children were shown the ads for no obvious reason. Yet it is easy to see how such technology could be used by malicious actors as well.

Warning on data collection and scraping on Social Media from: Everybody

DPAs of twelve countries, including the US and Switzerland, issued a document outlining their expectations of social media platforms regarding safeguarding against the scraping of personal data by third parties.

The idea is that social media is a nexus of a lot of personal data of its users. Those platforms are increasingly being scraped (data are automatically extracted using bots) by other actors that collect the data and create profiles of users for all kinds of reasons, some of which may be nefarious – think of spam, targeted cyberattacks, or identity theft.

The platforms should do more to safeguard the personal data they are being trusted with.

One key point in the statement is of very special interest: The DPAs think that data scraping activities could constitute a data breach that may have to be reported to the regulator. This is in our opinion a very interesting angle to enforce better privacy compliance.
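To make the scraping mechanics tangible: one common first-line safeguard is simple request-rate monitoring per client. The following is a minimal, hypothetical sketch of such a check – the class name, window, and threshold are our own illustrative assumptions, not taken from the DPAs' statement or any platform's actual defenses.

```python
from collections import defaultdict

# Hypothetical policy values, chosen for illustration only.
WINDOW_SECONDS = 60
MAX_REQUESTS_PER_WINDOW = 100

class RateTracker:
    """Tracks per-client request rates to flag scraping-like behavior."""

    def __init__(self):
        # client_id -> list of request timestamps within the window
        self.hits = defaultdict(list)

    def record(self, client_id, timestamp):
        """Record a request; return True if the client exceeds the threshold."""
        recent = [t for t in self.hits[client_id] if timestamp - t < WINDOW_SECONDS]
        recent.append(timestamp)
        self.hits[client_id] = recent
        return len(recent) > MAX_REQUESTS_PER_WINDOW
```

Real platforms layer far more signals on top (CAPTCHAs, behavioral fingerprinting, authenticated API quotas), but even a crude rate check of this kind helps distinguish human browsing from bot-driven bulk extraction.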

Biometric capabilities of Tech companies are beyond what was known

On the subject of surveillance capitalism: it seems that social media giants Google and Meta possess facial recognition capabilities beyond what was previously known. The catch: using that technology, not just they but everybody with a phone or wearable glasses could identify everybody everywhere, making facial recognition ubiquitous.

The tech seems to have spooked even the tech giants, so they decided to stop further development. In a world where data scraping is easy (see previous story) and AI accessible to everyone, this will not hinder the availability of such tools for very long.

On such matters, the UN High Commissioner for Human Rights, Volker Türk, warned that such tech in the hands of everybody would spell the end of privacy as we know it. He calls for better regulation and international cooperation.

To end this section of our Q3 2023 data protection overview on a positive note and to show that administrative action can have a positive impact: CNIL, the French privacy watchdog, closed a 2021 injunction against Google because the company finally complied.

The root cause of the proceeding was Google's method of collecting user consent for the processing of personal data, which CNIL found did not allow consent to be given freely. Google finally added a "reject all" cookies button on its search engine and on YouTube.

But only for French users.

Dutch Privacy watchdog looks into Generative AI data practices

On a similar note, Autoriteit Persoonsgegevens, the Dutch DPA, probes into concerns regarding data processing in connection with generative AI. In particular, the focus is on apps targeting children.

Using the apps, children can chat with AI bots, disclosing personal data. Children may have a very hard time even understanding whether they are talking to a person or a machine, and may additionally be unaware of the privacy consequences.

Data Protection and CyberSec in Global Politics and War

Last but surely not least, data protection and cybersecurity again played a large role in the context of ongoing conflicts and wars. Conflicts these days are played out not just on a battlefield or in political discussions, but also in the cyber domain – be it as information warfare or in the form of hacking, intelligence gathering, and sabotage of critical infrastructure and systems.

A multitude of hot conflicts and rising global tensions seem to fuel this process even further, blurring the distinction between activism, vandalism, terrorism, and outright war.

Russia tracks antiwar activists

In its effort to curb any protests against its war of aggression in Ukraine, Russia is electronically tracking protesters and activists, using a new set of surveillance tools. This includes relatively simple things like tracking targets' phones and analyzing their online activity. It extends to more sophisticated measures though, such as breaking encrypted services like WhatsApp or Signal or at least monitoring the use of such tools.

This obviously alarms the makers of such tools and activists alike. People wishing to evade state surveillance will have to find new best practices on how to use their communication channels – or switch to completely new forms of talking to each other and organizing themselves.

Chinese hackers steal government emails in US, EU

Unlike the Russian activity, which was targeted inwards, Chinese state-backed hackers took action against targets located outside the country. By forging access credentials, they gained access to US and European government email accounts, stealing a wide variety of communication.

According to spokespeople of the affected agencies, the attacks were clever, targeting not some unsuspecting victim via phishing, but breaking the technical measures of the IT-systems themselves. The sophistication of the attacks seems to give the affected governments and agencies some headaches. What's more: the hackers managed to get very targeted access to specifically chosen high-value targets. In other words: They knew very well what they were doing not only on a technical level, but also had a very good understanding of what they were looking for and where to look for it.

In a press release, Microsoft called the approach used "surgical" and used some wording that could be interpreted as almost admiring. The company recommended that technical data protection measures always be combined with organizational measures, such as periodic reviews of abnormal activity, the rotation of access credentials, and the implementation of strong privileged access security controls.
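The organizational measures mentioned above lend themselves to simple automation. As a minimal sketch – with entirely hypothetical policy values (90-day rotation, 50-logins-per-day baseline) and function names of our own invention – such checks could look like this:

```python
from datetime import datetime, timedelta

# Hypothetical policy values for illustration only.
MAX_KEY_AGE = timedelta(days=90)
MAX_LOGINS_PER_DAY = 50

def keys_due_for_rotation(keys, now):
    """Return IDs of credentials older than the rotation policy allows."""
    return [k["id"] for k in keys if now - k["created"] > MAX_KEY_AGE]

def accounts_to_review(login_counts):
    """Return accounts whose daily login count exceeds the review baseline."""
    return sorted(acct for acct, n in login_counts.items() if n > MAX_LOGINS_PER_DAY)
```

The point is not the code itself but the principle: credential rotation and abnormal-activity review are cheap, routine checks, yet they directly address the forged-credential attack vector described above.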

That, of course, sounds a lot like the TOM concept (technical and organizational measures) of the GDPR. Which shows once more: data protection and cybersecurity are two sides of the same coin.

Epilogue: Data Protection and Cybersecurity are the Same Thing

To end on a positive note: The importance of data protection and of privacy (as not just an end in itself but an important tool in business and government functions) becomes clearer to many actors.

Wojciech Wiewiórowski, European Data Protection Supervisor, wrote in his blog that data protection and cybersecurity are more or less the same thing, the two sides of a coin as discussed in the preceding section.

We at Engity have always held this view: good data compliance also protects IP, company secrets, reputation, and more often than not the mere existence of a business. For governments and official bodies, data protection measures can have the side effect of making them less vulnerable to state sponsored hacking attacks like the ones we discuss in this report.

In a time when data breaches and ransomware attacks (ransom payments have doubled in the first half of 2023) are on the rise in both number and cost, this simple truth needs to be heard more.