EU Data Protection Update Q4 / 2022

The last quarter of 2022 saw no shortage of interesting developments in data protection and, in particular, data security. There was plenty of activity in the industry as well as on the legislative and administrative side. The EU in particular reached a political agreement with the US regarding data transfers and started to move towards adoption of this "Privacy Shield 2.0". At the same time, tech giants Microsoft and Meta made serious attempts to close some obvious gaps in how they meet their obligations to protect their users' data, while Apple seriously tarnished its positive image in this regard.

And last but not least, the quarter saw a flurry of enforcement activity. A key takeaway of the decisions discussed in this overview is certainly that regulators are putting more emphasis on Transfer Impact Assessments (TIAs).

Twitter's Privacy and Data Security Is in Question after Musk's Takeover

Without a doubt, the most entertaining privacy-related events happened at Twitter. In a disruptive move, Elon Musk, not known for subtlety, bought the chattering classes' favored social network for 44 billion USD. That move was followed by an immediate and, reportedly, chaotic layoff of more than half of the company's staff.

In the aftermath of those events, Twitter's Chief Information Security Officer, Chief Privacy Officer, and Chief Compliance Officer resigned from their respective posts. In reaction to that, Twitter appointed an "acting" Data Protection Officer (DPO) as required under EU law and informed its European supervisor, Ireland's Data Protection Commission, accordingly.

Some regulators, most notably the US Federal Trade Commission (FTC), voiced concerns regarding Twitter's ability to ensure compliance with data protection and security legislation. Time will tell how justified these concerns are.

Global Data Breaches Continue the Upward Trend

A study carried out by Surfshark, a cyber security company, found that data breaches rose dramatically in 2022 – by 70% from the second to the third quarter, and rising further in Q4. A staggering number of almost 110 million accounts were breached worldwide. While Russia has the most breached accounts overall, Indonesia saw the steepest growth in breaches. The US also has a high number of such incidents. The fact that Belarus, too, reported a steep increase may point to the war in Ukraine acting as a driver in this regard: our data protection overview for Q2/2022 already discussed this possibility.

Abusive Wave of Cease-and-Desist Letters re: Google Fonts

Germany was hit by a tidal wave – tens of thousands – of cease-and-desist letters demanding damages for the use of Google Fonts on websites.

The basic idea is that, when dynamically linked, Google Fonts are not stored locally but instead loaded from Google's servers. In the process, personal data of the website's users (including their IP addresses) is automatically transmitted to Google. This way, the users no longer have any control over the processing of their data, which – so the argument goes – is an unacceptable violation of their general right to privacy.

The cease-and-desist business gained traction after the Munich district court ruled that the dynamic embedding of Google Fonts on websites indeed violated data protection law and the personal rights of such sites' users. Resourceful profiteers turned that precedent into a business opportunity by scanning websites in large numbers to identify those that use Google Fonts. That fine business idea may, however, have led to the downfall of the allegedly damaged parties: prosecutors and police seem to think that a bot cannot be violated in its personal rights as it has none. Thus, the police raided offices and homes to secure evidence and funds.
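
Technically, both the profiteers' scanning and an own-site compliance check boil down to the same thing: looking for references to Google's font servers in a page's HTML. The following is a minimal Python sketch of that idea, using only the standard library; the host list, function name, and example URL are illustrative choices of ours, not part of the court ruling or of any scanner actually used.

```python
import sys
import urllib.request

# Hosts that indicate fonts are loaded dynamically from Google's servers
GOOGLE_FONT_HOSTS = ("fonts.googleapis.com", "fonts.gstatic.com")

def uses_remote_google_fonts(url: str) -> bool:
    """Return True if the page at `url` references Google's font servers."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    return any(host in html for host in GOOGLE_FONT_HOSTS)

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "https://example.com"
    if uses_remote_google_fonts(target):
        print(f"{target} loads fonts from Google's servers - consider self-hosting them.")
    else:
        print(f"{target} does not appear to load Google-hosted fonts.")
```

Self-hosting the font files removes the third-party request altogether – and with it the transmission of visitors' IP addresses to Google.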

Legislative Action

New EU Standard Contractual Clauses finally in Effect

On 27.12.2022, the transition period ended and the new Standard Contractual Clauses (SCCs) finally took full effect, even for data transfers that had been grandfathered under the old version of the SCCs.

French data protection authority CNIL issued a reminder that the previous version of the Standard Contractual Clauses can no longer be used to regulate data transfers to third countries outside the EU / EEA. The transfer tool needs to be updated lest such data transfers become, in fact, illegal.

It is worth noting again that, according to the "Schrems II" judgment of the European Court of Justice (ECJ), even when the new version of the SCCs is being used, an additional Transfer Impact Assessment (TIA) is needed to determine which supplementary measures have to be put in place to guarantee data security and privacy. This is true in particular for data transfers to the US.

EU-US Privacy Shield 2.0 draft

The EU Commission released the draft of its adequacy decision for data transfers to the US based on the EU-US Privacy Shield 2.0. A new transfer mechanism is sorely needed, as Privacy Shield 1.0 (and its predecessor, the Safe Harbor agreement) was declared invalid by the European Court of Justice (ECJ).

Reactions to the draft were mixed, as was to be expected. The Hamburg data protection authority saw a change for the better on the US side and asked commentators to refrain from unfounded blanket criticism of the US legal situation. However, it also made clear that the practical implementation of the Privacy Shield will be crucial, as right now it is "just" a political agreement.

At the same time, Max Schrems, one of the few people to have not one but two ECJ rulings named after him, having successfully challenged EU-US privacy agreements, seems to be preparing for Schrems III.

Ongoing discussion about the planned EU AI Act

Within the EU, there is an ongoing discussion about the planned Artificial Intelligence Act (AI Act), which is meant to provide a framework for Artificial Intelligence (AI) in the European Union and to ensure its safe and ethical use.

The AI Act's principal mechanism is to categorize AI systems by risk and stipulate duties accordingly. One key aspect is the protection of personal data. To this end, the act seeks to ensure that AI systems are developed in a manner that respects fundamental rights and freedoms, including the right to privacy and data protection. Furthermore, it establishes a set of rules, including requirements for transparency, accountability, and fairness. It will also require organizations to carry out impact assessments to identify any potential risks to fundamental rights and freedoms posed by the use of AI.

Digital Operational Resilience Act adopted by EU Council

The financial sector is very much dependent on information technology. At the same time, without dependable financial institutions, a complex economy like that of the EU cannot function properly. Therefore, it is of vital importance to ensure that the financial sector is properly equipped to handle cyber threats.

To that end, the EU Council adopted the Digital Operational Resilience Act (DORA) on 28.11.2022, which aims to bolster the resilience of the financial sector in Europe. The law addresses cyber threats and harmonizes regulatory requirements across the Union regarding digital operational resilience. All financial firms need to make sure they can withstand, respond to, and recover from all types of disruptions and threats in the digital domain. The relevant regulatory authorities will now begin to develop technical standards that all financial services providers will have to observe.

In particular, all sources of risk will have to be continuously monitored and preventive measures will have to be established. This will have to be accompanied by business continuity policies and disaster recovery plans. Furthermore, companies will be required to establish mechanisms allowing them to learn not just from their own incidents, but also from events happening at third parties.

Digital Markets Act Entered into Force in the EU

On 1.11.2022, the EU Digital Markets Act (DMA) entered into force. While primarily a law to end unfair practices by "gatekeepers" in the online platform economy (such as operating systems, browsers, search engines, cloud services, and social media platforms), the act also has data protection repercussions.

The main idea is that gatekeepers – digital platforms with a certain market power – can basically set their own rules. The DMA wants to ensure those private rules are in line with the EU's ideas and therefore establishes do's and don'ts to be implemented.

On the data protection and privacy side, the DMA prohibits gatekeepers, among other things, from

  • Combining, without the users' consent, personal data obtained from their different services and subsidiaries.
  • Using the personal data of users of third-party services that, in turn, make use of the gatekeepers' core platform services.
  • Preventing users from raising issues of non-compliance with EU or national law.

Industry Privacy Initiatives and Failures

Meta cracks down on "surveillance-for-hire activities"

In 2022, Meta, owner of Facebook, Instagram, WhatsApp, and other services, banned seven companies that were involved in surveillance-for-hire activities. The latter basically means spyware in the hands of private enterprises.

In a report on the matter, Meta asked for industry and democratic institutions to work together in tackling the problem. Meta also proposed concrete recommendations to that end, including

  • a ban on the sale of surveillance software,
  • establishing institutions to help victims seek legal recourse, and
  • using export control lists to limit the availability of surveillance technologies.

The problem seems to be a pressing one: Meta had notified users in 200 countries that they had been targeted by spyware. The victims include journalists, members of the political opposition, and human rights activists, but increasingly also "normal" users.

Microsoft introducing "EU data boundary"

Microsoft started the rollout of its EU data boundary, allowing EU customers of its cloud services to process data within the region. The idea is to prevent data transfers to the US, thus enabling customers to stay GDPR-compliant.

We at Engity doubt that this initiative solves the underlying problem, as the US CLOUD Act allows US law enforcement and intelligence agencies to access data even on servers located in the EU, as long as the company owning those servers is based in the US. This is clearly the case for Microsoft. In fact, it was Microsoft that was the reason for establishing the US CLOUD Act in the first place. On similar grounds, many data protection activists and legal scholars share our concerns: it will be very hard to square the circle and make the Microsoft cloud compliant with EU data protection law.

At this time, only customer data can be stored within the EU data boundary. Other types of data such as logging data and service data are set to follow in 2024.

Apple tracks personal data even when it promises not to

Apple has cultivated an image of being the guardian of privacy and data protection – a safe harbor in a predatory world of privacy sharks. Yet it seems as if Apple treats itself as more equal than everybody else.

A study revealed that the iPhone privacy settings that let users turn off tracking of personal data do not work vis-à-vis Apple itself. This is despite the clear wording of those settings, which promise to "disable the sharing of Device Analytics altogether." That may well be true for third parties, but the study found no effect on Apple's own data collection. In fact, the study found "the level of detail (of data collection) shocking". Apple seems to collect data about even the slightest user interaction, regardless of any privacy settings the user has chosen.

The report also looked at the behavior of competitors such as Google and Microsoft. In their respective browsers, data collection does indeed seem to stop when analytics is turned off.

Administrative Data Protection Initiatives

ICO gives guidance on Transfer Risk Assessments

The U.K. Information Commissioner's Office (ICO) published helpful guidance on international data transfers, in particular information on transfer risk assessments (TRAs) and a TRA tool.

The ICO very clearly sees its tools as an alternative to the European Data Protection Board's comparable, but rather complicated, methodology. In contrast, the ICO wants to offer a more reasonable and proportionate approach.

We at Engity understand data protection as a function that enables business rather than stifling it, and therefore we welcome a hands-on approach. It remains to be seen to what extent this can be achieved in practice with the tools provided by the ICO.

Dutch DPA on privacy vs. anti-money laundering

Autoriteit Persoonsgegevens, the Dutch data protection watchdog, showed that even very well-intentioned laws can seriously collide with privacy. In particular, a planned anti-money laundering law would, in its analysis, lead to mass surveillance of customers, carried out by banks.

If the law were passed, Dutch banks would have to collect and monitor the payment behavior of all Dutch people and store those – very sensitive – data in one single database. This would, of course, open the door to all kinds of security concerns. But it might also mean that once people are flagged in that central database as an AML risk, they might find it impossible to open an account with any bank at all.

The data protection watchdog points out the need to balance preventive measures against people's fundamental rights. In particular, there should be no blanket surveillance of everybody, at all times, without any specific reason. People should be treated as innocent until proven guilty.

DSK bans use of MS Office 365 for lack of compliance

The German DSK (in full: the Conference of the Independent Data Protection Authorities of the Federation and the States) issued an opinion on the use of Microsoft Office 365 in schools and found such use to be impossible under current data protection legislation. This is mainly due to a lack of transparency about how personal data are being used, but also due to potential access by third parties, meaning US intelligence and law enforcement agencies.

While technically the opinion is only concerned with the use of Office 365 in schools, it is very hard to see how the same reasoning would not apply to other sectors as well.

DSK updates its Standard Data Protection Model to V 3.0

The DSK has also published the updated version 3.0 of its Standard Data Protection Model (SDM). The idea of the SDM is to translate the legal requirements of the GDPR into technical and organizational measures. It aims to turn an abstract catalog of data protection aims into very concrete to-dos. A further aim is to create a standardized framework for authorities as well as businesses.

Version 3.0 condenses and clarifies some of the SDM's tools, making the concept easier to use.

Enforcement

CNPD fines National Institute of Statistics 4.3 million EUR

Portugal's CNPD issued a fine of 4.3 million EUR against the National Statistics Institute (INE) for a series of serious data protection failings. These include the unlawful processing of sensitive personal data relating to health and religion, failure to properly select sub-processors, and infringement of data transfer rules. In particular, the CNPD found that data transfers to the USA took place without any additional measures to prevent access to the data by third-country government entities. Such measures are, however, necessary according to the case law of the European Court of Justice (namely the Schrems II judgment).

It is a welcome development that data protection authorities do not only monitor private enterprises but take an assertive approach to privacy violations by public bodies as well.

Clubhouse fined 2 million Euro for failing pretty much every privacy rule

Garante, Italy's data protection authority, imposed a fine of 2 million Euro on Alpha Exploration, the owner of Clubhouse, an audio-centered social media service. In its investigation, Garante found numerous privacy violations, ranging from a lack of transparency to profiling and the sharing of content and data without consent.

The authority not only handed out the fine but also gave out some homework to Clubhouse, namely to address all the failings and to carry out an impact assessment on the data processing.

Clubhouse has only 90,000 users in Italy. Other data protection regulators may take similar action; the likelihood of this may very well depend on how fast and how thoroughly Clubhouse fixes its data protection shortcomings.

Easylife Ltd fined 1.35 million GBP for targeted ads based on health profiling

Easylife Ltd, a British catalog retailer, was fined 1.35 million GBP by ICO, the UK's independent regulator for data protection and information rights law.

The data watchdog's investigation found that Easylife would profile its customers, make assumptions about their health status, and then target them with corresponding products. Easylife would, for example, assume that a person who bought a certain jar opener might have arthritis and offer them related products. All this would be fine, and could even be seen as a service, had it been done with consent. Instead, the profiling was invisible to the customers, and highly sensitive data were collected.

Discord fined 800,000 Euros for lack of data security

The French CNIL fined Discord Inc. 800,000 Euros over a lack of compliance with the GDPR. Among other issues, Discord failed to define data retention periods, did not carry out a data protection impact assessment, and did not ensure the security of personal data. The latter failure is rather embarrassing for a technology company: Discord let users walk away with weak passwords.
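
As an aside on that last point, a basic password policy is the kind of minimal safeguard at issue here. The following Python sketch is purely illustrative – it is neither Discord's implementation nor a requirement spelled out by CNIL, and the length and complexity thresholds are assumptions of ours.

```python
import re

# Illustrative password policy: a minimum length plus basic character variety.
# The exact thresholds are assumptions chosen for this example.
MIN_LENGTH = 12

def is_acceptable_password(password: str) -> bool:
    """Reject passwords that are too short or lack basic character variety."""
    if len(password) < MIN_LENGTH:
        return False
    has_lower = re.search(r"[a-z]", password) is not None
    has_upper = re.search(r"[A-Z]", password) is not None
    has_digit = re.search(r"\d", password) is not None
    return has_lower and has_upper and has_digit

print(is_acceptable_password("hunter2"))           # False: far too short
print(is_acceptable_password("Correct-Horse-42"))  # True: long enough, mixed characters
```

Real-world sign-up flows typically go further, for example by checking new passwords against lists of known breached passwords and by rate-limiting login attempts.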

Despite the numerous compliance failings and the large number of customers concerned, the fine is rather mild. CNIL very explicitly makes the point that this is due to Discord's cooperation and serious attempt to address the problems that came to light.

The lesson for companies is that it pays to take the regulators seriously and to act constructively. At the same time, the lesson for regulators – well applied by CNIL here – is to reward such a cooperative approach by the industry. This results in a triple win for corporations, regulators, and users.