EU Data Protection Update Q1-2022


Laws, regulations, and data protection practices are changing faster than ever. In the first quarter of 2022, many developments in the field concerned data transfers to so-called third countries: countries where the EU General Data Protection Regulation (GDPR) does not apply and for which the European Commission has not issued an adequacy decision. Such transfers require a legal basis in order to be permitted, and there have been many new developments in this area.

There was, of course, also a flurry of legislative and administrative activity by data protection supervisory authorities.

New Standard Contractual Clauses gain traction

According to the GDPR, certain contractual clauses may serve as a means of making data transfers to third countries possible by providing data protection safeguards. First and foremost, this includes model contract clauses, the so-called standard contractual clauses (SCC): a set of clauses pre-approved by the European Commission. On 4 June 2021, the Commission issued a modernized version of these standard contractual clauses.

All new contracts must already use the new SCC; the deadline for updating existing contracts is 27 December 2022. While this seems to leave plenty of time, many businesses have already begun switching to the new clauses.

Adopting the new SCC requires more than "just" updating some paperwork, especially in light of the "Schrems II" judgment of the European Court of Justice (ECJ). An assessment of the laws of the destination country must be carried out and documented: a Transfer Impact Assessment (TIA). The data exporter needs to make sure that it can comply with its own obligations under the GDPR despite the export.

For enterprises with many transfers into different jurisdictions, this can be an arduous task. Adopting the new SCC may therefore also serve as a good opportunity to assess the feasibility of switching to tools and service providers located in the EU, or in countries with a similar level of data protection, to be on the safe side.

USA and EU agree on Privacy Shield 2.0 – in principle

It is hard to deny that US companies lead in many data-driven technologies and offer valuable, convenient tools. It was therefore advantageous to have a mechanism, the so-called Privacy Shield, that made data transfers from the EU to the USA quite streamlined: a set of assurances from the U.S. government and an adequacy decision from the European Commission based on them. The US was thus treated as a safe third country under the GDPR.

The agreement was criticized from the start, as it did little to restrict access by US intelligence agencies to the transferred data and did not offer data subjects a reasonable way to exercise their rights. In its landmark ruling of 16 July 2020, the European Court of Justice (ECJ) declared the EU-US Privacy Shield invalid (the so-called "Schrems II" ruling, Case C-311/18).

The lack of a convenient way to transfer data to processing partners or service providers in the USA was painfully felt, though. Therefore, on 25 March 2022, the European Commission and the US announced an agreement in principle to replace the invalidated Privacy Shield with a new framework, currently officially named the Trans-Atlantic Data Privacy Framework. Unsurprisingly, it is commonly referred to as the "Privacy Shield 2.0". The new framework is meant to address the concerns raised by the ECJ in its ruling on the previous Privacy Shield.

While there is strong business and political support for the Privacy Shield 2.0, many detailed questions remain unanswered. Data protection groups have already voiced concerns about the defensibility of the new framework in court and pointed out that it may well fall short of the requirements set forth by the ECJ in the Schrems II ruling. In addition, privacy activist Max Schrems has already announced that if the framework is introduced, he will again take it to the European Court of Justice.

Update on EU guidelines on Codes of Conduct for data transfers

Another mechanism that can enable data transfers to third countries is an approved code of conduct governing such transfers and providing appropriate data protection safeguards, Art. 46(2)(e) in conjunction with Art. 40 GDPR. To clarify the respective requirements, on 4 March 2022 the European Data Protection Board ("EDPB") published an updated version of its "Guidelines on Codes of Conduct as tools for transfers".

The Guidelines provide a handy checklist of elements to be covered by codes of conduct. As a rule, codes of conduct must address the essential principles, rights, and obligations for data controllers and processors under the GDPR. Furthermore, guarantees specific to the context of the transfers must be provided. In particular, the code must include a warranty that the laws applicable in the country to which the personal data are transferred do not prevent the importing data controller or processor from fulfilling its obligations under the code of conduct. The code of conduct must be approved by a competent supervisory authority to become a valid ground for data transfers.


New DPR for European Patent Organisation

The Administrative Council of the European Patent Organisation adopted new Data Protection Rules (DPR), which entered into force on 1 January 2022. The goal of the rules is to follow international standards and adopt best practices in all activities of the European Patent Office (EPO). In particular, the new rules strive to implement the principles of proper data protection more fully and to ensure that data subjects can exercise their respective rights effectively.

Enforcements & Fines

Among a large number of smaller enforcement actions, some major cases stand out.

Clearview AI

Clearview AI, a US-based company, was fined 20 million euros by the Italian supervisory authority for processing personal data without a valid legal basis. Clearview AI processes sensitive personal data such as biometric and geolocation information, in particular facial images scraped from public web sources. Its services and technology are often provided to US law enforcement agencies to identify criminals. While this may be laudable, Clearview processes personal data of EU citizens stemming from EU sources without a proper legal basis. While law enforcement can in principle provide such a basis, it is hard to see how the enforcement needs of US agencies could justify data processing in the EU. Moreover, Clearview AI failed to address fundamental principles of data processing under the GDPR, such as transparency, purpose limitation, and storage limitation. It also provided no information pursuant to Articles 13 and 14 GDPR as to where and how the processed information was collected. Lastly, it failed to provide information on data subject rights under Article 15 GDPR within the required timeframe.

Meta Platforms Inc.

Meta Platforms Inc. was fined 17 million euros by the Irish Data Protection Commission ("DPC") for a string of data breaches that occurred in 2018. Meta infringed Articles 5(1)(f), 5(2), 24(1), and 32(1) GDPR, mainly by not adopting appropriate technical and organizational data protection measures. As a consequence, hackers were able to gain access to the personal data of approximately 50 million Facebook users, resulting in a major data breach.

Berlin data protection authority finds Cisco Webex not compliant

The Berlin data protection authority (Berliner Beauftragte für Datenschutz und Informationsfreiheit) found the use of the SaaS web conferencing software Cisco Webex to be non-compliant with the GDPR.

The main points of criticism were the unlawful transfer of personal data to the US, a flawed data processing agreement (DPA), and the access powers of U.S. authorities, which are incompatible with European law. These criticisms apply to most similar systems as well.

Google Analytics found to be not GDPR compliant

Google Analytics, a tool that enables the tracking of user behavior, has long been suspected of data protection issues. Decisions by the Austrian data protection authority (22 December 2021) and by CNIL, the French data protection authority (15 February 2022), found Analytics to be deficient: the tool transfers data to the US and thus creates risks for the personal data of the EU citizens concerned, as no viable legal ground for such transfers was in place.

In a decision published on 6 January 2022, CNIL fined Google 150 million euros. Users of Google and YouTube have a much harder time rejecting cookies than simply accepting them, a so-called "dark pattern": while accepting can be done in bulk with one click, several clicks are required to reject cookies. As every internet user knows, it is rather annoying to click through multiple pages of explanatory text, so simply accepting everything was the default option chosen by most users of Google's services, making the cookie consent less voluntary than it should be.