What’s Up, WhatsApp?!

In a GDPR urgency proceeding, the Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI), Johannes Caspar, has taken action against Facebook. The aim of this proceeding – whose decision is expected before May 15, 2021 – is to comprehensively protect WhatsApp users in Germany, who are confronted with the company’s new terms of use. Against this backdrop, May 15 stands out as an important deadline, since by that date users must consent to data processing by the parent company Facebook. The fear is that the data will be used in particular for marketing purposes, which goes beyond the scope of analysis and security.

After Facebook announced the new terms and conditions at the beginning of this year, a discussion arose. As a result, the company decided to postpone their introduction to May. With many millions of WhatsApp users in Germany alone, Johannes Caspar stressed the importance of having functioning institutions in place in order to prevent the misuse of data power. Caspar could not exclude that the data-sharing provisions between WhatsApp and Facebook would be enforced illegally, due to the lack of voluntary and informed consent. In order to prevent a potentially unlawful exchange of data and to put an end to any impermissible pressure on users to give their consent, the formal administrative procedure was initiated.

Based on Art. 66 GDPR (“exceptional circumstances”), the emergency procedure is aimed at the European headquarters in Ireland. The American company has been given the opportunity to state its position, although it can be expected that Facebook will consider its adjustments sufficient. Notably, the Hamburg data protection authorities had already issued an injunction against such data matching in 2016. Although Facebook took legal action, the company did not prevail in court (OVG Hamburg, February 26, 2018 – 5 Bs 93/17 – K&R 2018, 282).

The outcome of the proceedings in Hamburg is eagerly awaited, since it may have an impact on the entire European market, given the direct applicability of Art. 66 GDPR in the Member States. Although the decision from 2018 could indicate a trend, the outcome remains open.

Dario Henri Haux

See the media statement at: https://datenschutz-hamburg.de/pressemitteilungen/2021/04/2021-04-13-facebook

Unwanted Email Advertising: Trivial Damage or Significant Data Protection Violation?

In a case concerning damages for unwanted email advertising as a data protection violation, the German Federal Constitutional Court (BVerfG) recently ruled that the European Court of Justice (ECJ) has to decide on the interpretation of the prerequisites and scope of Art. 82 (1) GDPR. It has to be clarified how Art. 82 (1) GDPR is to be interpreted against the background of Recital 146: in other words, under which conditions the article grants a claim for monetary compensation.

The plaintiff, a German lawyer, had received one (!) unsolicited advertising email and sued for injunctive relief, access to the stored data and damages of at least 500 €. Whilst the District Court of Goslar (September 7, 2019 – Case No. 28 C 7/19) upheld the claims for injunctive relief and access, it refused to award immaterial damages, holding that no such damages were evident. Hence, the threshold for monetary compensation for a violation of personality rights had not been exceeded. In response, the plaintiff filed a constitutional complaint, arguing that the decision of the District Court violated his right to the lawful judge under Article 101 (1) sentence 2 of the German Basic Law (GG). He claimed that the District Court had wrongly refrained from submitting the question of the threshold for GDPR damage claims to the ECJ for a preliminary ruling.

In its ruling, the BVerfG emphasized that a claim for damages under Art. 82 GDPR may not be denied merely because the loss is minor or trivial. The Federal Constitutional Court underlined that the District Court would have had to make a preliminary reference to the ECJ beforehand. Hence, the plaintiff’s right to the lawful judge under Article 101 (1) sentence 2 of the German Basic Law (GG) had been violated: by interpreting European Union law itself, the District Court disregarded its obligation to refer the matter to the ECJ by way of preliminary ruling proceedings pursuant to Article 267 (3) of the Treaty on the Functioning of the European Union (TFEU).

The court in Goslar must now decide anew, and it can be assumed that the judges will refer the questions to the ECJ. It is to be hoped that the European judges will define a de minimis threshold for damages due to data protection violations. A clarification of all underlying issues is not to be expected; however, general principles of high practical relevance, already discussed in literature and case law, may be laid down.

Dario Haux

German Federal Constitutional Court, Decision of the 2nd Chamber of the First Senate, January 14, 2021 – 1 BvR 2853/19 –, paras. 1–24

The Court of Rome orders a mother to stop publishing content relating to her underage son on social networks

With the countdown to the GDPR almost at its end, this interim order from the Court of Rome (full Italian text here) has been widely debated in Italy in recent days.

These are the facts, in sum:

  • During divorce proceedings, a mother had been publishing on social networks numerous photos, videos and posts relating to the lawsuit, including information about her son, a 16-year-old boy;
  • The son was deeply frustrated by this situation. Details about him were constantly disseminated on social networks by his mother, and his story, in all its details, became known to all his schoolmates. He started suffering serious psychological effects; in particular, he was scared of being discriminated against and considered “different” by his mates because of his private “issue”;
  • For this reason, the guardian of the son (previously appointed in other proceedings) asked the Court to confirm his right to attend a US college and start a new life far away from this distressing situation.

In the interim injunction, the Court ordered:

(i) the mother to stop publishing on social networks and any other media images, information and any other data relating to her son, and (ii) to remove all such content published so far on social networks; additionally, the Court (iii) fixed a monetary sanction for any violation of these orders; and

(iv) the guardian of the son to ask search engines to de-list, and social networks to remove, all images, information and any data relating to the boy.

The interim decision is certainly reasonable. The Court did not rely on a particular legal qualification of the matter, but apparently based its decision on general civil law principles, considering the psychological damage suffered by the boy. The decision thus seems to refer to a general right to privacy (even before considering the question of fair processing of personal data).

The case confirms how risky social networks can be for our privacy, due to their massive media effect. Apart from this specific case, we should wonder about the possible negative consequences of posting certain content about third parties, consequences we might not foresee. This is particularly true for underage people, whose privacy deserves particular attention – also in the long term (for a view on the possible negative consequences of parental oversharing, see for example here: ‘it’s difficult for an individual to control that information once it’s out there. When it comes to our children, we’re making the decision to put things out on their behalf, and what seems appropriate now may not be appropriate in ten years’ time’).

Even before the GDPR, EU data protection legislation required consent for publishing content about third parties on social media, with some exceptions, such as news reporting (which remains mostly a matter of national law). These exceptions do not, of course, cover a parent publishing data about his or her underage son. The GDPR now pays new attention to:

(i) the processing of data of underage people (Recitals 38, 58, 65, 71, 75) – the GDPR requires parental consent for the use of information society services (Art. 8). Although limited to this type of service, it sets the legal age for data protection choices (i.e., “digital consent”) at 16 years (with some flexibility for national legislation, which may not set this age below 13 years). Whether this also applies to the exercise of privacy rights is difficult to say (Recital 65 seems to confirm this option), and national legislation will probably have to be considered as well. Recital 38 states that consent by a parent or guardian is not required in the context of preventive or counselling services offered directly to a child. For example, as the WP29 Guidelines on Consent under the GDPR clarify (here), the provision of child protection services offered online to a child by means of an online chat service does not require prior parental authorization.

(ii) the risk of discrimination – a risk we should often consider in certain personal data processing operations (an aspect we have discussed in relation to profiling activities here).

Francesco Banterle

Court of Rome – Judge Monica Velletti – order 23 December 2017