The controversial practice of shadow banning by social media platforms was recently scrutinized for its GDPR compliance by a court in Belgium. In its decision of 3 June 2024, the Court of Appeal in Ghent found that the shadow banning practices Meta imposed on Belgian right-wing politician Tom Vandendriessche breached the provisions on automated individual decision-making (Article 22 of the GDPR), as well as the information and transparency requirements regarding automated decision-making (Articles 13(2)(f) and 14(2)(g)).
Meaning of “shadow banning”
The term “shadow banning” has evolved primarily in public debates to encompass actions ranging from complete suppression to reduced visibility of certain accounts or posts through a platform’s search and recommendation features. In either case, the affected party is not aware of the sanction imposed. In the Belgian case, the court dealt only with shadow banning in the form of reduced visibility (i.e. “the reduction of the organic reach of the content”, as the court put it).
Meta’s shadow banning amounts to automated decision-making
In the Belgian case, after Tom Vandendriessche violated Facebook’s terms of use and Community Standards, Meta removed some of his messages and posts containing hateful language and imposed a shadow ban on others. The court found that both the removal and the shadow-banning decisions amounted to automated decision-making within the meaning of Article 22(1), as they did not entail “meaningful human intervention”.
The court distinguished between the sanction of removing several messages on the one hand and shadow banning on the other, and found that only in the latter case was the data subject “affected significantly”. Shadow banning therefore fell under the general prohibition in Article 22(1) of the GDPR, as it led to a “significant reduction of the organic reach” of the content. The removal of only a handful of Vandendriessche’s messages, by contrast, fell short of the “affects significantly” threshold.
Meta did not take appropriate protection measures when imposing the ban
The court further concluded that shadow banning fell under an exception in Article 22(2) of the GDPR. In other words, provided certain conditions were met, Meta could lawfully make a decision based solely on automated processing, because shadow banning was necessary for the performance of the contract between Meta and Tom Vandendriessche. That necessity arises from the fact that the Facebook Terms of Service and Instagram Terms of Service require Meta to ensure that Facebook and Instagram are safe for all users (including Tom Vandendriessche).
The court nevertheless concluded that in this specific instance Meta failed to satisfy the conditions that would entitle it to rely on Article 22(2). Those conditions are prescribed in Article 22(3), under which the data controller must implement suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests, including at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision. Meta took no such suitable measures when imposing the ban.
Meta’s privacy policy does not comply with information and transparency requirements
The court also considered whether Meta’s privacy policy complied with the information and transparency requirements regarding profiling and automated decision-making under Articles 13(2)(f) and 14(2)(g) of the GDPR. The court concluded that Meta infringed those provisions, as it did not provide information on the existence of automated decision-making, the logic underlying it, or the significance and expected consequences of that processing.
Comment
The judgment of the court in Ghent is one of two recent pronouncements by courts in the EU on the practice of shadow banning. While the court in Ghent ruled against Meta for violating the GDPR, the Amsterdam District Court found that X (formerly Twitter) acted contrary to the Digital Services Act (DSA). X failed to provide the applicant (a user of X) with “a clear and specific statement of reasons” for the “restriction on the visibility” of the information that user provided. Under Article 17 of the DSA, a statement of reasons must include, among other elements, specifics about the restriction imposed, the facts and circumstances considered, the use made of automated means in taking the decision, the legal grounds relied on, and the possibilities for redress.
The two cases illustrate that an affected party may challenge shadow banning by relying on two pieces of legislation with direct applicability in the EU member states: the GDPR and the DSA. The prohibition under GDPR Article 22 is limited to shadow banning carried out by automated means, while the prohibition under DSA Article 17 is more general: it does not depend on the means (automated or non-automated) used by the provider of hosting services. Both provisions, in essence, preclude shadow banning except in specified cases. By requiring that the platform notify the affected individual, they reject the constituent element of shadow banning: secrecy.