Blog

The Telegram issue from the perspective of content moderation

March 15, 2022

Telegram is a messaging app of growing popularity, already present on 60% of Brazilian smartphones, according to a report published in February 2022. Given its lack of legal representation in the country and its failure to establish a dialogue with the authorities (the Superior Electoral Court and the Federal Public Prosecutor's Office), its blocking by the Judiciary has become increasingly likely. The main concern is that its features (such as groups with more than 200 thousand members and bulk message blasts) will be used to propel disinformation in this year's electoral scenario, intensifying so-called Digital Populism.

The purpose of this post is not to be an opinion piece or a news report on the Telegram affair, but to analyze the situation from the perspective of content moderation, a topic of study at IRIS.

First, a bit about the possible blocking of Telegram

The possible blocking of Telegram is a topic that provokes heated opinions. In a pre-election context, much has been said about the spread of fake news via groups and mailing lists, a justified fear after the 2018 elections. However, the word "blocking" raises concerns about users' freedom of expression, proportionality, and effectiveness, as well as questions about the legal and technical mechanisms that would actually make such an action possible.

In this scenario, the opinions in favor of blocking call attention to the fact that Telegram's representatives have deliberately refused to establish any kind of communication with the Brazilian authorities (up to the moment this text was published), standing out as the only major app that has not agreed to participate in the agreement reached with the TSE in February to combat the spread of disinformation in the electoral process. Given this, Telegram establishes almost its own regime of rules, an issue that will be elaborated on in the next topic. Nevertheless, the company operates on Brazilian soil, holding data and providing communications services for millions of users. Furthermore, the application, like many other messaging apps and social networks, contains deliberately antidemocratic groups that defend fascist, xenophobic, anti-vaccine, and pro-dictatorship speech, among many other topics that can be found through a simple search within the application. Another pro-blocking argument is that Telegram is just one of the communication media to which we have access, so its suspension would not make communication in virtual media impossible.

The opinions against the blocking, in turn, call attention to the fact that blocking the application would not be an instant answer to the issues at stake, and that it would remain technically possible to use the application in Brazilian territory through a VPN (Virtual Private Network), routing traffic through a private network in another country. Moreover, as Paulo Rená states, despite the illicit uses, Telegram also has many licit uses, so all users would pay the price for the actions of only a portion of them, highlighting the abusive character of the measure. In addition, there is no consensus on the legal mechanism that would make blocking possible, and some specialists even say that there is no legal provision for it.

What about content moderation?

To introduce the discussion on content moderation, we return to one of the issues presented by those who defend the possible blocking of Telegram: the lack of response by its representatives and the apparent creation of its own regime of rules. Despite its notable resistance to communicating with Brazilian authorities, Telegram does in fact develop its own guidelines and platform policies in a lawful manner. This is because the regulatory regime for content moderation adopted by the platform is self-regulation.

The regulation models can be divided into three main categories: self-regulation, hetero-regulation, and co-regulation. In the first, platforms, as private agents, both develop and apply their own internal regulatory guidelines, according to their preferences. It is the predominant content moderation model among the big platforms, in which providers have the autonomy and freedom to apply their own algorithmic logic without interference from other agents.

The second model is hetero-regulation, a regime in which the moderation rules are applied by an agent external to the one being regulated. This is the case in scenarios where governments decide what each platform's policies will be.

Finally, in co-regulation, the agent that is regulated participates in content moderation by also developing and enforcing the rules in question – but does not do so autonomously. The authorities act in conjunction with private agents. This co-regulatory model, in turn, is advocated in the Santa Clara Principles and is often seen as an alternative and balanced route to content moderation – although not yet widely explored in the theoretical and practical field.

Having said this, let's get back to Telegram. The application belongs to the large set of so-called self-regulated platforms, that is, platforms that develop, decide, and apply their internal rules without the participation of external agents: they self-regulate. Not coincidentally, this is the most criticized of the three content moderation regimes presented. However, the commercial power that platforms hold as large market players is one of the factors that keeps this regime predominant, since yielding to external regulation also means having economic interests affected.

Once the regulatory regime in which Telegram is embedded is understood, it is worth reflecting on what can be accomplished and by whom. Despite its self-regulation, as a platform and company operating in Brazil, the application is subject to national authorities and legislation, so its failure to establish any communication raises jurisdictional issues. In this scope, although the Brazilian authorities themselves cannot change Telegram's internal policies, once a dialogue takes place, positive measures can be proposed and then implemented by the application's providers. This is what happened with WhatsApp, which entered into an agreement with the TSE to combat disinformation in the 2022 elections. Another example is the case of Facebook and Instagram, which entered into a partnership with the Court for the same purpose, involving the creation of a channel for complaints directly to the TSE.

Following the developments

Whether one is for or against the blocking, it is important to keep in mind that Telegram is not a villain but a communication tool: an instant messaging application. Thus, it is necessary to develop a critical sense to analyze the antidemocratic actions that use the application as a channel, and to determine the best way to implement regulation and address the issues intensified by the network society.

The views and opinions expressed in this blogpost are those of the author. 
Illustration by Storyset.

Written by

Researcher at the Internet and Society Reference Institute. Bachelor of Laws from the Federal University of Minas Gerais (UFMG) and Master in Private International Law from the same institution. Member and project leader in the area of Digital Inclusion. Areas of interest: Content Moderation, Digital Inclusion, Digital Populism, and Political Law.
