Opaque threats to online speech: the chilling effect and the shadowban
Written by
Lahis Kurtz
March 1, 2021
Get to know two buzzwords of online content control
A legal and media charade
Music can prevent people from expressing themselves, even at normal volume.
A Los Angeles activist, known for livestreaming police stops on his social networks, enters a police station. The police officer, seeing that he is being filmed, plays a song on his cell phone and stops responding to the conversation.
This real story, reported by Vice News, seems like a charade in which we must figure out what is happening, but it has a relatively simple answer.
The United States Digital Millennium Copyright Act (DMCA) pushes streaming services like these to take measures to prevent copyright infringement. As a consequence, algorithms are used to detect and remove, automatically or by flagging, content recognized as copyrighted material.
Silencing side effect
For most content, there are no legal consequences that would force platforms to proactively detect and remove posts – whether under United States law on intermediaries or under the Brazilian Internet Bill of Rights (Marco Civil da Internet). We have discussed this in other blog posts, like this one.
This regulation follows the logic that, if the service were liable for damages caused by content its users post, one likely outcome would be for the service to remove by default any content considered suspicious, without further analysis. Much legitimate content that could be an exception to the rule would be deleted; that is, the mere risk of liability would already operate as a limit on free expression. And that would be a very high cost to pay, as pointed out in IRIS contributions to Bill 2630 and to the Santa Clara Principles.
The case of the music played by the police officer is an example. This is what is called the "chilling effect": the discouragement of the exercise of rights as a side effect of a law or government act. The algorithm designed to comply with copyright law certainly was not meant to take down content showing police stops. But it ended up doing just that when it indiscriminately filtered a transmission that contained a song in the background. And this is not an isolated episode; there are several reports from the Electronic Frontier Foundation on the problems with this type of copyright protection.
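To make the side effect concrete, here is a minimal, entirely hypothetical sketch of how an automated copyright filter of this kind can over-remove content. The fingerprinting, catalog, and segment labels are invented for illustration; no real platform's system is described here.

```python
# Hypothetical sketch: a naive copyright filter that blocks an entire
# stream whenever ANY segment matches a licensed track, regardless of
# whether the music is the subject of the video or mere background noise.

LICENSED_FINGERPRINTS = {"song_abc123"}  # invented catalog of track fingerprints


def fingerprint(segment: str) -> str:
    # Stand-in for real audio fingerprinting (e.g. spectral hashing).
    # Here each segment is already labeled, for simplicity.
    return segment


def moderate_stream(segments: list[str]) -> str:
    """Return 'removed' if any segment matches the catalog, else 'published'."""
    for seg in segments:
        if fingerprint(seg) in LICENSED_FINGERPRINTS:
            # The filter has no notion of context: a police station playing
            # music in the background triggers the same takedown as an
            # actual music upload.
            return "removed"
    return "published"


# A livestream of a police stop, with a song playing in the background:
stream = ["activist_speaking", "officer_response", "song_abc123"]
print(moderate_stream(stream))  # prints "removed" — the whole broadcast goes down
```

The point of the sketch is the missing step: nothing in the decision asks *why* the music is there, which is exactly how a copyright mechanism ends up silencing a police-accountability broadcast.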
Invisibility Cloak
Now, imagine that an artist created content and decided to publish it on some social network or platform. Let's say that person has many followers. She expects the content to be seen and shared – in short, to reach a large number of people. But after some time passes, she realizes the post has very few views. When she mentions it to close contacts, she learns that the post never appeared for them.
This is one of the most opaque types of content control: a post has its reach reduced without any warning. Someone's content is made invisible (or silenced) without them even knowing it. The targets of this control do not know it is happening until they have concrete evidence. This is called a "shadowban". It affects several categories of businesses and artists, to the point that there is a petition for pole dancers to stop having their content restricted in this way. There is also a show of the same name that references how social networks treat burlesque artists and professionals in the field. The responses to the protests are still unsatisfactory, as reported on this blog by an artist.
Opacity levels and effects on free speech
These two concepts and episodes were not brought up at random: as you may have noticed, both are linked to transparency and freedom of expression. The two subjects are closely connected, and different situations illustrate different aspects of this interconnection.
In the case of the activist and the police, copyright law is clearly not intended to restrict expression; it seeks to protect the rights of creators. So at first it may seem that freedom of expression is not at stake. But the episode narrated above shows that something doesn't sound right (and it's not the music). It makes us think that the copyright enforcement mechanism should be revised, or that we at least need to discuss the matter. It also shows that there is a level of opacity in the effects of this type of protection: the focus is on a certain right, while side effects on another go unnoticed.
Another type of opacity is when we cannot even know whether, or how, content restrictions happen. Here the problem is deeper: although the restriction does not come from a law, it happens systematically, yet its criteria are even less open to questioning. If shadowbanned content is never flagged by the platform, how can I tell whether there are patterns being followed, or what they would be? It is not possible to know whether all content is being restricted, or only content that meets unknown criteria. It is difficult to criticize, or even question, a form of control whose reach is unknown.
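The asymmetry described above can be sketched in a few lines. This is a purely hypothetical illustration – the account names, the hidden list, and the scoring multiplier are invented – of why a shadowban is hard to detect from the outside: everything the author can see is unchanged, while the ranking silently multiplies their reach toward zero.

```python
# Hypothetical sketch of shadowban opacity: the user-facing status and
# the internal ranking disagree, and only the platform can see both sides.

SHADOWBANNED = {"pole_dance_studio"}  # hidden list, never shown to anyone


def visible_status(account: str) -> str:
    # What the author sees when checking their own account:
    # nothing indicates any restriction.
    return "account in good standing"


def feed_score(account: str, engagement: float) -> float:
    # What the ranking actually does: a silent multiplier that the
    # author has no way to observe directly.
    penalty = 0.0 if account in SHADOWBANNED else 1.0
    return engagement * penalty


print(visible_status("pole_dance_studio"))      # "account in good standing"
print(feed_score("pole_dance_studio", 1000.0))  # 0.0 — reach silently zeroed
print(feed_score("other_artist", 1000.0))       # 1000.0
```

Because `visible_status` and `feed_score` never contradict each other from the author's point of view, the only evidence of the restriction is statistical – exactly the "concrete evidence" problem the text describes.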
When discussing ways to make social networks safer and more user-friendly, it is common to overlook the side effects of measures that apparently only protect the public. We need to consider situations where such measures are excessive and limit what can or cannot be said – and, with that, what can be known. Transparency is an important condition for proposing changes to the parameters of what may or may not circulate on networks.
In the research project I worked on, we analyzed the levels and gaps of transparency in the community policies of eight major platforms. To find out what recommendations we made to ensure a free and democratic social network environment, read our paper!
The views and opinions expressed in this blogpost are those of the author.
Illustration by Freepik Stories.
Lahis Kurtz
Head of research and researcher at the Institute for Research on Internet and Society (IRIS), PhD candidate in the Law Programme of the Federal University of Minas Gerais (UFMG), Master of Law in Information Society and Intellectual Property from the Federal University of Santa Catarina (UFSC), Bachelor of Law from the Federal University of Santa Maria (UFSM).
Member of the research groups Electronic Government, Digital Inclusion and Knowledge Society (Egov) and the Informational Law Research Center (NUDI), with ongoing research since 2010.
Interested in: information society, law and internet, electronic government, internet governance, access to information. Lawyer.