Automated decisions and algorithmic transparency

Written by

November 6, 2019

Whether in the European Union, where the GDPR (General Data Protection Regulation) is in effect, or in Brazil, where the General Data Protection Law (Law No. 13,709) is about to enter into full force, one of the issues that most intensifies debate about the applicability of these laws is the possible (in)compatibility between trade secrets and the rights to review and to explanation of automated decisions. As we seek to explain the rationale and parameters behind algorithmic decisions, we also question the sustainability of business models built precisely on the competitive advantages of the automation these applications provide. In today’s political-economic context of surveillance capitalism and digital value production (GAFA – Google, Apple, Facebook and Amazon), is it really possible to guarantee algorithmic transparency? This is the question this post seeks to answer.

Trade secrets: an algorithmic black box?

One of the most important aspects of trade secrecy is that it is a relational concept, concerning competition between certain companies in certain markets. Unlike other categories of intellectual property, such as trademarks and patents, companies holding trade secrets do not file them with a specialized government registry to establish novelty and priority of registration.

Trade secrecy refers to a competitive advantage in the company’s business model, whose secrecy and confidentiality are precisely what ensure its protection against competitors. As long as these processes, formulas, know-how, methodologies, production flows, forms of organization and techniques are kept secret, the company also preserves its competitiveness in the market. For this reason, companies that deal with trade secrets in their day-to-day operations use confidentiality agreements with employees, business partners and others who may have access to them, as well as specific procedures to prevent leaks of this information.

In the digital economy, even though computer program code may fall under different categories of intellectual property protection depending on the jurisdiction (patent protection in the US and a specific category of copyright protection in Brazil), several companies choose to protect it as a trade secret. Beyond the various levels of technical protection, the complexity of this code is such that it is difficult to accurately identify all the steps and parameters involved in an algorithm’s decision making, which reinforces its treatment as a trade secret.

For example, Google’s search engine is known to take into account users’ personal data, geolocation and browsing history to determine which links will be most relevant when ranking items on its results page. Precisely because it uses a myriad of personal information in making decisions for its users, Google’s engine is deemed one of the most efficient, fast and useful in its business. Likewise, companies understand that protecting the decision-making steps and parameters behind these algorithms, that is, their trade secrets, is precisely what makes them more competitive in their respective relevant markets. In this context, protecting the algorithm represents an important step towards maintaining the competitive position of these business models.

A right to explanation and a right of review of automated decisions

With the goal of ensuring a minimum of transparency and accountability for companies that process users’ personal data automatically, both the General Data Protection Regulation (GDPR) and its Brazilian counterpart (Law No. 13,709) provide minimal instruments for controlling the decision-making processes implemented by digital economy companies. Under the GDPR, users are entitled to transparent information, access to the personal data that companies use, data portability and objection to certain profiling, among other prerogatives.

In Brazil, there is a right of review of automated decisions (Article 20 of Law No. 13,709). As the law currently states,

“the data subject is entitled to request the review of decisions made solely on the basis of automated processing of personal data affecting his or her interests, including decisions intended to define his or her personal, professional, consumer or credit profile, or aspects of his or her personality”.

However, owing to a presidential veto, this review need no longer take place through human intervention. As if the risks of automating even the review process (which in the European Union requires human participation in the contestation process) were not enough, Brazilian law requires the provision of information about the criteria and procedures used for the automated decision only insofar as this does not threaten trade secrets. That is, even minimal criteria for automated-decision transparency are limited by the trade secrets of the company that controls the data.

European legislation goes further and also provides for a right to explanation (Recital 71), an interpretative and operational rule of the GDPR that does not necessarily carry the same binding force as an article in the body of the regulation. It determines that automated data processing that assesses personal aspects, produces legal effects (online credit scoring and e-recruiting are given as examples) or profiles the economic situation, work, health or personal interests of users should be subject to specific safeguards, including explanatory information for the data subject affected by the decision and the possibility of contesting it.

Brazilian researchers, such as Prof. Renato Leite Monteiro, understand that a comprehensive interpretation of the LGPD, in conjunction with the Constitution, consumer law and other legal provisions, guarantees the existence of a right to explanation in Brazil. However, this position still demands jurisprudential consolidation. By way of comparison, the researcher points out that the GDPR mentions trade secrets only once, despite being a more comprehensive and detailed law, while the LGPD does so in 13 instances, often in opposition to algorithmic transparency criteria. This denotes the weight Brazilian legislation gives, in literal terms, to the protection of trade secrets, even where this relativizes users’ rights to review and to informational transparency.

Although conceptually distinct, it remains to be seen in practice how the right to review and the right to explanation will apply in the European and Brazilian jurisdictions. In practical terms, there is a degree of insecurity among technology-intensive companies, especially startups, about the limits and responsibilities of their business models. On the other hand, there is greater awareness among the population of the rights they hold over their personal data. Certainly, the legislation represents an advance in terms of guaranteeing the fundamental right to privacy.

Competitive issues, algorithmic transparency and innovation: towards a progressive view of intellectual property

One aspect scarcely mentioned in debates about the rights to explanation and to review of automated decisions concerns the competitive regulation of digital markets. In the case of the Google search engine mentioned earlier, for example, it is not just users who have an interest in the ranking of results displayed to them and in accessing relevant information through the search engine. Companies striving for the attention of consumers are also interested in greater search engine transparency, as they need to understand which criteria yield better rankings in order to drive more search engine users to their institutional sites (news portals, e-commerce, official websites, etc.).

The European Commission, through its antitrust authorities, has already regulated parts of Google’s business model, including imposing substantial fines for anti-competitive practices involving its search engine. It was found that, through different strategies, the company was favoring its own parallel applications on its search results page, stipulating abusive clauses for the insertion of advertisements in Google search applications installed on third-party sites, and making it difficult to install other search tools on Android devices. In these cases, part of the investigations into Google’s anti-competitive conduct stemmed from empirical studies, competitor complaints and analyses of results pages. Now that the company also has informational obligations regarding the parameters and criteria used in automated data processing, there will be greater public scrutiny, including for antitrust purposes, of its search engine results.

There is already evidence that trade secrets are growing in relative importance, especially in the intangible asset portfolios of digital economy business models. If we compare this trend with other categories of intellectual property, such as patents and copyright, a fundamental difference is the fate of these innovations after the period of protection. Patents expire and may be used by other competitors after that deadline. Copyrighted literary creations fall into the public domain, guaranteeing the entire population the right of access to such works, including for adaptation purposes. That is, after a period of protection, these innovations revert to society.

However, the same does not happen with trade secrets. Depending on the secrecy and confidentiality measures taken, and on the market’s inability to reproduce these innovations on its own, these secrets remain confined to the companies and to those who have access to them. There is no later transfer of these innovations to society or to competitors in the relevant market niche.

Conclusion

This text advocates neither the end of trade secrecy for algorithmic business models nor an absolute right to explanation. On the contrary, I believe that the right to review automated decisions is an important first step toward algorithmic transparency, especially when based on human intervention in the review process. The right to explanation in Europe goes further and provides more substance for the practical verification of biases, anti-competitive practices and other misuses of personal data. By looking at these basic criteria for parameterizing automated decision making, we can think of a fairer balance between respect for privacy, informational self-determination, economic and technological development, innovation, free enterprise, free competition and consumer protection, all of which are foundations of data protection in Brazil.

Want to know more about the right to explanation and artificial intelligence? Check out this post here!

The views and opinions expressed in this article are those of the authors.

Written by

Founder and Scientific Advisor of the Institute for Research on Internet and Society. Law Professor at the Universidade Federal de Juiz de Fora. Holds a master’s and a bachelor’s degree from the Federal University of Minas Gerais, with a scholarship from CAPES (Coordination for the Improvement of Higher Education, a foundation within the Brazilian Ministry of Education), and is currently a PhD candidate at the same institution. Specialist in International Law from CEDIN (Center for International Law).

Assistant professor for the International Economic Relations and Law courses at the Federal University of Minas Gerais. Lawyer and member of ABRI (Brazilian Association for International Relations).
