Apple and the privacy vs. security myth
Written by
Luiza Brandão
10 August 2021
The integral protection of all children is indisputable. Producing, circulating, and consuming child sexual abuse material (CSAM) are heinous crimes that must be tackled seriously, effectively, and urgently. The safety of children and adolescents is a priority for our future as a society, and it includes ensuring their privacy, their data protection, and their right to their own image. Therefore, an important and feasible first step towards constructive proposals is to stop treating privacy and security as opposites.
Apple’s proposal
Under the justification of fighting CSAM, Apple officially announced the implementation of a mechanism to detect material possibly of that nature, presented as a way to facilitate the authorities’ access to evidence of crimes involving child exploitation and abuse. Generally speaking, images uploaded to iCloud would be automatically hashed, and those hashes compared against the National Center for Missing and Exploited Children’s database, which contains the hashes of materials already identified as CSAM.
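In very rough terms, the matching step can be pictured as a fingerprint lookup. The sketch below is a deliberately simplified illustration, not Apple’s actual system: Apple’s design uses a perceptual hash (NeuralHash) plus cryptographic protocols on top of it, while here a plain SHA-256 and an in-memory set of placeholder values stand in for both.

```python
# Deliberately simplified picture of hash-based matching.
# NOT Apple's actual system: the real design uses a perceptual hash
# (NeuralHash) and cryptographic protocols; plain SHA-256 and an
# in-memory set of made-up values stand in here.

import hashlib

# Placeholder stand-in for the NCMEC database of hashes of
# already-identified CSAM (the value below is invented).
KNOWN_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00112233445566778899aabbccddeeff",
}

def fingerprint(image_bytes: bytes) -> str:
    """Hash the image. A real system needs a *perceptual* hash that
    survives resizing and re-encoding; SHA-256 is only a stand-in."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```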
Automatic detection implies the use of technology that, according to the company, guarantees ‘less than a one in one trillion chance per year of incorrectly flagging a given account’. Apple filed the announcement under the ‘child safety’ category and presents it as part of a set of efforts against child abuse, which also includes guidance delivered through company mechanisms such as the voice assistant Siri, which has already had problems of its own.
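To make the quoted probability more concrete: Apple’s published design reportedly flags an account only after a threshold number of matches, and requiring several independent matches drives the account-level error rate far below the per-image one. The numbers in the sketch below are purely illustrative assumptions, not Apple’s parameters.

```python
# Back-of-the-envelope check of how a match threshold shrinks the
# account-level false-flag probability. All numbers are illustrative
# assumptions, not Apple's actual parameters.

from math import comb

p = 1e-6      # assumed false-match rate for a single innocent image
n = 10_000    # assumed photos uploaded by one account in a year
t = 30        # assumed match threshold before an account is flagged

# P(exactly t false matches) under a binomial model; with p this small,
# the t-match term dominates the "at least t" tail.
p_account = comb(n, t) * p**t * (1 - p) ** (n - t)
print(f"approximate account-level false-flag probability: {p_account:.2e}")
# With these assumptions the result is astronomically below 1e-12.
```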
The announcement follows fresh scandals over the privacy of people worldwide and the (equally global) market built to provide tools for mass surveillance. In this sense, it can be seen as a step backward in the discussion on adopting tools that protect people against abuses of their private lives, communications, and devices. For anyone who has followed Edward Snowden’s revelations and watched surveillance over society increase, such a shift by a digital economy player the size of Apple is worrisome worldwide.
Reactions to the newly announced technology
Reactions to Apple’s announced changes are numerous and raise serious concerns. From a privacy point of view, all content uploaded to iCloud will be scanned and logged by features that users cannot turn off. It is worth remembering that the company’s worldwide reach makes the practice possibly contrary to several national laws, in addition to international regulations that ensure the inviolability of private life.
The measures rely on technical resources to scan for the material. They relativize the security provided by encryption and are known as forms of exceptional access which, in practice, lead to its weakening. The use of client-side scanning techniques, such as the one proposed by Apple, has been criticized by several organizations. One of the biggest concerns is that, under the euphemism of ‘exceptionality’, the door opens to violations of the privacy and security of all users. It is important to understand that inserting a technique like this undermines the security of all devices and the privacy of their users, and it leaves room for abuse. This is also a debate about proportionality, adequacy, and effectiveness: criminal investigation, prosecution, and the effective fight against crime involve far more elements and challenges than what weakening encryption through this technique can offer, given the high risks it entails.
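One way to see why critics call client-side scanning a form of exceptional access is to look at where the scan sits in the pipeline. The sketch below is purely illustrative, and all function names are hypothetical: the point is only that the inspection runs on the device before encryption, so encryption no longer guarantees that only the recipient can act on the content.

```python
# Illustrative only: a hypothetical client pipeline showing that a
# client-side scan inspects the plaintext *before* encryption, i.e.
# outside the protection that end-to-end encryption provides.

from typing import Callable

def matches_flagged_content(data: bytes) -> bool:
    """Stub matcher (e.g. a hash lookup like the sketch above)."""
    return False

def send(plaintext: bytes,
         encrypt: Callable[[bytes], bytes],
         report: Callable[[bytes], None]) -> bytes:
    # 1. The scan sees the plaintext -- no decryption key is needed,
    #    because nothing has been encrypted yet.
    if matches_flagged_content(plaintext):
        report(plaintext)
    # 2. Encryption happens only afterwards, so it cannot shield the
    #    content from whoever controls the scanning step.
    return encrypt(plaintext)
```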
Security and privacy go together
The new technology Apple unveiled for child safety raises the concerns I have sought to summarize here. Moreover, the number of organizations that have spoken out about the news demonstrates how debatable the tool is. At the heart of the matter is the understanding that we cannot give up privacy for security, nor vice versa: when technical measures weaken the protection of one of these values, the other is also put at risk.
The need to combat CSAM and punish the crimes involving these materials is undeniable. Apple, as a global company, must keep the safety of its users in mind, and it cannot make room for violations of rights, including those of children and teenagers. The integral protection owed to them includes privacy, the inviolability of communications, and the presumption of innocence. The risks of weakening security and encryption are too high, and they run contrary to the advances made in defending the rights of society as a whole. Technological measures, like regulatory, normative, and market ones, must enhance security and privacy, not relativize them, however relevant their causes may be.
The views and opinions expressed in this blogpost are those of the author.
Illustration by Freepik Stories.
Luiza Brandão
Founder and Director of the Institute for Research on Internet & Society. LL.M. and LL.B. at the Federal University of Minas Gerais (UFMG).
Founder of the Study Group on Internet, Innovation and Intellectual Property – GNET (2015). Fellow of the Internet Law Summer School at the University of Geneva (2017), the ISOC Internet Governance Training (2019), and the EuroSSIG – European Summer School on Internet Governance (2019).
Interested in the areas of Private International Law, Internet Governance, Jurisdiction, and Fundamental Rights.