Artificial intelligence and discrimination against women: the data and the system

March 8, 2021

On International Women’s Day, it is good to reflect on how artificial intelligence replicates and even enhances gender discrimination – and others.

Artificial intelligence, automated decisions, and discrimination

There is a lot of content explaining artificial intelligence, algorithms, and machine-learning techniques, so I will not go into the concepts here. For those looking for work on algorithms, I point instead to names such as Ada Lovelace, Cathy O’Neil, and Ivana Bartoletti, who examine how mathematical formulas have come to be used across technologies, including in decision-making processes built on an ever-growing data economy.

Nor does this text argue that artificial intelligence, or the field of knowledge behind it, should be abandoned or left to developments unwanted by humanity. The point is that applications of artificial intelligence are fed by databases that have already been built, by a particular group of people with specific characteristics, corresponding to historically dominant groups. There is a necessary and urgent commitment to be made by different sectors of society so that technologies (data-driven or not) stop reproducing discrimination, exclusion, and processes of marginalization.

Nothing new under the sun

Unfortunately, there are several, often intersecting, processes of discrimination, racial discrimination among them. March 8 is an international invitation to reflect on the long road we still need to travel, as a society, toward gender equality. Artificial intelligence applications are no exception.

There are many examples, and possibly many more that never reach public knowledge. Even in trivial situations, such as suggesting a restaurant, prejudice appears: a facial recognition program in an Oslo mall advertised salad when it detected a woman in front of the “automated” banner, and pizza when it recognized masculine features. Besides collecting personal (biometric, sensitive) data indiscriminately and without consumers’ consent, the machine was also sexist, reinforcing the beauty standards imposed on all of us.

If a restaurant-menu example seems unimportant, being passed over for a job is far from it. In one of the most famous cases of algorithmic discrimination against women, Amazon’s artificial-intelligence recruitment system penalized female candidates. Having “learned” from a database made up mostly of male former employees, the tool reinforced the pattern of a time when women had barely entered the job market.
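To see how this happens mechanically, here is a toy sketch (entirely hypothetical résumé data, not Amazon’s actual system or method): a naive word-frequency scorer trained on male-dominated hiring records ends up penalizing any résumé containing a gendered term, simply because that term rarely appeared among past hires.

```python
from collections import Counter

# Hypothetical toy data: historical hires skew male, so a gendered
# term like "women's" appears only among rejected résumés.
hired = [
    "software engineer captain chess club",
    "software engineer robotics team lead",
    "developer hackathon winner chess club",
]
rejected = [
    "software engineer women's chess club captain",
    "developer women's coding society member",
]

def score(resume, hired, rejected):
    """Naive word-frequency score: words common among past hires
    raise the score; words seen only among rejections lower it."""
    hired_words = Counter(w for r in hired for w in r.split())
    rejected_words = Counter(w for r in rejected for w in r.split())
    return sum(hired_words[w] - rejected_words[w] for w in resume.split())

# Two equally qualified candidates; only one mentions "women's".
a = score("software engineer chess club captain", hired, rejected)
b = score("software engineer women's chess club captain", hired, rejected)
print(a, b)  # the gendered word alone lowers the second score
```

The system never sees a “gender” field: the discrimination emerges from correlations in the historical data, which is why simply removing the gender column does not fix such tools.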

When tested, voice search tools, which are also trained on the databases of a sexist society, revealed female stereotypes: responses that were submissive and tolerant of abuse. These so-called “virtual assistants” reflect patterns of submission and violence against women that risk being naturalized by the many people who use the technology. The very fact that these systems are gendered female reinforces standardized images of women as caregivers, assistants, or objects.

Silence is not a good option

If more and more fields require advances toward gender equality and an end to discrimination and violence against women, what should we do? The first step seems to be both the simplest and, in many cases, the most difficult: breaking the silence, denouncing, standing up against situations that need to stop. In technology, this also means bringing more women into the development, improvement, and application of artificial intelligence and other areas of computer science.

Mobilization is also an important step, so that our voices are heard in the processes of developing and applying digital technologies. It is therefore crucial to produce knowledge and to seek more representative and diverse alternatives. It is also important to listen and be heard at events such as the Equity Generation Forum, focused on gender equity, and to engage in awareness campaigns for greater representation in an internet-driven society, such as #MulheresnaGovernança. The road is long but, inverting the logic of the song [a famous one among Brazilian children, which has problems of its own], it must not be deserted. See you there!

The views and opinions expressed in this blogpost are those of the author. 
Illustration by Freepik Stories.

Written by

Founder and Director at the Institute for Research on Internet & Society. LL.M. and LL.B. at the Federal University of Minas Gerais (UFMG).

Founder of the Study Group on Internet, Innovation and Intellectual Property – GNET (2015). Fellow of the Internet Law Summer School at the University of Geneva (2017), the ISOC Internet Governance Training (2019), and EuroSSIG – the European Summer School on Internet Governance (2019).

Interested in areas of Private International Law, Internet Governance, Jurisdiction and Fundamental Rights.

