Author: Christiane Féral-Schuhl

She is a barrister at the Paris and Québec Bars, a former President of the Paris Bar, and President of the National Bar Council until December 2020. For more than 30 years she has practised in the fields of computer law and new technologies. Her legal firm, FERAL-SCHUHL/SAINTE MARIE, co-founded with Bruno Grégoire Sainte Marie in 1988, has been a leading firm for many years.

Following International Women’s Day 2020, the French National Bar Council (CNB) gathered fifteen women (lawyers, engineers, professors, researchers, programmers and entrepreneurs) to seek solutions to gender bias in artificial intelligence algorithms.
Although the first ever programmer was a woman, today machines and algorithms are designed mainly by men. In 2020, only 17 percent of workers in the digital sector were women; only the aeronautics sector attracted a lower proportion of female professionals. To counterbalance this, the CNB also promoted female role models to encourage women to take their place in the field.
In tech, women are at the end of the line. They are sometimes held back by stereotypes of “geeks” – an inevitably male figure wallowing at a computer screen with a box of cold pizza. Not only has this idea of the male tech nerd been proven to be passé, it also excludes the female figure from the digital world.
Artificial intelligence (AI) is often personified and held responsible for biases against women. In reality, however, AI merely reflects our own biases; it expresses nothing more than the aggregated opinions of its creators. AI can thus reproduce the sexist tendency of a human resources department to consider only the CVs of male candidates. Google the phrase “company director” and the most common result will be an image of a man wearing a tie; search for “cleaning staff” and you will mainly see women in aprons. Voluntarily or not, algorithm designers tend to stir their own biases into the algorithms they design, with sexist consequences. Several types of bias can be encountered.
There are data biases, which can have ethnically discriminatory consequences, expressed for example in the automatic recognition of facial morphology or skin colour. They arise because AI training data cannot be representative if learning is based on a single, European standard. Predictive algorithms thus perpetuate the past data they absorb. Human freedom and initiative would be frozen if legal professionals were to become reliant on such algorithms.
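The data-bias mechanism described here can be illustrated with a small sketch. The data, the screening rule and the hiring rates below are all invented for illustration; no real system or dataset is implied. A naive “predictive” score learned purely from a historically skewed hiring record simply reproduces that skew.

```python
# Toy illustration of data bias (hypothetical data, pure Python):
# a screening score that rates candidates by how often similar
# candidates were hired in the past.

# Invented historical hiring record: (gender, hired) pairs,
# deliberately skewed toward male candidates.
history = (
    [("M", True)] * 80 + [("M", False)] * 20
    + [("F", True)] * 20 + [("F", False)] * 80
)

def hire_rate(gender):
    """Fraction of past candidates of this gender who were hired."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

# The "model" is nothing but aggregated past decisions.
score = {g: hire_rate(g) for g in ("M", "F")}

# Identical qualifications, different genders: the score still
# ranks the male candidate higher, because the past data did.
print(score["M"])  # 0.8
print(score["F"])  # 0.2
```

The point of the sketch is that the score contains no explicit notion of gender preference; it only aggregates past decisions, which is exactly why a biased history yields biased predictions.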
Economic biases are more discreet and pernicious, but no less common. Algorithms that assist in the design of advertisements weigh the predictable cost of deprioritising one population relative to another for purely economic reasons, and may thus be biased against women.
AI can discriminate blindly. But it is neither responsible nor guilty. Rather, it amplifies existing biases without any real possibility of rectification. A discriminatory bias is difficult to identify but even harder to repair. This is why we need more women, and people from minority groups, involved in the design of these tools. Their parameters must integrate the richness of our diverse lives and opinions.
Digitalisation is transforming society and, for as long as electricity exists, will continue to do so. The entire economy is supported by digital technology. Must every sector become masculinised as it becomes an emerging digital market?
In France, digital knowledge is coded as masculine, but this is not true everywhere. In some places, digital is a promising sector for women because it allows them to work from home and can be better adapted to their constraints.
Rectifying gender bias requires finding ways to better integrate women into the business of tech analysis and research. One way could be to intervene at the training stage by making courses more attractive to women. Companies, too, should use existing internal organisational means to integrate women into decision-making processes. Diverse recruitment is a simple measure to apply and contributes to fairer representation within a corporate structure.
Is it necessary to impose or to convince, to set up quotas or incentive mechanisms? For a long time, the word quota was understood as the admission of less qualified applicants, and it made many people grind their teeth. Yet introducing a quota does not compromise performance. Although it may not be the only way, it can achieve quick results. Competitiveness and success should not be gendered. A sector of activity that is not mixed is worrying, because it is from diversity that richness is born.
Gender bias has an impact on girls’ academic performance. It has been observed that girls lose self-confidence from secondary school onwards, despite good results in primary school. An exercise presented as a mathematics problem generates poorer results for girls; when this bias is deconstructed, boys perform less well. Changing course descriptions has an obvious impact, so teacher training is extremely consequential. Teachers have the noble task of accompanying children throughout their education, and they must be the first line of defence against inequality.
It is therefore important to expose girls to coding at a very early age, before stereotypes have fully formed. Coding is no more complicated than writing, and code is not the exclusive property of engineers; it needs to be taught at school, to all children. Nor is there an age barrier: the women of today, as well as those of tomorrow, can take their place in the digital world given the tools at their disposal. While education is a long-term solution, training is a short-term fix to establish women’s status and remedy gender bias.
That said, a surprising, perhaps frightening, observation has been made at a societal level: the higher the general level of equality in a country, the fewer women engage in digital studies. Conversely, the more a country is based on unequal systems, and the less girls and boys live side by side, the more women move into digital careers. Should this lead us to ask whether co-education is a catalyst for women’s inhibitions about entering digital tech?
Ultimately, parents are their children’s greatest role models, but they are more difficult to train. Yet they should be informed about the great possibilities for women in the digital sector so as to kindle their daughters’ interests.
The best devices in the world, if they are blind to gender bias, will reproduce the same old disadvantages for women. From this debate, difficulties, dangers and warnings have emerged. But there is growing optimism in the awareness of the problem and in the search for solutions in businesses, governments and internationally, where reflection on AI ethics is underway.
Christiane Féral-Schuhl is President of the French National Bar Council (CNB).
Image: Tumisu, under Pixabay Licence.


The SCLA is a non-governmental and non-profit organization that was established in 2019 in Geneva, Switzerland.