THE PRACTICE OF SHARING KNOWLEDGE
DECONSTRUCTING GENDER BIAS
Keywords: gender bias, gender stereotypes, AI, machine learning, word embeddings, neutralized data, language
“The potential wide-ranging impact makes it necessary to look carefully
at the ways in which these technologies are being applied now, whom
they’re benefiting, and how they’re structuring our social, economic,
and interpersonal lives.”
— Ryan Calo, co-director, Tech Policy Lab; associate professor, UW Law School
The issue of gender bias is increasingly a subject of discussion and reflection. According
to Kate Crawford, it is important to move towards its neutralization: “And I think this increase in interest
is completely justified, basically because machine learning systems are starting to impact millions
of people every day. So, bias matters” — Kate Crawford, AI Now: Social
and Political Questions for Artificial Intelligence, 2018.
To do so, we must overcome a long history of discrimination, which is recognized and reflected
in the systems we build: “bias in systems is most commonly caused by bias in training data,
and we can only gather data about the world that we have,” as Crawford said. Gender bias in
Artificial Intelligence must be treated as a major challenge: if we settle for quick fixes,
we will not only miss the bigger problem, we may actually make things a lot worse.
We therefore intend to explore gender bias in its social context, as applied to Artificial Intelligence.
As long as the issue is not resolved in the social context, it will continue to plague technical systems.
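The claim that bias in training data resurfaces in the systems built on it can be made concrete with word embeddings, one of the keywords above. The sketch below uses invented toy vectors, not real learned embeddings, and all words and numbers are illustrative assumptions; with real embeddings trained on text corpora, the same projection onto a "gender direction" is one common way bias has been measured.

```python
import numpy as np

# Toy 4-dimensional word vectors, invented purely for illustration.
# Real embeddings (e.g. word2vec or GloVe) are learned from large text
# corpora and would be hundreds of dimensions.
vectors = {
    "he":       np.array([ 1.0, 0.2, 0.0, 0.1]),
    "she":      np.array([-1.0, 0.2, 0.0, 0.1]),
    "engineer": np.array([ 0.6, 0.8, 0.1, 0.0]),
    "nurse":    np.array([-0.6, 0.8, 0.1, 0.0]),
}

def gender_score(word: str) -> float:
    """Project a word vector onto the he-she direction.

    Positive values lean toward "he", negative toward "she";
    a gender-neutral occupation word would score near zero.
    """
    direction = vectors["he"] - vectors["she"]
    direction = direction / np.linalg.norm(direction)
    return float(np.dot(vectors[word], direction))

for word in ("engineer", "nurse"):
    print(word, round(gender_score(word), 2))
```

In these toy vectors, "engineer" scores positive and "nurse" negative along the he-she axis, mirroring the stereotyped associations that real corpora encode; "neutralization" approaches attempt to remove that component from occupation words while leaving the rest of the vector intact.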
Main Purpose
Our research project aims to promote an informed reflection
on the issue of gender bias today and its implications for
the construction of systems, machine learning, and AI.
>> Visit the website <<