Artificial Intelligence Has a Bias Problem

From the way airbags are designed in cars to the lack of women included in medical research, we have seen how gender bias in these fields negatively impacts women.

Gender bias is especially problematic in fields that are overwhelmingly made up of men, including computer programming. In our increasingly digitalized and technological world, we see gender bias playing out in another field our society relies on heavily: the world of IT. AI and algorithm-driven computer programs surround us, and as it turns out, many of them contain the gender biases, and other biases, that we as humans have.

AI’s Gender Bias 

AI and computer programs are often presented as advantageous because they don't share the flaws that we as humans have, such as bias or being irrationally influenced by emotions. But what happens when the people creating the programs unintentionally build their own biases into them? The consequences have been documented in many different examples.

Amazon’s Facial Recognition Program 

One example that received considerable attention was Amazon's facial recognition program. Research carried out by MIT and Stanford University found that Amazon's facial recognition program misidentified white men less than 1% of the time. What was the rate of misidentification for black women with the same program? Nearly 35%.

This was highly problematic, especially considering that police departments around the US were making use of Amazon's facial recognition program. It should be noted that Amazon has, for now, stopped sharing the program with police forces in the wake of the protests and pushback that followed the killing of George Floyd.

We Rely on Machine Learning 

AI and machine learning programs are crucial in our day-to-day lives. Without going into depth, machine learning is the process of a computer algorithm analyzing massive amounts of data and learning what recommendations and decisions to make based on that data. Facebook, Twitter, Google, and nearly every other major app we use rely on machine learning.
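
To make that concrete, here is a minimal, illustrative sketch in Python (using the scikit-learn library). The scenario and data are invented for this example: a toy model looks at past reading behaviour and learns whether to recommend a tech article next.

```python
# A minimal sketch of machine learning: an algorithm analyzes past data
# and learns what to recommend. The "data" here is a toy, hypothetical
# log of how users split their reading time between topics.

from sklearn.linear_model import LogisticRegression

# Features: [minutes spent on tech articles, minutes on sports articles]
past_behaviour = [[30, 2], [25, 5], [3, 40], [1, 35]]
clicked_tech_next = [1, 1, 0, 0]  # 1 = the user clicked a tech article next

model = LogisticRegression()
model.fit(past_behaviour, clicked_tech_next)

# The model now "recommends" based on the patterns in the data it saw.
new_user = [[28, 3]]
print(model.predict(new_user))  # [1] -> recommend a tech article
```

The key point is that the model only knows what its training data shows it. If the data itself is skewed, the model will faithfully learn that skew as well, which is exactly the problem discussed next.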

Yet a recent investigation by the data annotation firm Scale AI found a large gap in a key machine learning resource. The investigation looked into an open-source dataset known as CoNLL-2003, which has been crucial in the development of machine learning and AI programs. The problem they found? Many women's names were missing from the dataset.

Women’s Names Missing  

Male names were mentioned five times more often than female names in the dataset. A model trained on the data was also 5% more likely to miss a new woman's name than a new man's name.
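
As an illustration of how such a gap might be measured, here is a minimal sketch in Python. It counts how often names from two (hypothetical) gendered name lists appear as person entities in CoNLL-2003-formatted text; the sample data and name lists below are illustrative stand-ins, not the real dataset or Scale AI's actual method.

```python
# A minimal sketch of a dataset audit: count how often names from
# (hypothetical) male and female name lists appear as PER entities
# in CoNLL-2003-formatted data.

from collections import Counter

# CoNLL-2003 format: one token per line, with columns
# token, part-of-speech tag, chunk tag, named-entity tag.
SAMPLE_CONLL = """\
Peter NNP B-NP B-PER
Blackburn NNP I-NP I-PER
visited VBD B-VP O
Mary NNP B-NP B-PER
. . O O
"""

# Hypothetical gendered first-name lists; a real audit would use a
# much larger reference list (e.g. census name statistics).
MALE_NAMES = {"Peter", "John", "Michael"}
FEMALE_NAMES = {"Mary", "Linda", "Susan"}

def count_person_names(conll_text):
    """Count person-tagged tokens that match each name list."""
    counts = Counter()
    for line in conll_text.splitlines():
        parts = line.split()
        if len(parts) != 4:
            continue  # skip blank lines and malformed rows
        token, _, _, ner_tag = parts
        if ner_tag.endswith("PER"):  # matches both B-PER and I-PER
            if token in MALE_NAMES:
                counts["male"] += 1
            elif token in FEMALE_NAMES:
                counts["female"] += 1
    return counts

if __name__ == "__main__":
    print(count_person_names(SAMPLE_CONLL))
    # Counter({'male': 1, 'female': 1}) on this tiny sample
```

A real audit of this kind would run over the full dataset with a far larger reference list of names, but the principle is the same: if one group's names barely appear, models trained on the data will be worse at recognizing them.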

One of the things that makes this problematic is that CoNLL-2003 has been crucial in the development of other language programs that use AI. In fact, the dataset has been cited 2,500 times in the research literature, which makes the implications of these findings hard to pin down exactly. What is clear, though, is that male bias is being perpetuated in computer programs that rely on CoNLL-2003.

Unintentional Bias 

Those who helped develop Amazon's facial recognition program and the CoNLL-2003 dataset did not set out to create gender- and racially-biased programs. Yet it happened anyway, and this is the kind of thing that happens when a group of people with similar backgrounds works on a project. Unfortunately, in this case it was primarily white men behind these programs, leaving many other groups to face the consequences of their unintentional bias.

We Need Diversity 

This underscores a critical point: the need for people of diverse backgrounds and experiences to work together in a variety of fields, from computer programming to venture capital.

Not only does this work help address these types of biases and prevent them from occurring, but it has also been shown to improve outcomes. From building more successful and profitable companies to creating more secure and long-lasting peace agreements, having a diverse group of people as part of the decision-making process leads to better results.

We Are Seeing Progress 

Luckily, we see business leaders, activists, and community groups coming together to get people of diverse backgrounds involved in different fields, including computer programming. Girls Who Code is a global organization that works to close the gender gap in computer programming and build the next generation of women engineers and programmers. This is a growing movement, and you can find even more organizations doing similar and important work here.

Although these problems still exist, we see increased awareness of and commitment to addressing these gender biases and gaps. At SHE Community we work to support companies and businesses in creating diverse and balanced organizations. Not only does this help create thriving businesses, but also a thriving world.

Author

  • Kelly Fisher

    Kelly has a master's degree in Gender Studies from the University of Oslo, with a specialization in masculinity. He has a passion for engaging men in the subject of equality.
