As technology advances, many people are eager to root out any sense of mistreatment or oppression in the tech world, which is why there is so much interest in applying machine learning to eliminate bias. Google's word2vec is one example. Google researchers extracted patterns of words that are related to each other. By representing the terms in a vector space, they were able to deduce relationships between words with simple vector algebra. Some of the results proved rather sexist: for "man : computer programmer :: woman : ?" the model returns "homemaker." This does not mean that machine learning itself is sexist. It does showcase, however, the bias that still exists in our journalism and journalists today. The analogies are statistically correct given only what can be derived from the articles the model was trained on, but the articles themselves are obviously biased.
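The vector algebra behind these analogies can be sketched with a toy example. The vectors below are hand-picked 3-dimensional illustrations, not real word2vec embeddings (which are learned from text and typically have hundreds of dimensions); the classic "man : king :: woman : queen" analogy is used here instead of the biased one from the article.

```python
import numpy as np

# Hand-made toy "embeddings" -- illustrative values only, not real
# word2vec vectors. Dimension 1 ~ "male", 2 ~ "female", 3 ~ "royalty".
vectors = {
    "man":   np.array([1.0, 0.2, 0.1]),
    "woman": np.array([0.2, 1.0, 0.1]),
    "king":  np.array([1.0, 0.2, 0.9]),
    "queen": np.array([0.2, 1.0, 0.9]),
}

def analogy(a, b, c):
    """Solve a : b :: c : ? by finding the word whose vector is
    closest (by cosine similarity) to b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    best, best_sim = None, -2.0
    for word, vec in vectors.items():
        if word in (a, b, c):  # exclude the query words themselves
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("man", "king", "woman"))  # -> queen
```

The point is that the answer falls out of pure arithmetic on the vectors; whatever associations the training text encodes, biased or not, the algebra will faithfully reproduce.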

Article source: https://techcrunch.com/2016/10/11/is-machine-learning-sexist/

Photo source: By Tetra Pak (http://www.flickr.com/photos/tetrapak/5956902687/) [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons