Gender and fairness in machine learning and robotics

“Gendered Innovations,” said current Clayman Institute Faculty Research Fellow Londa Schiebinger, “is about fixing the numbers of women, fixing the institution, and fixing knowledge.” 

According to Schiebinger, the John L. Hinds Professor of History of Science at Stanford University and director of Gendered Innovations in Science, Health & Medicine, Engineering, and Environment, Gendered Innovations aims to “develop state-of-the-art methods of sex and gender analysis” and “provide case studies to illustrate how gender analysis leads to discovery and innovation.” She presented her latest research on Gendered Innovations (originally an initiative incubated at the Clayman Institute while Schiebinger served as the Barbara D. Finberg Director from 2004 to 2010) at her fall Faculty Research Fellow presentation, titled “Gender and Fairness in Machine Learning and Robotics.”

Since its incubation, Gendered Innovations has grown into an impressive international collaborative project spanning Europe, Canada, the US, South Korea, and now South Africa. In her presentation, Schiebinger posed the question that has guided Gendered Innovations since its inception: “Can we harness the creative power of sex & gender analysis for discovery and innovation?” At the convening of fellows, she turned to an examination of biases in machine learning to explore this question. Schiebinger pointed out, for example, that despite the belief that algorithms are value-neutral, these automated processes can acquire human biases. When trained on historical data, systems inherit unconscious gender bias from the past in ways that can amplify gender inequality in the future, even when governments, universities, and companies themselves have implemented policies to foster equality. “Machine learning captures embedded gendered associations and often the conscious and unconscious bias embodied in them,” Schiebinger said.
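To make that point concrete, consider a minimal Python sketch, illustrative only and not code from Schiebinger's presentation, which assumes the gensim library and its downloadable pretrained GloVe word vectors. Analogy arithmetic over embeddings trained on large corpora of human-written text often completes “man is to doctor as woman is to...” with stereotyped terms:

import gensim.downloader as api

# Load 50-dimensional GloVe vectors trained on Wikipedia and Gigaword,
# a large corpus of historical, human-written text.
vectors = api.load("glove-wiki-gigaword-50")

# Analogy arithmetic: "man is to doctor as woman is to ...?"
# Embeddings learned from real-world text frequently complete such
# analogies with stereotyped terms (e.g., "nurse"), surfacing the
# gendered associations absorbed from the training data.
for word, similarity in vectors.most_similar(
        positive=["doctor", "woman"], negative=["man"], topn=3):
    print(f"{word}: {similarity:.3f}")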

In addition to analyzing machine learning, Schiebinger and her group also discussed the gendering of robots. She observed how cultural assumptions can be transferred to robots as well as virtual assistants in ways that overlay normative gender expectations onto machines. Users, for example, might consider a dating app most effective when programmed with a female voice while considering a math tutoring program best when programmed with a male voice.

Gendered Innovations is devoted to positive solutions, such as the “hard de-biasing” of these technologies. Schiebinger gave the example of changes at Google that show how gendered analyses can change digital culture: In its early days, Google Translate defaulted to the masculine pronoun. As Google engineers became aware of this bias, they were able to move toward more gender-inclusive language.
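In the word-embedding literature, “hard de-biasing” usually refers to the method of Bolukbasi and colleagues (2016), whose central “neutralize” step projects the gender component out of word vectors. The numpy sketch below is one way to illustrate that step; it uses hypothetical stand-in vectors rather than real embeddings:

import numpy as np

def neutralize(word_vec: np.ndarray, gender_direction: np.ndarray) -> np.ndarray:
    """Remove the component of a word vector that lies along the gender
    direction, so a gender-neutral word such as "doctor" ends up
    equidistant from "he" and "she"."""
    g = gender_direction / np.linalg.norm(gender_direction)
    return word_vec - np.dot(word_vec, g) * g  # subtract the gender component

# Hypothetical stand-ins for real embedding vectors.
rng = np.random.default_rng(0)
he, she, doctor = (rng.standard_normal(50) for _ in range(3))

# The gender direction is commonly estimated from pairs such as he - she.
gender_direction = he - she
debiased = neutralize(doctor, gender_direction)

g = gender_direction / np.linalg.norm(gender_direction)
print(np.dot(doctor, g), np.dot(debiased, g))  # nonzero before, ~0.0 after

The full algorithm pairs this with an “equalize” step that makes explicitly gendered pairs (he/she, king/queen) symmetric around the neutralized words.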

For Schiebinger, then, the digital age has not given us a “post-gender,” “post-human” world but rather one that is full of humanity, in both its good and problematic aspects. More than ever, she believes, there is a critical need to examine how gender influences our lives, and how sex and gender analysis can be deployed to build a more equitable future.