Experts Discuss Algorithmic Bias in School of Public Policy Panel

Posted October 20, 2021

On Sept. 30, Georgia Tech’s School of Public Policy and the Ethics, Technology, and Human Interaction Center (ETHICx) hosted a panel on emancipatory data science. Four scholars spoke about their research on handling biases in algorithms. They also discussed how data, though often perceived as objective, is still influenced by the ideologies of those who create and work with it.

“Algorithmic bias from data collection and data use disproportionately affects minoritized groups,” said Cassidy Sugimoto, Tom and Marie Patton Chair of the School of Public Policy. “A key area of concern is how science policy and diversification of the scientific workforce can mitigate these negative consequences.”

The event was moderated by Justin Biddle, associate professor in the School of Public Policy and co-director of ETHICx. The panelists were Thema Monroe-White, assistant professor of management at Berry College; C. Malik Boykin, assistant professor of cognitive, linguistic, and psychological sciences at Brown University; Fay Cobb Payton, professor of information technology and business analytics at North Carolina State University; and Timnit Gebru, independent scholar and co-founder of Black in AI.

Monroe-White, who helped organize the discussion, opened the panel by presenting her definition of emancipatory data science. She defined it as data work where members of minoritized, marginalized, and vulnerable communities are no longer the “object” of data science, but rather the “subject” involved in its collection, analysis, interpretation, and communication.

Monroe-White supported her definition with examples of how racial bias has bled into science and data throughout history. For instance, in 2016, ProPublica shocked many with a study showing that commercially used recidivism risk models misclassified Black offenders as at risk of re-offending at twice the rate of white offenders.
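The disparity ProPublica reported comes down to a difference in group-wise error rates: how often people who did not re-offend were nevertheless flagged as high risk. A minimal sketch of how such a rate can be computed, using entirely invented data rather than ProPublica’s:

```python
# Hypothetical illustration of comparing false positive rates across groups.
# All data below is invented for demonstration.

def false_positive_rate(y_true, y_pred):
    """Share of people who did not re-offend (y_true == 0) whom the
    model nevertheless flagged as high risk (y_pred == 1)."""
    flagged = [p for t, p in zip(y_true, y_pred) if t == 0]
    return sum(flagged) / len(flagged)

# 1 = re-offended / flagged high risk; 0 = did not / flagged low risk
groups = {
    "Group A": {"y_true": [0, 0, 0, 1, 0, 1], "y_pred": [1, 1, 0, 1, 1, 0]},
    "Group B": {"y_true": [0, 0, 0, 1, 0, 1], "y_pred": [1, 0, 0, 1, 0, 1]},
}

for name, g in groups.items():
    rate = false_positive_rate(g["y_true"], g["y_pred"])
    print(f"{name} false positive rate: {rate:.2f}")
# Group A false positive rate: 0.75
# Group B false positive rate: 0.25
```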

“It wasn’t a shock to me to see this,” said Monroe-White. “It was a continuation of a legacy of Black history in America.”

Next, Boykin presented on research with Sarah Brown, an assistant professor at the University of Rhode Island. Boykin, a psychologist, and Brown, a computer scientist, are working together to develop a more equitable definition of algorithmic fairness. Many existing definitions of what makes an algorithm fair are mutually exclusive, so programmers struggle to meet one definition without violating another.
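One way to see the tension Boykin described: two widely cited fairness criteria, demographic parity (equal selection rates across groups) and equal opportunity (equal true positive rates across groups), generally cannot both hold when groups have different base rates. The toy sketch below, with invented numbers, shows a classifier satisfying one criterion while violating the other:

```python
# Toy sketch: when base rates differ across groups, equal opportunity and
# demographic parity generally cannot both be satisfied. Numbers are invented.

def selection_rate(y_pred):
    """Share of people the classifier selects (predicts 1 for)."""
    return sum(y_pred) / len(y_pred)

def true_positive_rate(y_true, y_pred):
    """Share of actual positives (y_true == 1) the classifier selects."""
    hits = [p for t, p in zip(y_true, y_pred) if t == 1]
    return sum(hits) / len(hits)

# Group A has a higher base rate of positives (75%) than Group B (25%).
y_true_a, y_true_b = [1, 1, 1, 0], [1, 0, 0, 0]

# Even a perfectly accurate classifier, which predicts every true label...
y_pred_a, y_pred_b = list(y_true_a), list(y_true_b)

# ...satisfies equal opportunity (true positive rate is 1.0 in both groups)...
print(true_positive_rate(y_true_a, y_pred_a),
      true_positive_rate(y_true_b, y_pred_b))   # 1.0 1.0

# ...while violating demographic parity (selection rates differ).
print(selection_rate(y_pred_a), selection_rate(y_pred_b))  # 0.75 0.25
```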

Boykin also pointed out that people of color need to be included in the decision-making process. Otherwise, Boykin said, we risk “dehumanization by omission,” the process by which the people most affected by data science have no place in its collection or analysis and exist in the work only as statistics.

Payton spoke about working with data and healthcare. She discussed a study she’d worked on measuring comorbid conditions that may impact emergency-room visits by people with and without diabetes, especially as those comorbidities relate to race and gender. The North Carolina State University scholar and her team ultimately found significant, persistent disparities in healthcare outcomes for women and ethnic minorities. Often, when members of these marginalized groups visit hospitals, they are treated only for the presenting problem, when the validity of their concerns is not questioned outright. As a result, they don’t receive the quality of care needed to prevent further complications and hospitalizations down the line.

“Particularly when Black and brown patients enter the hospital, there’s an emphasis on one issue, not the total picture of what might be going on,” she said.

Finally, Gebru discussed her experiences working in artificial intelligence research as a Black woman. She told the story of a colleague who’d called out a company’s harmful facial recognition software, only for the company to immediately try to discredit her work. Gebru stepped in and defended her colleague with a point-by-point rebuttal of the company’s statement. Ultimately, the company conceded that the software needed to be redone, but without giving credit to Gebru’s colleague.

When answering audience questions, all of the panelists agreed that data needs to be communicated in a way that’s accessible to everyone. When asked how best to do this, Boykin talked about being a hip-hop artist and how he connects with others — especially young people — through his music. Payton suggested communicating with the people who will be affected by research by including them in every step of the process, not just attaching their names at the end of the study.

To close, the panelists all shared struggles of getting their research published because of its controversial nature. However, they also agreed that even though their work is often poorly received, that resistance is part of what makes it important.

“Every paper that I think has been worth it has been a struggle,” Gebru said. “If there’s no venue to publish it in, we have to create one to amplify our voices.”

Related Media

Left to right: Thema Monroe-White, C. Malik Boykin, Fay Cobb Payton, Timnit Gebru

Contact For More Information

Grace Wyner

Communications Officer

School of Public Policy | Sam Nunn School of International Affairs