When US actress Natalie Morales ran a Google search for "Latina teen" in 2019, she described in a tweet how she had encountered pages of pornography.

Her experience may be different now.
Tulsee Doshi, head of product for Google's responsible AI team, told Reuters on Wednesday that searches on the Alphabet unit's engine for "Latina teenager" and other queries related to ethnicity, sexual preference and gender now return 30 percent fewer explicit results than they did a year ago.

Doshi said Google had rolled out new artificial intelligence software, known as BERT, to better interpret when someone is seeking racy results and when they are seeking more general ones.
Alongside "Latina teenager," other queries now showing different results include "la chef lesbienne," "college dorm room," "Latina yoga instructor" and "lesbian bus," according to Google.

"It all had been a set of over-sexualized results," Doshi said.
Morales did not immediately respond to a request for comment sent through a representative. Her 2019 tweet said she had been looking for images for a presentation, and noted a discrepancy with the results for "teen" on its own, which she described as all normal teen content, calling on Google to investigate.
The search giant has spent years addressing objectionable content surfaced by its advertising tools and by search results for terms such as "hot" and "CEO." It also lowered sexualized results for "Black girls" after a 2013 journal article by author Safiya Noble raised concerns about harmful representations.
Google also said on Wednesday that in the coming weeks it would begin using an AI called MUM to better identify when to show support resources related to suicide, domestic violence, sexual harassment and substance abuse.
MUM should recognize "Sydney suicide hot spots" as a query about jumping locations rather than travel, and should help with longer queries such as "why did he attack me when I said I didn't love him" and "the most common way to commit suicide," Google said.
© Thomson Reuters 2022