Is Google’s new AI bot ‘woke’?

Can artificial intelligence be too politically correct? Maybe so. Google took down its new AI-powered Gemini image generator after the system produced widely mocked images of female popes, Black Vikings and “diverse” Founding Fathers.

The “bizarre results” came from “simple prompts,” The New York Post said. When asked to “create an image of a pope” — all of whom, so far, have been white men — Gemini “provided pictures of a Southeast Asian woman and a black man wearing holy vestments.” Some observers took the results in stride. While Gemini should know that popes “have to be male under the current rules,” said The Washington Post’s Megan McArdle, the Catholic Church might have an African pope sooner rather than later. “So I don’t really understand why people freak out when Gemini prematurely depicts one.”

Those images were “amplified” by culture warriors Elon Musk and psychologist Jordan Peterson, said the Post, “who accused Google of pushing a pro-diversity bias into its product.” But it’s “unclear how widespread the issue actually was.” When the paper asked for images of a “beautiful woman, a handsome man, a social media influencer” and other generic individuals, it was shown images of white people. Google decided nonetheless to pull back on Thursday, saying that while diverse representations in AI-generated imagery are “generally a good thing,” Gemini’s results were “missing the mark here.”

‘Ideological echo chamber’?

“In Gemini’s telling, the Pope is black, ancient Romans are black, the Founding Fathers were at least partially black,” Liz Wolfe said at Reason. Ask the chatbot to depict the “evils of communism,” though, and the prompter would instead receive a short lecture about the dangers of “inherent bias and oversimplification.” That “comically woke bias” is just more proof that Google is “an ideological echo chamber.”


Actually, the situation “shows the limitations of AI,” David Gilbert said at Wired. Gemini isn’t really all that woke — instead, the issue is “that generative AI systems are just not very smart.” Many of the new generation of AI bots have been “plagued with bias,” depicting only minorities as prisoners and only white people as CEOs. Gemini’s problems probably represent “overcompensation” for the model’s usual tendencies. “Bias is really a spectrum,” said one AI researcher, “and it’s really hard to strike the right note while taking into account things like historical context.”

Putting more and more guardrails into an AI system would make it more predictable, Chris Stokel-Walker said at Fast Company, which would mean “the thing that makes generative AI unique is gone.” And it’s not always easy to say what the “right answer” to image prompts should be, one expert said. “Relying on historical accuracy may result in the reinforcement of the exclusionary status quo.”

‘Sensitive topics’

“Wokeness” isn’t Google’s only problem. Gemini — formerly known as Bard — is also being criticized for its handling of “sensitive topics” involving China, said Al Jazeera. The bot has failed to deliver “representative images” of the 1989 Tiananmen Square massacre or the 2019 democracy protests in Hong Kong. And Al Jazeera said its own prompt for images associated with the Jan. 6 insurrection in Washington, D.C., was refused, with an explanation that “elections are a complex topic with fast-changing information.”

This means Google doesn’t just have a tech challenge but a political one as well. The Verge said the wokeness critiques have been “promoted largely — though not exclusively — by right-wing figures attacking a tech company that’s perceived as liberal.” And for the moment, Google can’t seem to “catch a break,” Business Insider said. The new Gemini bot was supposed to be a “milestone moment” for the company. Instead, the controversy has cast “a shadow over real advances it has made in AI to keep it neck-and-neck with competitors.”
