Google’s Gemini chatbot, formerly called Bard, can create AI-generated illustrations based on a user’s text description. You can ask it to create pictures of happy couples, for example, or of people in vintage clothes walking down modern streets. As the BBC notes, however, some users have criticized Google for portraying specific white historical figures or historically white groups of people as racially diverse individuals. Google has now issued a statement saying that it is aware that Gemini is “offering inaccuracies in some historical image generation depictions” and that it will fix the issue immediately.

According to the Daily Dot, a former Google employee kicked off the complaints when he tweeted images of women of color with a caption that read, “It’s embarrassing to get Google Gemini to admit that white people exist.” To get these results, he asked Gemini to generate pictures of American, British and Australian women. Other users, mostly known right-wing figures, chimed in with their own results, showing AI-generated images that depicted America’s Founding Fathers and the popes of the Catholic Church as people of color.

In our tests, asking Gemini to create illustrations of the Founding Fathers resulted in images of white men with a single person of color or woman of color among them. When we asked the chatbot to generate images of the pope through the ages, we got pictures depicting Black women and Native Americans as the leader of the Catholic Church. Asking Gemini to generate images of American women gave us photos that included a white, an East Asian, a Native American and a South Asian woman. The Verge says the chatbot also depicted Nazis as people of color, but we were unable to get Gemini to generate Nazi imagery. “I cannot fulfill your request due to the harmful symbolism and impact associated with the Nazi Party,” the chatbot replied.

Gemini’s behavior may be the result of overcorrection, since chatbots and AI-trained robots have exhibited racist and sexist behavior in recent years. In one experiment from 2022, for example, a robot repeatedly chose a Black man when asked which of the faces it scanned was a criminal. In a statement released on X, Gemini Product Lead Jack Krawczyk said Google built its “image generation capabilities to reflect [its] global user base and [it takes] representation and bias seriously.” He said Gemini will continue to generate racially diverse illustrations for open-ended prompts, such as images of people walking their dog. However, he acknowledged that “[h]istorical contexts have more nuances and [his team] will adjust further to accommodate this.”



https://www.engadget.com/google-promises-to-fix-geminis-image-generation-following-complaints-that-its-woke-073445160.html?src=rss