Google has apologised for what has been described on social media as the “overly woke” behaviour of its artificial intelligence-powered image generation tool in Gemini. The newly launched Gemini shocked users with a heavy-handed skew towards racial and gender inclusivity, even at the expense of historical accuracy.
Although outraged by its bias, social media users have also mocked Gemini’s output, poking fun at images of a female pope, black Vikings and even racially diverse Nazi soldiers.
“It’s clear that this feature missed the mark. Some of the images generated are inaccurate or even offensive. We’re grateful for users’ feedback and are sorry the feature didn’t work well,” said Google in a statement last Friday. “We’ve temporarily paused image generation of people in Gemini while we work on an improved version.”
Google’s faux pas fuels an ongoing debate about the actual level of “intelligence” the current wave of generative AI possesses.
The key point is that a generative AI program is only as good as the data it has been trained on and how its parameters have been set by its creator. In short, the AI-generated apple does not fall too far from the tree – the same point argued by World Wide Worx MD Arthur Goldstuck in his recent book, The Hitchhiker’s Guide to AI.
“What you are dealing with there is what experts call ‘stupidity’. It’s not only the algorithm that can be stupid, but also the policies or rules behind the algorithm, which is designed by people…,” said Goldstuck.
A more insidious question, probed by industry figures such as Netscape co-founder and venture capitalist Marc Andreessen, centres on the idea that the biases in the parameterisation and training of generative AI tools such as Google’s Gemini are not erroneous, but rather deliberate attempts at using technology to spread political ideology.
‘Biased agenda’
“I know it’s hard to believe, but Big Tech AI generates the output it does because it is precisely executing the specific ideological, radical, biased agenda of its creators. The apparently bizarre output is 100% intended. It is working as designed,” said Andreessen in a post on social media platform X.
Google has suspended Gemini’s image generation capabilities while its engineers seek a solution to the problem, but the tool’s text generation capabilities have also come under fire.
“I’m glad that Google overplayed their hand with their AI image generation, as it made their insane racist, anti-civilizational programming clear to all,” Elon Musk tweeted. “[But] the problem is not just Google Gemini, it’s Google search too,” he said.
The idea that Google is nefarious in its intent is not universally accepted; others see the company’s mishaps with Gemini as just one example of how society can – and most likely will continue to – misstep as it engages with and better understands this new form of intelligence that it has created.
That Google’s mishap was so public, said Goldstuck, may set a good precedent for how such issues ought to be dealt with in future.
“It’s the world learning how to get to grips with AI, and this is a great example for the industry in what to look out for when training its models. There are more nefarious ways in which such scenarios could play out, but this in the end is quite benign,” said Goldstuck.
Publicly, the Alphabet subsidiary’s response to the Gemini fiasco has been contrite, and a report by The Verge, based on an internal memo from CEO Sundar Pichai, suggests that rigorous introspection is also happening behind the scenes.
“When we built this feature in Gemini, we tuned it to ensure it doesn’t fall into some of the traps we’ve seen in the past with image-generation technology — such as creating violent or sexually explicit images, or depictions of real people. And because our users come from all over the world, we want it to work well for everyone. If you ask for a picture of football players, or someone walking a dog, you may want to receive a range of people. You probably don’t just want to only receive images of people of just one type of ethnicity (or any other characteristic),” said Google. – (c) 2024 NewsCentral Media