GOOGLE APOLOGIZES FOR MISHAPS WITH NEW AI IMAGE-GENERATOR
Tech giant Google found itself in hot water after the rollout of a new artificial intelligence image-generator drew swift criticism from users. The tool, known as Gemini, was intended to depict a diverse range of people in images generated from written prompts, but it ended up producing questionable and offensive content.
Some social media users raised concerns about an anti-white bias in the way the tool created racially diverse images, prompting Google to issue an apology for the mishap. Senior Vice President Prabhakar Raghavan acknowledged that the feature had missed the mark and that some of the images it generated were inaccurate and even offensive.
The images that particularly drew attention included a Black woman as a U.S. founding father and Black and Asian people as Nazi-era German soldiers. The lack of accuracy and sensitivity in the tool’s output raised questions about the potential for misuse and the perpetuation of harmful stereotypes.
Google’s decision not to release a public demo of the underlying code for Imagen 2, the research experiment on which Gemini was built, was rooted in concerns about social and cultural exclusion and bias. However, the competitive pressure to release generative AI products in response to growing interest in the technology has pushed companies to launch such tools with minimal testing.
In response to the backlash, Google has halted Gemini’s generation of images of people and promised to conduct extensive testing before restoring the feature. However, the tool’s failure to produce accurate and inoffensive results has raised questions about the company’s accountability for upholding ethical standards in AI development.
What do you think about Google’s mishaps with its AI image-generator? Do you believe tech companies should be held to higher standards when it comes to creating ethical and accurate AI technologies? Share your thoughts in the comments below!