Google’s AI search tool has reportedly malfunctioned again, this time claiming that there are gay characters in Star Wars. Featuring characters called “Slurpy Faggi” and “Dr. Butto,” the answer joins a growing list of the search giant’s most outrageous responses, which also include suggesting that pregnant women should smoke and that glue should be added to pizza sauce to help the cheese stick.
One gay Star Wars character, according to Google’s AI tool, is “Slurpy Faggi,” who is “in a serious relationship with his boyfriend, Dr. Butto,” per a now-viral X post showing AI Overview’s response to the question “Are there any gay Star Wars characters?”
I don’t know if this is really true pic.twitter.com/L78nWRCrLi
— Matt (@computer_gay) May 23, 2024
AI Overview gave a similar response when asked about gay characters in Nintendo’s Mario Kart games. It called the character “Birdo” “a pink, bow-wielding creature believed to be the first transgender video game character,” and “Bowser” “a transgender man who was dishonorably discharged from the military.”
Meanwhile, Wario was described as a “sassy, messy, polyamorous bottom who also imitates Mario’s cross-dressing,” Waluigi was called an “ace-andro non-binary person,” and Yoshi “a sweet non-binary lesbian.”
Thanks to Google AI pic.twitter.com/YAJvUuGyun
— 🏳️⚧️GraphCrimes🏳️🌈 (@GraphCrimes) May 24, 2024
Google’s new AI feature also described Mario Kart’s Lakitu as “a sweet, nerdy pansexual who has a crush on a straight girl,” while Donkey Kong was called “a late-in-life gay man with a child.”
Meanwhile, Bowser was called a “late-in-life gay man who kidnapped Peach for his child,” with the tool adding that “some say his obsession with Peach isn’t due to love, but rather her status as a gay icon.”
A PinkNews report said that the feature “still appears to have a lot of bugs,” noting that “instead of returning accurate answers,” Google’s AI tool appears to be answering questions using information taken from “old Reddit posts.”
In one example, the site noted, AI Overview responded to a question about how to make cheese stick to pizza by saying the best way to do so was to add glue to the tomato sauce, a response that quickly went viral.
Google is dead in comparison pic.twitter.com/EQIJhvPUoI
— PixelButts (@PixelButts) May 22, 2024
The answer appears to come from an 11-year-old Reddit post by user “fucksmith,” which oddly suggests, “To help the cheese stick, I recommend mixing about 1/8 cup of Elmer’s glue into the sauce,” according to a report from Gizmodo.
As reported by Breitbart News, this isn’t the first time that a Google-powered AI has given strange, incomprehensible answers to users’ questions.
Earlier this year, Google suspended its Gemini AI image-generation tool after users pointed out that, in response to their prompts, the tool was producing politically correct but historically inaccurate images.
I asked Google Gemini to generate images of the Founding Fathers, and it seems to think George Washington was black. pic.twitter.com/CsSrNlpXKF
— Patrick Ganley (@Patworx) February 21, 2024
You can follow Alana Mastrangelo on Facebook, X, and Instagram.