
Google’s Brilliant New AI Gave Wrong Information in Promotional Video

Google’s video on Tuesday shows one of AI’s biggest pitfalls: bad advice, and not just any bad advice. A minute into the flashy, fast-paced video, Gemini in Google Search presents a factual error, first spotted by The Verge.

A photographer takes a video of his malfunctioning film camera and asks Gemini, “Why doesn’t the lever move all the way?” Gemini immediately offers a list of solutions, including one that would destroy all of his photos.

The video highlights one suggestion from the list: “Open the back door and carefully remove the film if the camera is stuck.”

Professional photographers – or anyone who has used a film camera – know that this is a terrible idea. Opening the camera back outdoors, where the video takes place, would expose the film to bright light and could ruin some or all of the photos.


Screenshot of Gemini in the Search demo video. (Google)

Google has faced similar problems with previous AI products.

Last year, a Google demo video showed its Bard chatbot incorrectly stating that the James Webb Space Telescope took the first image of a planet outside our solar system.

Earlier this year, Google’s Gemini chatbot was criticized for refusing to generate images of white people. It was accused of being too “woke” after producing images riddled with historical inaccuracies, such as Asian Nazis and Black founding fathers. Google executives apologized, saying the feature had “missed the mark.”

Tuesday’s video highlights the dangers of AI chatbots, which hallucinate, make incorrect predictions, and give users bad advice. Last year, users of Bing, Microsoft’s AI chatbot, reported strange interactions with the bot: it called users delusional, argued with them about what year it was, and even declared its love for some of them.

Companies deploying such AI tools may also be legally responsible for what their bots say. In February, a Canadian tribunal held Air Canada liable after its chatbot gave a passenger incorrect information about bereavement fares.

Google did not immediately respond to a request for comment sent outside of normal business hours.

Source: Business Insider
