What does “you can’t lick a badger twice” mean?
Like many English sayings – “a bird in the hand is worth two in the bush,” “a watched pot never boils” – this one isn’t even true. Frankly, nothing stops you from licking a badger as often as you like, though I don’t recommend it.
(I’m sure Business Insider’s lawyers would like me to insist that you exercise caution when encountering wildlife, and that we cannot be held responsible for any rabies infections.)
If the phrase doesn’t ring a bell, that’s because, unlike “ring a bell,” it is not in fact a real saying – or idiom – in English.
But Google’s AI Overview certainly thinks it’s real, and it will happily give you a detailed explanation of what the phrase means.
Greg Jenner, a British historian and podcaster, saw people talking about this phenomenon on Threads and wanted to try it himself with a made-up idiom. The badger phrase “just popped into my head,” he told Business Insider. His Google search spat out an answer that seemed plausible.
I wanted to try this myself, so I invented a few fake phrases – like “you can’t put a duck in a pencil” – and added “meaning” to my search query.
Google took me seriously and explained:
“You can’t put a duck in a pencil.” Business Insider
So I tried others, like “the road is full of salsa.” (That’s one I’d personally like to see used in real life.)
A Google spokesperson told me, essentially, that its AI systems do their best to give you what you want – but that when people deliberately try to trip them up, sometimes the AI can’t quite keep up.
“When people do nonsensical or ‘false premise’ searches, our systems will try to find the most relevant results based on the limited web content available,” said spokesperson Meghann Farnsworth.
“This is true of Search overall – and in some cases, AI Overviews will also be triggered in an effort to provide helpful context.”
“The road is full of salsa.” Business Insider
Basically, AI Overviews aren’t perfect (duh), and these fake idioms are “false premise” searches deliberately designed to trip them up (fair enough).
Google tries to keep AI Overviews from answering queries that hit “data voids” – that is, questions for which there are no good web results.
But clearly, that doesn’t always work.
I have some ideas about what’s going on here – some of it is good and useful, some of it isn’t. As one might even say, it’s a mixed bag.
But first, another made-up phrase that Google tried to find a meaning for: “Don’t kiss the door handle.” Google’s AI Overview said:
“Don’t kiss the door handle.” Business Insider
So what’s going on here?
The good:
English is full of idioms like “kick the bucket” or “piece of cake.” These can be confusing if English isn’t your first language (and frankly, they often confuse native speakers too). Case in point: the phrase commonly mangled as “case and point.”
So it makes a lot of sense that people often Google a phrase they’ve encountered and don’t understand. And in theory, this is a great use case for AI Overview answers: you want the answer stated plainly right away, without clicking through to a link.
The bad:
AI should be really good at this particular thing. LLMs are trained on vast quantities of written English – books, websites, YouTube transcripts, and so on – so recognizing idioms is something they ought to be very good at.
The fact that it makes mistakes here isn’t ideal. What’s going wrong that Google’s AI Overview doesn’t give the real answer, which is “that’s not a phrase, silly”? Is this just a classic hallucination?
The ugly:
By comparison, ChatGPT gave a better answer when I asked it about the badger phrase. It told me it wasn’t a standard English idiom, even though it had the vaguely folksy ring of one. Then it offered: “If we treat it as a real idiom (just for fun),” and gave a possible definition.
So this isn’t a problem across all AI – it seems to be a Google problem.
Can’t lick a badger twice? Reuters / Russell Cheyne
This is somewhat different from last year’s Google AI Overview fiasco, when results pulled information from sources like Reddit without accounting for sarcasm – remember when it suggested people eat rocks for minerals or put glue on their pizza? (Someone on Reddit had once joked about pizza glue, which seems to be where that came from.)
These are all pretty low-stakes issues, and making up fake phrases is silly fun, but it points to the bigger, uglier problems with AI becoming increasingly tangled up in how we use the internet. It means Google Search is, in some ways, worse, and as people come to rely on it, bad information gets out into the world and gets treated as fact.
Of course, AI search will get better and more accurate, but what growing pains will we endure while we’re in this in-between phase of an internet that’s a little wobbly, a little junky, and filled with slop?
AI is here, and it’s already changing our lives. There’s no going back; the horse has left the barn. Or, as they say, you can’t lick a badger twice.