
Google’s latest nonsense AI Overview results highlight another problem with AI

By PineapplesUpdate · April 27, 2025 · 4 Mins Read


You may not be familiar with the phrase “peanut butter platform heels,” but the AI claimed it arises from a scientific experiment in which peanut butter was converted into a diamond-like structure under very high pressure, hence the “heels” reference.

Except this never happened. The phrase is complete nonsense, but when author Meghan Wilson-Anastasios asked about it, Google’s AI Overview gave it a definition and a backstory, as shown in this Threads post (which includes some other entertaining examples).

The internet picked it up and ran with it. Apparently, “you can’t lick a badger twice” means you can’t trick someone twice (Bluesky), “a loose dog won’t surf” means something is unlikely to happen (Wired), and “cycle eats furs” got its own explanation too (Futurism).

Google, however, is not playing along. I was eager to put together my own collection of made-up phrases and their invented meanings, but it seems the trick is no longer possible: Google will now refuse to show an AI Overview, or will tell you that you’re wrong, if you try to get an explanation of a nonsense phrase.

If you go to an actual AI chatbot, it’s a bit different. I ran some quick tests with Gemini, Claude, and ChatGPT, and the bots try to explain these phrases logically, while also noting that they appear to be meaningless and don’t seem to be in general use. That’s a much more nuanced approach, with context that AI Overviews lack.

Someone on Threads noticed you can type any random sentence into Google, then add “meaning” afterwards, and you’ll get an AI explanation of a famous idiom or phrase you just made up. Here is mine [image or embed]

– Greg Jenner (@gregjenner.bsky.social), April 23, 2025 at 11:15 PM

Now, AI Overviews are still labeled as “experimental,” but most people won’t pay much attention to that. They will assume the information they’re seeing is accurate and reliable, built on information scraped from web articles.

And while Google’s engineers may have patched this particular type of mistake, much like the glue-on-pizza error last year, it probably won’t be long before another issue crops up. It speaks to some basic problems with getting all of our information from AI rather than from references written by real humans.

    What’s going on?

Fundamentally, these AI Overviews are designed to provide answers and synthesize information even if there’s no exact match for your query, which is where this phrase-definition problem begins. The AI feature is also not the best judge of what is and isn’t reliable information on the internet.

Looking for a fix for a laptop problem? Previously you’d get a list of blue links from Reddit and various support forums (and perhaps Lifehacker), but with AI Overviews, Google sucks up everything it can find on those links and tries to patch together a smart answer, even if no one has had the specific problem you’re asking about. Sometimes that can be helpful, and sometimes you might end up making your problems worse.


[Image] Credit: Lifehacker

In fact, I’ve also noticed that AI bots have a tendency to agree with prompts, confirming whatever a prompt says even when it’s wrong. These models are eager to please and want to be helpful, even when they can’t be. Depending on how you word your query, you can get AI to agree with something that isn’t right.

I didn’t manage to get any nonsense idioms defined by Google’s AI Overviews, but I did ask the AI why R.E.M.’s second album was recorded in London: it was down to the choice of producer Joe Boyd, the AI Overview told me. But in fact, R.E.M.’s second album wasn’t recorded in London; it was recorded in North Carolina. It’s the third LP that was recorded in London, and which was produced by Boyd.

The actual Gemini app gives the correct response: the second album was not recorded in London. But the way AI Overviews try to combine many online sources into a coherent whole seems dubious in terms of accuracy, especially if your search query makes some confident claims of its own.

[Image] Google AI Overview

With the right prompting, Google will get its music chronology wrong.
Credit: Lifehacker

“When people do nonsensical or ‘false premise’ searches, our systems will try to find the most relevant results based on the limited web content available,” Google told Android Authority in an official statement. “This is true of Search overall, and in some cases, AI Overviews will also trigger in an effort to provide helpful context.”

It feels like we’re barreling toward a search engine that always responds with AI instead of information compiled by real people. But of course, AI has never fixed a tap, tested an iPhone camera, or listened to R.E.M.; it can only repackage the words of people who have.
