• Honytawk@lemmy.zip · 3 days ago

    I mean, you are using an LLM for search, so that was bound to happen.

    Using something like Perplexity.ai for search always gives me great results, with links so you can find out where the AI got its info from.

    This isn’t an AI problem, it’s a user problem.

      • Honytawk@lemmy.zip · 2 days ago

        I’m not justifying Google’s grift.

        All I’m saying is that an LLM is only decent at generating text; it sucks at everything else. So it isn’t logical to use one as a search engine. Google is also to blame for that: they are slapping AI onto whatever they have running, like most tech companies these days.

        Compare that to Perplexity, which is a search engine first and only uses an LLM to summarize websites that are riddled with SEO. It then adds links so you can check for yourself whether the LLM is hallucinating or not.
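        Roughly, that pattern looks like this (a minimal sketch in Python; web_search and llm_summarize are made-up placeholders, not Perplexity’s actual internals):

        ```python
        # Sketch of the "search first, summarize second, cite everything" pattern.
        # The two helpers below are stand-ins, not a real search or LLM API.
        from dataclasses import dataclass

        @dataclass
        class Page:
            url: str
            text: str

        def web_search(query: str, limit: int = 5) -> list[Page]:
            # Stand-in for a real search index / API call.
            return [Page("https://example.com", "...retrieved page text...")][:limit]

        def llm_summarize(prompt: str) -> str:
            # Stand-in for whatever model writes the summary.
            return "...summary citing [1]..."

        def answer_with_sources(query: str) -> str:
            pages = web_search(query)  # retrieval happens before the LLM sees anything
            context = "\n\n".join(f"[{i+1}] {p.url}\n{p.text}" for i, p in enumerate(pages))
            summary = llm_summarize(
                f"Answer '{query}' using only the numbered sources below:\n\n{context}"
            )
            links = "\n".join(f"[{i+1}] {p.url}" for i, p in enumerate(pages))
            return f"{summary}\n\nSources:\n{links}"  # the links let you verify the claims
        ```

        The point is the LLM only rewrites what the search step already found, and the reader gets the source URLs either way.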

        Don’t be like a techbro conflating LLMs with general AI. They are different things, and the sooner all users understand this, the better.

      • Honytawk@lemmy.zip · 2 days ago

        Google is also just a user of this LLM technology. They understand as much of it as the average techbro.

        That is why they are switching their search engine to an LLM, like the morons they are.