boem@lemmy.world to Technology@lemmy.world · English · 10 months ago
Researchers confirm what we already knew: Google results really are getting worse (www.theregister.com)
101 comments · cross-posted to: degoogle@lemmy.ml, hackernews@derp.foo
Lmaydev@programming.dev · 10 months ago
The newer ones search the internet, generate from the results rather than from their training data, and provide sources. So that's not such a worry now. Anyone who used ChatGPT for information rather than text generation was always using it wrong.
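The "search first, then generate from the results, then cite" behaviour described above is essentially retrieval-augmented generation. Here's a minimal toy sketch of that pattern; the index, URLs, and both functions are made-up stand-ins for illustration, not any real search or LLM API:

```python
# Toy sketch of retrieval-augmented generation (RAG): retrieve text first,
# then compose an answer grounded in those snippets, citing the sources.
# TOY_INDEX and both functions are hypothetical stand-ins.

TOY_INDEX = {
    "https://example.com/google-study": (
        "A study found that search results for product queries "
        "are increasingly dominated by affiliate spam."
    ),
    "https://example.com/seo-spam": (
        "Spam pages are often generated automatically to capture clicks."
    ),
}

def search(query: str) -> list[tuple[str, str]]:
    """Toy keyword search: return (url, snippet) pairs sharing a word."""
    words = set(query.lower().split())
    return [
        (url, text)
        for url, text in TOY_INDEX.items()
        if words & set(text.lower().split())
    ]

def answer_with_sources(query: str) -> str:
    """Ground the 'generation' in retrieved snippets, not model memory."""
    hits = search(query)
    if not hits:
        return "No sources found; not answering from memory."
    body = " ".join(snippet for _, snippet in hits)
    cites = ", ".join(url for url, _ in hits)
    return f"{body} (sources: {cites})"

print(answer_with_sources("study on search results spam"))
```

The key design point is the fallback: when retrieval returns nothing, the system declines rather than generating unsupported text, which is what makes this usage different from asking a bare model for facts.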
BakerBagel@midwest.social · 10 months ago
Except people are using LLMs to generate web pages about anything that gets clicks. That means LLMs end up training on information generated by other LLMs. It's an ouroboros of fake information.
Lmaydev@programming.dev · edited · 10 months ago
But again, if you use an LLM's ability to understand and generate text on top of a search engine, that doesn't matter. LLMs are not supposed to give factual answers; that's not their purpose at all.