That's just it, these “facts” won’t be on the web for stuff from approximately 2005 and before. Nowhere on the web is the racist and homophobic shit I was taught in the ’80s and ’90s listed on some wiki.
LLMs are mostly useless anyway at distinguishing real information; they’re just shit summary tools and poor search engines.
I would probably start out by proofing or approving submissions before they post to the site. Say I get a notification, read it, do a little research on it, and eventually get to a point where I can use a large language model to sift through the submissions.
You’d probably need to verify all submissions
Unless you throw an LLM into the mix
Or maybe there’s already some resources giving you all debunked facts with their dates
You believe an LLM can be used to distinguish facts from fiction? I wonder up to which year that misconception was taught in school.
The whole point of LLMs is to convince their users that the “facts” they generate are actual facts.
They can browse the web, and I never meant it would be 100% accurate, just easier. I don’t think this is going to be a mission-critical website.
LLM is plausible deniability!
LLMs are not magic; otherwise one would just have to require that every submission include references to reputable sources.
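That “require references” idea doesn’t even need an LLM — a dumb pre-screen can hold back submissions that cite no reputable source before a human ever looks at them. A minimal sketch, assuming a hand-picked allowlist of domains (the list and all names here are illustrative, not any real site’s moderation code):

```python
# Hypothetical pre-moderation sketch: queue every submission and flag
# ones with no link to an allowlisted domain for extra human scrutiny.
import re
from dataclasses import dataclass, field

# Assumption: the site maintainer curates this list themselves.
REPUTABLE_DOMAINS = {"snopes.com", "britannica.com", "nature.com"}

@dataclass
class Submission:
    text: str
    flags: list = field(default_factory=list)

def pre_screen(sub: Submission) -> Submission:
    """Cheap rule-based check that runs before any human (or LLM) review."""
    # Pull the host part out of any http(s) URLs in the text.
    hosts = re.findall(r"https?://([\w.-]+)", sub.text)
    domains = {h.lower().removeprefix("www.") for h in hosts}
    if not domains & REPUTABLE_DOMAINS:
        sub.flags.append("no reputable source cited")
    return sub

queue = [
    Submission("This was debunked in 1998: https://www.snopes.com/example"),
    Submission("Trust me, this was taught everywhere."),
]
screened = [pre_screen(s) for s in queue]
print([s.flags for s in screened])  # only the second submission is flagged
```

This is obviously gameable (anyone can paste a link), but it filters the laziest submissions for free and leaves the judgment calls to the human in the loop.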