The article mentions more research is needed to confirm if the effect is long-lasting, but personally I’m happy someone may have found a good, practical use case for LLMs.
Lol, no. Anyone who believes this has no idea what it means to believe in conspiracy theories. Trust me, it’s not that they can’t find the Wikipedia page with the official information, or can’t turn on the TV and listen to what the news says.
It’s that they have no trust in those things. ChatGPT won’t change that.
Have you even read the article?
No just the headline. Article was good?
Does the article say the headline is wrong? Or does it claim conspiracy theorists listen to facts while relying on a handful of willing participants who changed their minds when shown facts and reports? Because willingness to engage isn’t the crux of the crazy conspiracy theorists.
Try again once the chatbot has talked to the likes of Graham Hancock or the hardcore MAGA death cult. Facts don’t matter to them.
Just look at this guy who straight up pretends that no one tried to talk to them before.
It does talk about the Gish gallop at the very end, and claims that the chatbot can keep presenting counterarguments, but it doesn’t actually say that this has worked.