3 events
Sep 16 at 2:22 · comment added · forest distrusts StackExchange: @JourneymanGeek Although there are a few LLMs with "reasoning" ability that somehow involves double-checking certain claims, in this case it is simply trying to emulate a human response, along with the uncertainty. It has no idea what it is saying and has neither the intention nor the capability to follow up of its own accord.
Sep 16 at 1:10 · comment added · Journeyman Geek: With the last one... I wonder how exactly a computer double-checks things?
Sep 15 at 22:10 · history · answered by forest distrusts StackExchange · CC BY-SA 4.0