
Some thoughts about LLMs

1. "LLMs confidently give wrong answers/lies" - anybody who has tried to get information from an LLM.

LLMs have no feelings or emotions, so they cannot "feel confident". What we perceive as confidence is simply their lack of expressed doubt.

(1/3)

2. "The ability to speak doesn't make you intelligent" - from a recently popular toot.

The main ability of LLMs is to place token after token (a token is a word or a fragment of a word) very quickly. To speak is to transform your thoughts into a form that others can understand. LLMs have no thoughts. Heck, they don't even understand words, because they deal in tokens. Jar Jar Binks actually meets this definition of intelligence.
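That "place token after token" loop can be sketched in a few lines. This is a toy illustration only: real LLMs use a learned neural network over subword tokens, not a lookup table, and the tiny bigram table below is entirely made up for the example.

```python
# Toy sketch of next-token generation: repeatedly pick a continuation
# for the most recent token. A real LLM predicts a probability
# distribution over tens of thousands of tokens; here a hand-made
# bigram table (an assumption for illustration) stands in for that.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(start: str, n_tokens: int) -> list[str]:
    """Append the 'most likely' next token n_tokens times."""
    tokens = [start]
    for _ in range(n_tokens):
        nxt = BIGRAMS.get(tokens[-1])
        if nxt is None:  # no known continuation: stop early
            break
        tokens.append(nxt)
    return tokens

print(" ".join(generate("the", 4)))  # the cat sat on the
```

Note that nothing here "understands" anything: the loop emits whichever token the table points to next, which is the mechanical heart of the point above.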

(2/3)

3. "When the LLM made up some stuff you didn't like, it was a hallucination" - Sam Altman, probably?

LLMs cannot think. They also had no sense organs until they recently became multi-modal. They have no sense of right/wrong/meaning/nonsense, etc.; all of that comes from the human interpreting the signals coming out of the LLM. Does this make their users computer astrologers? It doesn't matter. Like a stopped clock, they are sometimes right, and they can do novel things, so people keep using them.

(3/3)

@alcinnz It's possible that some of @baldur 's writing, which I read last year, influenced my understanding of LLMs. Also, the "stochastic parrots" paper.

I wasn't aware that Baldur wrote an entire book on this subject. "The Intelligence Illusion" is such a perfect title. That's what I would've called it too.