2026-01-30 06:32:27 UTC

Zsubmariner on Nostr:

2/3

There is a deeper information-theoretic problem with how a lot of people think AI will operate, one that most miss. (Including experts, because it's a really fundamental information-theory point that touches on metaphysics.)

I worked with neural networks and other machine learning models for nearly a decade. I operated a system with thousands of small natural language models in production for several years. There is something called "model collapse" that kills all closed loops.

Models need fresh input from living intelligence and can't be trained on their own output, or they collapse. It's a law. Some people disagree because they want closed loops to self-sustain, but they can never make it work, because they are wrong; so they just cope.
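The collapse dynamic is easy to demonstrate with a toy sketch (my own illustration, not any production system): fit a Gaussian to some data, sample from the fit, refit to the samples, repeat. With finite samples, sampling noise and estimator bias compound every generation, and the spread drifts toward zero.

```python
import random
import statistics

def fit_and_sample(data, n):
    # "Train" a toy model on the data (fit a Gaussian),
    # then generate n purely synthetic samples from the fit.
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(42)
data = [random.gauss(0.0, 1.0) for _ in range(20)]  # fresh "living" input
stds = [statistics.stdev(data)]

for generation in range(500):
    data = fit_and_sample(data, 20)  # closed loop: train on own output
    stds.append(statistics.stdev(data))

# stds shrinks generation after generation: the distribution
# homogenizes toward a single point instead of staying rich.
```

No fresh outside data ever enters the loop, so in this toy the variance can only wander downward; injecting real samples each generation is what keeps the spread alive.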

These models produce no actual intelligence (and the name “AI” is an oxymoron) because intelligence is an emanation of new information from a living thing, not a dead statistical process. Attempts to remove humanity from the loop won’t just get stuck at average; they will utterly collapse into homogenized, error-amplifying mush.

The modeling is actually just lossy compression. No magic involved. A xerox of a xerox only gets worse, and the degradation compounds exponentially. As I keep telling the quantum people, thermodynamics always wins.

(Isn't it funny that fiat, quantum and AGI are all based on the same broken metaphysics and the same claim that they are somehow going to defeat entropy? But here I am trashing modernism again.)