Showing posts from 2026

Claude Explains Representationalism

This is a little discussion I had with Claude, the LLM agent created by Anthropic. My prompts are in block-quotes. The rest of the text is from Claude. For more on the nature of knowledge, see Theories of Knowledge and Knowledge and Reality.

✦ ✦ ✦

    Explain how the principle of information compression relates to LLMs.

Information Compression and LLMs

The relationship is deep and arguably foundational to what LLMs are.

The core idea

Information compression, in the technical sense, is about finding compact representations that preserve the essential structure of data while discarding redundancy. A good compressor must learn the underlying patterns — the grammar, regularities, and dependencies — of the data it's compressing. LLMs are, in a meaningful sense, learned compressors of human language and knowledge.

Training as compression

When an LLM is trained on a vast corpus of text, it's forced to distill billions of documents into a fi...
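The point about compressors learning structure can be illustrated with an ordinary general-purpose compressor. This is a minimal sketch (not from the original discussion), using Python's standard `zlib`: text with strong regularities compresses far below its original size, while high-entropy data of the same length barely compresses at all.

```python
import random
import zlib

def ratio(data: bytes) -> float:
    """Compressed size divided by original size (lower = more structure found)."""
    return len(zlib.compress(data, 9)) / len(data)

# Highly patterned data: the compressor can exploit the repetition.
patterned = b"the cat sat on the mat. " * 400

# High-entropy data of the same length (fixed seed for reproducibility):
# there is no structure to learn, so little redundancy to discard.
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(len(patterned)))

print(f"patterned ratio: {ratio(patterned):.3f}")  # far below 1.0
print(f"noisy ratio:     {ratio(noisy):.3f}")      # near (or above) 1.0
```

The gap between the two ratios is, in miniature, the sense in which compression forces a model of the data: the better the compressor's implicit model of the regularities, the shorter its encoding.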

Platonism | Representationalism Debate

Talking with Ethan Howell about knowledge. Moderated by Eric Claussen.

Another Talk with Garrick