Language may be less about transmitting as much as possible, and more about saying just enough to be understood.

Why do very different languages seem to converge on a similar rate of communication? A common answer is that humans are hitting a hard cognitive ceiling. Our new paper argues something subtler: language is not optimized for maximum throughput, but for shared understanding under real-world conditions of noise, ambiguity, limited memory, and conversational repair.
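For readers new to the framing, the classical rate–distortion function from information theory (standard background, not the paper's specific model) gives the minimum rate needed to represent a source X within an expected distortion budget D:

$$
R(D) = \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\,\le\,D} I(X;\hat{X})
$$

On this view, a language running below the channel's maximum rate is not underperforming; it is spending just enough rate to keep distortion (read: misunderstanding) within tolerance.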

"We Speak Through Shared Worlds: A Rate–Distortion View of Human Language" proposes that communication works inside a context-adaptive regime of collaborative compression. Shared history, expertise, culture, and common ground are not just background to communication; they are part of its compression machinery.
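To make "shared context as compression machinery" concrete, here is a minimal Python sketch (an analogy, not the paper's model) using zlib's preset-dictionary feature: the same message compresses to fewer bytes when sender and receiver share a dictionary, and only a receiver holding that dictionary can decode the short form. The example strings are hypothetical.

```python
import zlib

# Toy illustration: a preset dictionary stands in for common ground,
# i.e. vocabulary both parties already share.
shared_context = (
    b"rate-distortion communication noise ambiguity repair "
    b"common ground shared understanding compression"
)
message = b"communication adapts compression to shared common ground"

def deflate_size(msg: bytes, zdict: bytes = b"") -> int:
    """Deflate-compressed size of msg, optionally primed with a
    preset dictionary that models shared context."""
    c = zlib.compressobj(level=9, zdict=zdict) if zdict else zlib.compressobj(level=9)
    return len(c.compress(msg) + c.flush())

print("without shared context:", deflate_size(message), "bytes")
print("with shared context:   ", deflate_size(message, shared_context), "bytes")

# Decoding the short form requires the same dictionary: a receiver
# without the shared context cannot reconstruct the message.
c = zlib.compressobj(level=9, zdict=shared_context)
payload = c.compress(message) + c.flush()
d = zlib.decompressobj(zdict=shared_context)
assert d.decompress(payload) == message
```

The point of the analogy: the short encoding is only short because the receiver already holds the context, which is the sense in which common ground does compressive work.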

Read the paper.

Browse the deck.

Run the simulation.

Watch the video as you look through the deck.
