After building AI infrastructure for years, I’ve developed a simple mental model for intelligence.

Intelligence is compression.

It’s the ability to take a vast, chaotic field of information and reduce it to a small number of principles that actually explain what’s happening. The better the compression, the deeper the understanding.

This is true for machines. It’s also true for people.

What AI Models Actually Do

Strip away the hype and an AI model is fundamentally a compression engine. You feed it terabytes of data, and it distills that into a set of weights — patterns, relationships, statistical regularities — that can generate useful outputs from new inputs.

The quality of the model is the quality of its compression. A model that memorizes its training data hasn’t learned anything. A model that extracts transferable patterns has.
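One way to make the analogy concrete is with an ordinary compressor: zlib shrinks data exactly to the extent that the data contains repeated structure. A minimal sketch (illustrative only — a model's compression is far richer than byte-level pattern matching):

```python
import os
import zlib

# Patterned data: one simple regularity, repeated many times.
patterned = b"request -> response; " * 500
# Noise: the same number of bytes, but no structure to exploit.
noise = os.urandom(len(patterned))

# The pattern compresses dramatically; the noise barely at all.
ratio_pattern = len(zlib.compress(patterned)) / len(patterned)
ratio_noise = len(zlib.compress(noise)) / len(noise)
```

Where there is a pattern, there is something to learn; where there is only noise, "memorizing" is the best any system can do.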

This is exactly how expertise works in humans.

A junior engineer looks at a production incident and sees chaos — logs everywhere, metrics spiking, alerts firing. A senior engineer looks at the same data and sees a pattern they’ve seen before, compressed into a diagnosis: “the connection pool is exhausted because that service’s health check is holding connections open.”

Same data. Different compression. Wildly different outcomes.

Compression as a Design Principle

This mental model changes how I approach system design.

Good architecture is compressed complexity. A well-designed system takes a complex problem and presents a simple interface. The complexity doesn’t disappear — it gets organized, contained, and hidden behind clean abstractions. That’s compression.
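A small, hypothetical example of what that looks like in code: a rate limiter whose entire public surface is one method. The token-bucket arithmetic doesn’t disappear — it’s contained behind `allow()`:

```python
import time

class RateLimiter:
    """Simple interface: allow() -> bool. The token-bucket math stays inside."""

    def __init__(self, rate: float, burst: int):
        self.rate = rate            # tokens replenished per second
        self.burst = burst          # maximum bucket size
        self.tokens = float(burst)  # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Replenish tokens for the elapsed interval, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Callers never see a token, a bucket, or a clock. They ask one question — “may I proceed?” — and that’s the compression.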

Good documentation compresses understanding. The best technical documents don’t describe every detail. They identify the three or four things you need to understand to reason about the system effectively. Everything else follows from those core ideas.

Good monitoring compresses signals. Nobody needs a dashboard with 200 metrics. You need 5 metrics that tell you whether the system is healthy, and clear paths to drill deeper when it’s not.
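As a sketch of that rollup (the metric names and thresholds here are invented for illustration, not a recommendation):

```python
def health(metrics: dict) -> str:
    """Compress many raw metrics into one answer: is the system healthy?"""
    checks = [
        metrics["error_rate"] < 0.01,       # under 1% of requests failing
        metrics["p99_latency_ms"] < 500,    # tail latency within budget
        metrics["saturation"] < 0.8,        # headroom on the busiest resource
    ]
    return "healthy" if all(checks) else "degraded"
```

The 200-metric dashboard still exists — it’s where you drill down once this single answer says “degraded.”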

The Anti-Pattern

The opposite of compression is sprawl. And sprawl is the default state of every system that isn’t actively maintained.

  • Microservices that multiply without clear boundaries
  • Configuration that scatters across twenty different files and environment variables
  • Knowledge that lives in one person’s head instead of in the architecture itself

Sprawl is what happens when you add without compressing. When you solve each new problem in isolation instead of recognizing the pattern underneath.

The Practice

Every week, I try to compress something. Take a complex process and reduce it. Take a sprawling system and find its essential structure. Take an explanation that requires thirty minutes and distill it to three sentences.

Not because brevity is inherently good. But because compression forces understanding. You can’t simplify what you don’t fully grasp.

The question isn’t whether you’re intelligent. It’s whether you’re compressing. Whether you’re taking the noise of experience and distilling it into signal.

That’s the practice. That’s the work.