The Unix Philosophy, Revisited
In 1978, Doug McIlroy summarized the Unix philosophy in four rules. Nearly fifty years later, they read like a design manifesto for the best software being built today.
- Write programs that do one thing and do it well.
- Write programs to work together.
- Write programs to handle text streams, because that is a universal interface.
The fourth: build a prototype as soon as possible. Throw the first one away; you will anyway.
Do One Thing Well
grep searches. sort sorts. wc counts. cut extracts columns. None of them do anything else.
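Composed, those single-purpose tools cover cases none of them handles alone. A minimal sketch, with made-up input data: cut extracts a column, sort orders it, tail keeps the maximum.

```shell
# Each tool does exactly one job; the pipeline does the rest.
# cut pulls out the count column, sort -n orders it numerically,
# and tail -n1 keeps the largest value.
printf 'alice,3\nbob,1\ncarol,2\n' | cut -d',' -f2 | sort -n | tail -n1
# prints 3
```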
This seems like an obvious constraint until you try to apply it to the software you’re building. Most programs gradually accumulate features that belong elsewhere. A JSON formatter that also makes HTTP requests. A test runner that also manages environment variables. A deployment tool that is somehow also a package manager.
The cost of scope creep isn’t just complexity; it’s that every feature the program takes on displaces a specialized tool that could do the job better. The space of composability shrinks.
When I’m deciding whether a piece of functionality belongs in a module, I ask: if I extracted this into its own tool, would it be useful on its own? If yes, it probably should be its own thing.
Write Programs to Work Together
The tools that endure are the ones that don’t assume they’re alone. They read from stdin, write to stdout, signal errors on stderr, and exit with a meaningful status code.
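That contract is easy to honor. Here is a minimal sketch of a well-behaved filter; `upcase` is a hypothetical name, not a real utility.

```shell
#!/bin/sh
# upcase: read text from stdin, write the uppercased result to stdout.
# Diagnostics go to stderr, and the exit status reports success or
# failure, so callers and pipelines can react to it.
if ! tr '[:lower:]' '[:upper:]'; then
    echo "upcase: could not process input" >&2
    exit 1
fi
```

Because it touches nothing but the three standard streams, it drops into any pipeline unchanged.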
# Four programs, one pipeline
git log --oneline | grep 'feat:' | cut -d' ' -f1 | xargs git show
None of these tools were designed to work together. They work together because they all agreed on a common interface: text streams.
The modern equivalent is the principle behind Unix pipes applied to services: design your APIs so they can be composed. Return structured data. Accept structured input. Don’t bake in assumptions about what comes before or after.
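Applied to plain shell tools, the idea looks like this sketch (the service names are invented): one stage emits one record per line in a simple tab-separated shape, and the next stage filters on that shape without knowing what produced it.

```shell
# Stage one: emit structured records, one per line (name, then status).
# Stage two: awk consumes that structure and keeps the healthy services,
# with no assumption about what ran before or what runs after.
printf 'db\tup\ncache\tdown\nqueue\tup\n' \
  | awk -F'\t' '$2 == "up" { print $1 }'
```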
The Universal Interface
Text is the universal interface because every program can read and write it without negotiating a protocol. Binary formats are faster, but they require both sides to agree on an encoding. Text requires nothing: if you can print, you can talk to anything.
JSON is the text stream of the networked era. It’s not perfect, but it’s universal. The tools that adopted it early, jq chief among them, and the entire JSON API ecosystem benefit from the same composability that Unix pipes enabled.
curl -s api.example.com/users \
| jq '.[] | select(.active) | .email' \
| sort \
| uniq
This is a Unix pipeline. The middle tool is a language, not a command. The philosophy scales.
The Prototype You’ll Throw Away
The fourth principle is the most underrated and the most violated. McIlroy says to build a throwaway prototype. Not a “rough draft” that becomes production — a genuine prototype you plan to discard.
The reason is epistemic: you don’t know what you’re building until you’ve built it. The first version reveals the actual requirements. The second version, built with that knowledge, is the real thing.
Most teams can’t bring themselves to throw code away. It took time to write. It works, mostly. Rewriting feels like waste. But the cost of the first design living forever is borne by everyone who has to maintain it — and it compounds.
The software I admire most has obviously been redesigned. The abstractions are crisp, not accumulated. The interfaces make sense, rather than reflecting the history of decisions that led to them.
The Unix philosophy is not a historical curiosity. It’s a practical checklist. Before shipping a feature, I still ask: does this belong here? Can it work with things that don’t know it exists? Would I build it this way if I were starting over?
Usually, the honest answers are illuminating.