I’m an old-time Unix guy who’s still more comfortable on the command line than in a big GUI. That’s probably why I enjoy living in Emacs to the extent possible. One of the most powerful concepts from Unix is the idea of small tools that each do one thing (usually to a text stream) well, connected with pipes to accomplish a larger, more complicated task.
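As a concrete (and entirely hypothetical) illustration of the small-tools idea, here's the classic word-frequency pipeline. None of this comes from esr's post; it's just the canonical sort of example, where each stage does one small job and the pipe hands its output to the next:

```shell
# Count the most common words in a text stream.
# Each tool does one small job; pipes connect them into a larger task.
printf 'foo bar foo baz foo bar\n' |
  tr ' ' '\n' |   # split: one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn |      # most frequent first
  head -n 3       # keep the top three
```

Run it and you get `foo` (3), `bar` (2), `baz` (1), each computed without any single tool knowing about the overall task.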
I was looking through my queue of blog ideas and came across this Eric Raymond post from last March. In it, he introduces the concept of “semantic locality” as a way of explaining when and how a set of small tools operating on a text stream works. You really need to read his post to understand his argument, but the TL;DR is that the text stream concept works when the data has semantic locality—that is, when you can do useful work on small pieces of data that are (mostly) contiguous. You can see right away why data with semantic locality is susceptible to the small-tools approach. Even better, as esr points out, it tells you when that approach probably won’t work.
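To make the idea concrete with a made-up example (mine, not esr's): a line-oriented log has high semantic locality because every line carries its complete meaning on its own, so a filter can act on any line without seeing the rest of the stream:

```shell
# Hypothetical log: each line is self-contained, so line-oriented
# tools can do useful work on it directly.
printf '2024-03-01 ERROR disk full
2024-03-01 INFO started
2024-03-02 ERROR timeout
' | grep -c ERROR   # count the error lines
# By contrast, a deeply nested JSON document has little semantic
# locality: no single line means anything in isolation, so a plain
# grep/sort/uniq pipeline is the wrong tool for it.
```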
Semantic locality is really just a formalization of our intuition. Even if you’ve never heard of it, you can—if you’re an experienced Unix hand—probably tell when a pipeline is a good idea and when it isn’t. It’s useful, though, because it gives us a systematic way of thinking about the problem. I found the post interesting, and if you’re a Unix guy or gal you probably will too.