Abstraction in CS

Most people confuse the idea of abstraction with the idea of information hiding. Abstraction does not mean "putting all the details behind some interface."

Abstraction is the process of isolating and identifying patterns, giving those now-recognized patterns names, and thinking and acting in terms of the patterns themselves rather than in terms of instances of those patterns.

Numbers are an abstraction we deal with every day starting from a very young age. Consider the number 5. You can't point to the number 5. You can't touch, smell, or taste the number 5. You have never once encountered the number 5 walking around outside.

That said, the number 5 is an abstraction of many concrete everyday experiences. You can't touch the number 5, but you can touch 5 apples. You can't smell the number 5, but you can smell 5 cows (and how!). You can draw me 5 smiley faces or 5 hearts.

What is 5 except something that all these groups-of-five have in common? It's an abstraction that isolates some aspects of these groups-of-five that we find relevant and only those aspects, discarding the particulars. It's powerful because it encapsulates 5-ness per se without needing to answer "5 of what?" It's not 5 of anything — it's just 5.

Even that symbol, "5", is still just a picture of the number 5. 5 is prime whether we write it as 5 (decimal) or 101 (binary) or V (Roman numerals) or IIIII (tally marks) or 五 (Japanese). The number 5 is not an interface. It is not "hiding details." It is a pure expression of a pattern we've isolated and elevated to a thing-in-itself.
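A quick sketch in Python makes the point concrete: every one of these notations is just a different doorway to the same abstract number, and 5-ness doesn't depend on what we have 5 of.

```python
# The same abstract number, reached through different concrete notations.
from_decimal = int("5", 10)
from_binary = int("101", 2)
from_tally = len("IIIII")   # counting tally marks

assert from_decimal == from_binary == from_tally == 5

# And 5-ness doesn't care what the 5 things are:
apples = ["apple"] * 5
hearts = ["♥"] * 5
assert len(apples) == len(hearts) == 5
```

The representations differ wildly; what survives every translation is exactly the abstraction.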

Did you know the first model of computation, the lambda calculus, was written down in 1936, long before any computer existed that could possibly implement it? What "information" or "implementation details" could this model have been hiding? The first fully programmable digital computer wouldn't be built for nearly another decade!

Instead, the lambda calculus was an attempt to isolate and define what we mean by "computation." It let us think more clearly about computation per se, and it let us start making statements about what computers could and could not do (even absent a working computer).
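The two threads of this essay meet in the lambda calculus itself: Church numerals define the numbers purely as patterns of function application, with no digits and no hardware in sight. Here's a minimal sketch in Python (the names `zero`, `succ`, and `five` are mine, not standard library functions):

```python
# Church numerals: the numeral n means "apply f, n times."
# No bits, no digits -- just a pattern, isolated and named.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

five = succ(succ(succ(succ(succ(zero)))))

# To observe a Church numeral, hand it a concrete f and x:
assert five(lambda k: k + 1)(0) == 5
```

The numeral `five` is 5-ness per se: it never answers "5 of what?" until you supply a concrete function and starting value.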