The other day, I discovered this wonderful video from 2009. It’s an informal chat/whiteboard session with Rich Hickey, the creator of Clojure, on Microsoft’s Channel 9.
Watching it really helped solidify in my brain various concepts around Lisp and Clojure that I’ve been trying to learn recently in my spare time. I didn’t grasp every tangential reference to the various debates in the world of languages and compilers (and there are many of them in this video!), but it’s still worth watching for the interested layperson who’s trying to delve into the world of functional programming.
The best part, for me, is Rich’s explanation of how data structures in Clojure can be immutable yet still fast to “copy.” This seems counterintuitive: on the face of it, you would think that copying a whole data structure on every change must be very slow. The secret is that no deep copy ever happens. The underlying implementation of collections like vectors and maps uses shallow, wide trees, so a “copy” shares nearly all of its structure with the original: only the short path from the root down to the changed node is cloned, and a new root is allocated that references both the cloned path and the untouched subtrees, leaving the old version fully intact. The discussion is wonderfully elegant.
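To make the idea concrete for myself, here’s a minimal sketch in Python of that path-copying trick, using a toy persistent structure of my own invention: a fixed-depth binary tree over indices (Clojure’s real vectors use much wider 32-way tries, but the principle is the same). “Updating” an element clones only the O(log n) nodes on the path to the affected leaf; every other subtree is shared between the old and new versions.

```python
class Node:
    """An internal tree node; leaves are the stored values themselves."""
    __slots__ = ("left", "right")

    def __init__(self, left, right):
        self.left, self.right = left, right


def build(values):
    """Build a complete binary tree; len(values) must be a power of two."""
    nodes = list(values)
    while len(nodes) > 1:
        nodes = [Node(nodes[i], nodes[i + 1]) for i in range(0, len(nodes), 2)]
    return nodes[0]


def get(node, index, depth):
    """Walk from the root, using one bit of the index per level."""
    for level in reversed(range(depth)):
        node = node.right if (index >> level) & 1 else node.left
    return node


def assoc(node, index, value, depth):
    """Return a NEW root with values[index] replaced.

    Only the nodes along the root-to-leaf path are allocated;
    all untouched subtrees are shared with the original tree.
    """
    if depth == 0:
        return value
    bit = (index >> (depth - 1)) & 1
    if bit:
        return Node(node.left, assoc(node.right, index, value, depth - 1))
    return Node(assoc(node.left, index, value, depth - 1), node.right)


old = build([0, 1, 2, 3, 4, 5, 6, 7])   # depth-3 tree over 8 values
new = assoc(old, 5, 99, 3)              # "modify" index 5

print(get(old, 5, 3))        # the old version is untouched -> 5
print(get(new, 5, 3))        # the new version sees the change -> 99
print(old.left is new.left)  # the subtree for indices 0-3 is shared -> True
```

What struck me is how little machinery this needs: because no node is ever mutated after creation, the old root and the new root can safely share the subtrees they have in common, which is exactly why both versions remain valid forever.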
Having used only imperative languages my entire life, I’ve found this recent foray into Lisps and functional programming really challenging and eye-opening. Putting aside its alleged strengths and advantages, I find the mathematical flavor of this style of coding very appealing. I don’t have a mathematically oriented brain, but at the same time, oddly enough, something about Lisps seems truer to what I intuit the essence of computing to be. If I were formally trained in computational theory, I’m sure I’d have some fancy jargon to name the quality I’m trying to describe. As it stands, I can only say it feels close to the machine.