As part of our book club at Amyris Software Engineering, I recently watched the classic Structure and Interpretation of Computer Programs (SICP) lectures by Hal Abelson and Gerald Jay Sussman. These lectures were recorded in 1986 in front of a group of employees at HP.
It’s fascinating to watch these videos now in 2021 with the benefit of the 35 years that have passed since they were recorded. It feels like I am armed with a time machine, able to “jump ahead” and find out how things turned out: what software paradigms took root and how the field of computer science evolved.
One thing that was particularly salient for me is that even though I consider myself a “classically trained” computer scientist (I have a bachelor’s and a master’s degree in CS), I’m a bit embarrassed to admit that, until watching these lectures, I didn’t have a ready definition of the point of computer science beyond making computers do what you want. Over the years computer science has expanded as a field into many sub-disciplines. For me, the definition shared in the first ten minutes of SICP encompasses them all and gets to the heart of our discipline:
Computer science is the study of how to think and reason precisely about “how to” knowledge.
Imperative or “how to” knowledge captures the steps required to accomplish some task. It is distinct from declarative knowledge, which defines facts and allows us to reason about what is true or false. The first lecture of the course gives a straightforward example of declarative knowledge from mathematics: the definition of the square root. \[\begin{aligned} \sqrt{X} \text{ is the } Y \text{ such that} \\ Y^2 = X \text{ and } Y \geq 0 \end{aligned}\]
This is a great definition, and it is certainly possible to prove precisely that the statements above are true. However, it doesn’t tell us how we would go about finding the value of Y for a particular X. How does one actually find the square root of a number? For that you need imperative knowledge, in particular an algorithm!
You can watch the lecture for Abelson’s rendition of the required imperative knowledge, but if we jump forward to 2021 and some simple Python, you could use something like this:

```python
def sqrt(num):
    """Determine the square root of the given number `num`.

    This method implements the `Babylonian Method` for finding square roots, a
    piece of imperative knowledge invented around 1500 BC.

    This is mentioned in the SICP lectures, but I found the description here
    helpful: https://blogs.sas.com/content/iml/2016/05/16/babylonian-square-roots.html
    """
    # Start with a rough guess and repeatedly average it with num / guess,
    # stopping once the square of the guess is close enough to num.
    guess = num / 2
    while True:
        guess = (guess + (num / guess)) / 2
        if abs((guess * guess) - num) < 0.0001:
            return guess

sqrt(12)
#=> 3.4641016200294548
```
Or if you are a prudent programmer in 2021, and as such perfectly willing to build on top of abstractions, you can use Python’s standard math library and run:

```python
import math

math.sqrt(12)
#=> 3.4641016151377544
```
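The declarative definition still earns its keep here: it gives us a way to check whatever answer an imperative process hands back. A minimal sketch of that check, reusing the tolerance from the hand-rolled version above:

```python
import math

# Verify the answer against the declarative definition: Y^2 = X and Y >= 0.
x = 12
y = math.sqrt(x)
assert y >= 0
assert abs(y * y - x) < 0.0001
```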
Computer algorithms, as we all know, are a really powerful class of imperative knowledge, and it makes sense that writing them occupies much of our time as software engineers. That said, algorithms as a class of knowledge predate computers by a wide margin. In the case of the Babylonian Method for determining square roots, the steps needed to compute them by hand were worked out around 1500 BC.
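To make the “by hand” part concrete, here is roughly how the iteration plays out for the square root of 12, starting from a guess of 6 (the same steps the Python function above performs): \[\begin{aligned} Y_0 &= 6 \\ Y_1 &= \tfrac{1}{2}\left(6 + \tfrac{12}{6}\right) = 4 \\ Y_2 &= \tfrac{1}{2}\left(4 + \tfrac{12}{4}\right) = 3.5 \\ Y_3 &= \tfrac{1}{2}\left(3.5 + \tfrac{12}{3.5}\right) \approx 3.4643 \end{aligned}\]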
Embedded in this fact is a realization which truly blows my mind: if computer science is about working with and reasoning about imperative knowledge, and imperative knowledge has been around since the dawn of human history, then computer science is really only tangentially about these things we call computers today! Abelson says as much in the first few minutes of the lecture. He says, “In the same way that physics is not about particle accelerators or biology is not really about microscopes, computer science is not about computers”. Particle accelerators and microscopes are tools used to elucidate facts, or “declarations,” about the world. Similarly, computers are a tool for working with imperative knowledge.
If I were to hazard a guess, pen and paper are probably the most prolific tools for working with imperative knowledge, and von Neumann machines, implemented first with vacuum tubes and later silicon, are probably the most recognizable. Clearly they are not the only examples, nor are we done inventing new ones.
It is with this perspective that I am so excited to be working in synthetic biology. There is a lot of hype around “programming cells,” most of which has yet to be realized. That said, our genomes clearly encode a lot of imperative knowledge: they do, after all, provide the instructions for taking a single fertilized egg and turning it into a complete human being. Even unicellular organisms like bacteria or yeast encode instructions for a dazzling number of tasks. Decoding the imperative knowledge within them, as well as modifying it for our own aims, will require many new tools, built both with von Neumann computers and by manipulating physical material in the wet lab.
And with that, if you find yourself with a few minutes I highly encourage you to watch at least the first ten minutes of the first lecture of SICP: https://www.youtube.com/watch?v=2Op3QLzMgSY. Once Abelson and Sussman open your eyes, you will begin to see the echoes of our poorly named discipline everywhere!
If you are interested in this line of thinking, and with the SICP perspective in hand, I really enjoyed the book Turing’s Cathedral (nice summary blog post here:). Dyson provides an interesting historical account of the happenings at the Institute for Advanced Study at Princeton, though it is also a bit of a slog. If you like science fiction, Neal Stephenson’s Cryptonomicon is fictional but totally entertaining; one of the sub-plots follows the development of computers in the context of the race to break Nazi codes by implementing a digital computer on something approximating a pipe organ.