Here are the annotated slides from my W.P.E. presentation. Any and all comments are welcome at
The obvious – and usual – remarks about copyright (and copyleft/copywrong) are expected to be honored.
1. This was a real filing cabinet – without the palm tree – but that’s
another story. The point is that it describes the structure of this talk
… the “beginning, middle, end,” if you will … while hitting some
points that did not get into the two-page abstract.
2. First, the obligatory introduction. The slide pretty much says it.
I’m not a philosopher; not an engineer; and not a historian. Just a
wanderer: from engineering to math to logic, and finally to C.S.
But not just any C.S.
It was McCarthy’s version, and at Stanford, with its very strong
program in logic and foundations of mathematics.
3. Now the talk. It’s my decade-long voyage through intellectual
and mathematical history and philosophy, with the goal of
unwinding the development of modern engineering and engineering
education. The overarching goal is to make the case that software is
destined to become a genuine engineering discipline.
History and Mystery are interlinked. They deal with mathematical
physics, its entanglement with several Western cultures, and
how/why different societies dealt with the transformation of the new
mathematically based results into engineering.
The Ballast section is the heart of the
matter. It’s an answer to a question I had: Can we bring rationality
and discipline to software development; something similar to the
structure developed over some 300 years of traditional engineering?
My answer is yes – but only if we want to.
4. There’s a difference between training and education – just as
there’s a difference between construction work and engineering –
and the following McCarthy quote expresses the challenge for
software. There are equally apt quotes from Christopher Strachey …
or Peter Landin … or Tony Hoare.
5. I was going for a “Glorious Revolution” picture since the
Glorious Revolution was an entry point to the Age of
Enlightenment, but Compleat Revolution was the best I had.
But first, some mathematical history.
6. The critical event that supports modern science and engineering
happened in the late 16th century: the introduction of symbolic
algebra by Francois Viete. There were some interesting – even
mysterious – interchanges between Viete in France and Harriot in
England, but Viete began the adventure.
Rather than solving specific equations, Viete’s use of symbolic
parameters allowed the statement of general solutions and required
the development of symbolic algebra. You can reduce (2+3)-2 by
arithmetic, but reducing (a+b)-a requires symbolic algebra,
including the notion of valid symbolic transformations.
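To make the distinction concrete, here is a minimal sketch (a hand-rolled tuple representation, not a real computer-algebra system): arithmetic can only consume numbers, while the symbolic rule (x+y)-x -> y is about the form of the expression and holds for any values of a and b.

```python
# Minimal sketch of arithmetic versus symbolic reduction.
# Expressions are nested tuples ('+', x, y) / ('-', x, y);
# leaves are numbers or variable names (strings).

def evaluate(e):
    """Arithmetic: defined only when every leaf is a number."""
    if isinstance(e, (int, float)):
        return e
    op, x, y = e
    return evaluate(x) + evaluate(y) if op == '+' else evaluate(x) - evaluate(y)

def reduce_symbolic(e):
    """One valid symbolic transformation: (x + y) - x  ->  y."""
    if isinstance(e, tuple) and e[0] == '-':
        minuend, subtrahend = e[1], e[2]
        if isinstance(minuend, tuple) and minuend[0] == '+' and minuend[1] == subtrahend:
            return minuend[2]
    return e                                         # no rule applies

print(evaluate(('-', ('+', 2, 3), 2)))               # prints 3
print(reduce_symbolic(('-', ('+', 'a', 'b'), 'a')))  # prints b
```

The arithmetic reduction consumes the expression and yields a number; the symbolic reduction yields another expression, justified by a rule about form alone.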
7. Just for comparison, pre-Viete algebras – besides being equation-
specific – were notationally clumsy. Think Roman versus Arabic
numerals.
8. But the really powerful idea that was unleashed by Viete is
described here in its original form. He understood exactly what he
was doing.
9. And now computational simulation also fits the diagram.
10. Looking back 400 years, the structure of his innovation is what
we now take for granted in all of our mathematical modeling.
11. But there’s something lurking – and usually unsaid – in this
diagram: that the relationship between the subject matter and the
model is somehow faithful. The most explicit statement of this
relationship is due to Kurt Godel. This representation relationship is
more widely applicable than the specifics of his Incompleteness
results.
The relationship says that you can’t just put some bullshit down on
paper and say “it works!” That’s called programming. Or, in
mathematics, a conjecture. And Godel did program. Godel numbers
are an example of concrete data structures. But he did more.
Here what’s required is some justification – some range of
applicability – for your assertion that the representation is somehow
faithful. That’s his representation theorem: an explicit
demonstration that a representation fulfills its intended purpose.
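As a concrete illustration of Godel numbers as data structures, here is the standard prime-power encoding of finite sequences (illustrative, and not Godel’s exact assignment): unique factorization is what makes the representation faithful, since the original sequence can be recovered exactly from the single number.

```python
# Godel-style prime-power encoding of a finite sequence (illustrative;
# not Godel's exact scheme): (a1, ..., an) |-> 2^a1 * 3^a2 * 5^a3 * ...
# Unique factorization makes the representation faithful: the sequence
# is recoverable exactly from the number.

def primes(n):
    """First n primes, by trial division (fine at this scale)."""
    ps, candidate = [], 2
    while len(ps) < n:
        if all(candidate % p for p in ps):
            ps.append(candidate)
        candidate += 1
    return ps

def encode(seq):
    number = 1
    for p, a in zip(primes(len(seq)), seq):
        number *= p ** a
    return number

def decode(number, length):
    seq = []
    for p in primes(length):
        exponent = 0
        while number % p == 0:
            number //= p
            exponent += 1
        seq.append(exponent)
    return seq

n = encode([3, 1, 2])      # 2**3 * 3**1 * 5**2 = 600
print(n, decode(n, 3))     # prints: 600 [3, 1, 2]
```

The round trip encode-then-decode is exactly the kind of explicit demonstration that the representation fulfills its purpose.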
12. And here’s John McCarthy again.
Original Lisp programs were written as Meta-expressions. The
language’s domain was Symbolic-expressions – or S-exprs. Steve
Russell noted that McCarthy’s Godel-like work in representing
M-exprs as S-exprs resulted in a notation that – while “weird” – was
human-readable; something that Godel numbers were not. So
McCarthy “numbers” became a programming language.
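A sketch of the M-expr-to-S-expr move (the list representation and evaluator below are my own stand-ins, not McCarthy’s code): the M-expression car[cons[A; B]] becomes the S-expression (car (cons A B)), here modeled as nested Python lists, which an evaluator can consume directly.

```python
# Sketch: the M-expression  car[cons[A; B]]  as the S-expression
# (car (cons A B)), modeled as nested Python lists. The evaluator is a
# stand-in (a Lisp dotted pair is flattened into a list for simplicity).

def s_eval(expr):
    if not isinstance(expr, list):       # atoms evaluate to themselves
        return expr
    op, *args = expr
    vals = [s_eval(a) for a in args]
    if op == 'cons':
        rest = vals[1] if isinstance(vals[1], list) else [vals[1]]
        return [vals[0]] + rest
    if op == 'car':
        return vals[0][0]
    if op == 'cdr':
        return vals[0][1:]
    raise ValueError(f'unknown operator {op!r}')

print(s_eval(['car', ['cons', 'A', 'B']]))   # prints A
print(s_eval(['cdr', ['cons', 'A', 'B']]))   # prints ['B']
```

The point is that the program’s representation is itself readable data – which is what made the “weird” notation usable where Godel numbers were not.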
13. A little cultural history.
14. Nothing occurs in a vacuum. The mathematical and scientific
innovation beginning in the late 17th century occurred in and around
the Enlightenment. And actually there were several varieties of the
big “E” – Radical, Moderate, and (of course) Anti-E. (The Eels) The
national flavor of E greatly influenced how scientific ideas were
received.
More specifically, the Radical E of France supported the
experimentation with scientific engineering … of course it also led
to the French Revolution, but that’s another story.
The English, Scotch, and American versions were more “Moderate”
– bordering on “Anti” – and this spilled over into their attitude
toward theory-versus-practice and, by extension, their attitude
toward new ideas in engineering. Hacking versus thinking. Practice
versus theory.
The defining difference was how Descartes was interpreted. It
turned on whether one questioned everything, or ended the
questioning when it came to “altar and/or throne,” giving the status
quo a pass. Btw: in the 17th century Descartes kept his head by
saying that rational thought let us see how god’s mind worked. But
by the 18th century the cat was out of the bag, and the radicals
asked the obvious rational question: “why god?”
15. Some quick specifics. I like mysteries. How did Newtonian
mathematical physics end up in France’s Academies? Why France
and not England?
16. Quickly. Engineering developed in scope and geographical
reach. Science-based engineering then progressed from France to
Germany, to England, and finally the U.S. It’s interesting that the
attitude toward E ideas follows a similar pattern in terms of
Radicalism. Recall Kant’s 1784 newspaper article What is
Enlightenment? It was radically tinted.
17. Though Descriptive Geometry was a “killer app,” it wasn’t deep
theory. What situation “proved” theory’s worth? What put calculus
on the engineering calendar? The “forcing event” happened with the
transatlantic cable. One simply cannot rely on seat-of-the-pants
practice to discover cable breaks under 2 miles of water. Game; set;
match.
18. Of course engineers get educated. And the style of education
directly shapes the style of engineering practice – theory-driven in
France; practice-based elsewhere.
19. Monte Calvert expressed the division very aptly as a cultural
issue --“shop versus school.”
Shop-culture versus school-culture is another name for practice-
versus theory-driven. Or in the early days, Moderate versus Radical.
Clearly traditional engineering education is school-based. And
software engineering is shop-based. I think software engineering
needs to change radically. And soon.
20. Here’s current software engineering.
20. To make the case for a new approach we need a “forcing event” –
something like what the transatlantic cable did for electrical
engineering, but for software. Something that “the practical man”
cannot do.
A prime example involves the insecurity of software and a potential
solution: specify expectations, and then require verifiable
justification from those who claim to meet those expectations.
Current practice cannot address this issue – and has no hope of
doing so. We believe that Proof-Carrying Code offers hope.
22. Here’s a short description of the technique.
So how do we get there? How do we supply the mechanisms? There
are theory-based notions, of course. They’re the Ballast.
24. The Ballast is based on some implicit – sometimes explicit –
assumptions of engineering. Namely, that the properties of a
construct are related to the properties of its components.
25. The ideas can be expressed as inference rules.
26. But now we’re face-to-face again with Viete and symbolic
notation and how to interpret its content.
27. That manipulation can appear in many forms … symbolic
reduction, computation … whatever. But regardless, the critical
feature is property preservation.
28. The issue of denotation versus sense is both logical and
philosophical. Since I’m not a philosopher, I’ll go straight to the
logical. And we’ve all seen a Denotational Logic in one form or
another.
29. Since Sensational Logic is not so well known, here’s a bit of its
history.
Brouwer was a Dutch mathematician.
Heyting was his student.
Kolmogorov worked independently in Russia.
Curry was an American logician.
Kreisel, Scott, and Howard were at Stanford in the ’60s – I ran into
these ideas from K.
Martin-Lof, as philosopher, mathematician, and computer scientist,
brought the ideas out of the logical realm and began to import them
into computer science as “type theory.”
30. Classical truth is illustrated by truth tables and Tarski’s notion
of truth for predicate calculus (better: cylindric algebras, or Halmos’
algebraic logics). Not really interesting for our purposes.
31. Here’s the Real Ballast: the BHK interpretation.
To the Intuitionist, a declaration of truth without its justification is
vacuous. This attitude is reminiscent of Descartes and the Radical
Enlightenment: accept nothing without justification.
But we need to answer: “Is this semantics compositional?”
32. Indeed! There’s a simple translation of Intuitionistic truth into
something we can apply. And its semantics is compositional!
33. Here are some Natural Deduction rules demonstrating the two
logics. The Sensational rules are the ones of interest.
Note modus ponens.
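The Sensational rules can be sketched with programs standing in for proofs – an informal illustration of the propositions-as-types reading, not a formal development: a proof of A -> B is a function from proofs of A to proofs of B, a proof of A-and-B is a pair, and modus ponens is function application.

```python
# BHK sketch with programs as proofs: a proof of A -> B is a function,
# a proof of A-and-B is a pair, and modus ponens is application.

from typing import Callable, Tuple, TypeVar

A = TypeVar('A')
B = TypeVar('B')

def modus_ponens(f: Callable[[A], B], a: A) -> B:
    """From a proof of A -> B and a proof of A, a proof of B."""
    return f(a)

def swap(p: Tuple[A, B]) -> Tuple[B, A]:
    """A program that *is* a proof of (A and B) -> (B and A)."""
    return (p[1], p[0])

print(modus_ponens(swap, ('proof-of-A', 'proof-of-B')))
# prints: ('proof-of-B', 'proof-of-A')
```

Under this reading, checking an application of modus ponens is just type-checking a function application – which is why the semantics is compositional.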
34. Now it’s a simple step from constructive truth to programming
languages that now contain some basic assertional mechanisms that
we can exploit to create large-scale applications that fit the PCC
approach.
ML is Scheme with strong types.
35. As always, the difficulties are in the details. We need heavy-duty
tools, not just theory. It took decades before the fundamentals of
Newton’s mathematical physics became realistic engineering tools.
But here help is on the way.
The natural extensions of simple constructive type theory allow us
to express conditions like “buffer overflow cannot occur.”
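Python has no dependent types, so the following is only an evidence-passing sketch of the idea (all names are hypothetical): the only way to obtain an index usable by safe_get is a constructor that actually performs the bounds check, so the access itself cannot overflow.

```python
# Evidence-passing sketch of "buffer overflow cannot occur" (all names
# hypothetical; Python has no dependent types). The only way to build
# an InBounds token is the constructor that performs the check, so
# safe_get itself needs no bounds check.

from dataclasses import dataclass

@dataclass(frozen=True)
class InBounds:
    index: int          # invariant: 0 <= index < bound
    bound: int

def prove_in_bounds(i: int, buf: list) -> InBounds:
    """Sole constructor of the evidence; it performs the check."""
    if not 0 <= i < len(buf):
        raise ValueError(f'index {i} out of bounds [0, {len(buf)})')
    return InBounds(i, len(buf))

def safe_get(buf: list, ev: InBounds):
    """Consumes evidence instead of re-checking the bound."""
    assert ev.bound == len(buf)          # evidence is about this buffer
    return buf[ev.index]

buf = [10, 20, 30]
print(safe_get(buf, prove_in_bounds(1, buf)))   # prints 20
```

In a real dependently typed setting the evidence is checked statically and erased; here it survives at run time, which is exactly the gap the heavy-duty tools are meant to close.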
36. And we have a Representation Theorem, not just a “numbering.”
Though it can’t do the hard part – handle the full specification
problem – it does guarantee simple syntactic coherence.
This says that the program is “well-typed”: the program’s collection
of type assertions is consistent. For example, we don’t ask that x be
an integer (x:int) and also a Boolean (x:bool).
The hard part requires the parasite to
supply a proof; the host has it easier since proof checking is a
simpler task. But that is as it should be.
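The consistency check behind “well-typed” can be sketched minimally (a toy check over explicit assertions, not a real type system): gather the type asserted for each variable and reject a program that asserts two different types for the same variable.

```python
# Toy version of the consistency check behind "well-typed": collect the
# type asserted for each variable and reject a program that asserts two
# different types for the same variable (e.g. x:int and x:bool).

def well_typed(assertions):
    """assertions: list of (variable, type) pairs drawn from a program."""
    seen = {}
    for var, ty in assertions:
        if var in seen and seen[var] != ty:
            return False       # inconsistent: two types for one variable
        seen[var] = ty
    return True

print(well_typed([('x', 'int'), ('y', 'bool'), ('x', 'int')]))  # True
print(well_typed([('x', 'int'), ('x', 'bool')]))                # False
```

Note the asymmetry: this check is cheap and mechanical, which is the host’s side of the Proof-Carrying Code bargain.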
37. And back to Godel numbers. We have morphed proofs and
propositions into programs and security assertions.
38. Finally: the point of this exercise is to allow the back-and-forth
to occur without executing the code. And that goes back to a
property required of the language: the property we mentioned
earlier, now given the name Subject Reduction.
Subject Reduction allows us to check assertions about dynamic
properties without running the program. This is critical for
something like Proof-carrying Code.
Subject Reduction says that throughout the reduction process the
type of the expression is unchanged. And in the limit, the value’s
type is the same as that of the original expression. So, for example,
if we show that the original program does not violate array-bounds,
then throughout its execution the code is also safe, and no run-time
bounds checks need to be included.
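A toy illustration of Subject Reduction, using a tiny expression language of my own devising (integers, ‘+’, and ‘if’ over booleans): type the original expression once, then single-step it to a value and observe that the type never changes – so a property established before execution persists through it.

```python
# Toy Subject Reduction: type once, then reduce stepwise; the type
# assigned to the expression is invariant under reduction.

def type_of(e):
    if isinstance(e, bool):               # check bool before int:
        return 'bool'                     # in Python, bool is a subtype of int
    if isinstance(e, int):
        return 'int'
    if e[0] == '+':
        assert type_of(e[1]) == type_of(e[2]) == 'int'
        return 'int'
    if e[0] == 'if':
        assert type_of(e[1]) == 'bool'
        t = type_of(e[2])
        assert type_of(e[3]) == t         # both branches agree
        return t

def is_value(e):
    return isinstance(e, (bool, int))

def step(e):
    """One leftmost reduction step on a non-value."""
    if e[0] == '+':
        if not is_value(e[1]):
            return ('+', step(e[1]), e[2])
        if not is_value(e[2]):
            return ('+', e[1], step(e[2]))
        return e[1] + e[2]
    if e[0] == 'if':
        if is_value(e[1]):
            return e[2] if e[1] else e[3]
        return ('if', step(e[1]), e[2], e[3])

e = ('if', True, ('+', 1, 2), 0)
t = type_of(e)                            # 'int', established once
while not is_value(e):
    e = step(e)
    assert type_of(e) == t                # preserved at every step
print(e, t)                               # prints: 3 int
```

The assert inside the loop is the theorem in miniature: the static judgment made before execution remains true at every intermediate state, which is what lets the run-time check be dropped.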