BNF

As I’m learning lots about static analysis, I’m rediscovering some classic computer science papers, such as the 1977 Turing Award lecture by John Backus (of BNF and FORTRAN fame):

Conventional programming languages are growing ever more enormous, but not stronger. Inherent defects at the most basic level cause them to be both fat and weak: their primitive word-at-a-time style of programming inherited from their common ancestor–the von Neumann computer, their close coupling of semantics to state transitions, their division of programming into a world of expressions and a world of statements, their inability to effectively use powerful combining forms for building new programs from existing ones, and their lack of useful mathematical properties for reasoning about programs.
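Backus’s answer to the “word-at-a-time” style is his FP system, where programs are built by combining whole functions rather than mutating state. His signature example is inner product, defined as a composition of three combining forms. A rough sketch of that idea in Python (the function names mirror Backus’s ApplyToAll, Insert, and Transpose, but the encoding here is my own):

```python
from functools import reduce

def compose(*fs):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), reversed(fs), x)

def apply_to_all(f):
    """Backus's ApplyToAll: map f over a sequence."""
    return lambda seq: [f(x) for x in seq]

def insert(f):
    """Backus's Insert: fold a binary function through a sequence."""
    return lambda seq: reduce(f, seq)

def transpose(pair_of_seqs):
    """Backus's Trans: [[1,2,3],[6,5,4]] -> [(1,6),(2,5),(3,4)]."""
    return list(zip(*pair_of_seqs))

# IP = (Insert +) . (ApplyToAll x) . Trans -- no loops, no assignments.
inner_product = compose(
    insert(lambda a, b: a + b),
    apply_to_all(lambda p: p[0] * p[1]),
    transpose,
)

print(inner_product([[1, 2, 3], [6, 5, 4]]))  # -> 28
```

The point is not brevity but algebra: because each combining form has clean mathematical properties, whole programs like `inner_product` can be reasoned about and rewritten equationally, which is exactly the “useful mathematical properties” Backus says conventional languages lack.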

I can relate to his statement that “programming languages appear to be in trouble”: each new language adds new “features”, yet little changes, and we are still thinking in terms of state machines, which is very low level indeed.
Further down, we learn that:

The models of computing systems that underlie programming languages fall into 3 classes: (a) simple operational models (e.g., Turing machines), (b) applicative models (e.g., the lambda calculus), and (c) von Neumann models (e.g., conventional computers and programming languages). Each class of models has an important difficulty: The programs of class (a) are inscrutable; class (b) models cannot save information from one program to the next; class (c) models have unusable foundations and programs that are conceptually unhelpful.

The main argument is that since traditional languages model the behavior of the hardware, they are bound by its limitations:

Thus variables = storage cells; assignment statements = fetching, storing, and arithmetic; control statements = jump and test instructions.
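That correspondence is easy to see side by side. A small illustration (my own, not from the paper): the first version of a dot product mirrors the machine line by line, the second is a single expression with no cells, no stores, and no jumps:

```python
# von Neumann style: the program is a transcript of the hardware.
def dot_imperative(xs, ys):
    total = 0                        # variable = storage cell
    i = 0                            # another cell, for the "program counter" of the loop
    while i < len(xs):               # control statement = jump and test
        total = total + xs[i] * ys[i]  # assignment = fetch, arithmetic, store
        i = i + 1
    return total

# Applicative style: one expression, state never escapes.
def dot_applicative(xs, ys):
    return sum(x * y for x, y in zip(xs, ys))

print(dot_imperative([1, 2, 3], [6, 5, 4]))   # -> 28
print(dot_applicative([1, 2, 3], [6, 5, 4]))  # -> 28
```

Both compute the same thing, but only the second has the shape of the mathematical definition rather than the shape of the instruction set.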

I always hated functional languages when we studied them (they seemed less useful for interfacing with APIs, which is what most programming these days is about), but maybe I should reconsider.
