Unconvincing article. PLT is a theory subject, as it says right in its name. If you're a practitioner and don't get off on theory, you're not going to be designing bleeding edge languages or compilers anyway, so you can stay pretty hip by just using the theory-adjacent stuff that comes out of the research places (example: Haskell). Just like if you're an electrical engineer, you might want to keep up with new kinds of semiconductors, but you don't need to study bleeding edge physics.
If you want to get started on PLT, Harper's PFPL is pretty accessible (https://www.cs.cmu.edu/~rwh/pfpl/). Even Martin-Löf's article on intuitionistic type theory (the one that introduced dependent types) is fairly readable for a PL geek.
I'm unfamiliar with Barendregt's article, but it sounds more mathematical by comparison; i.e. going by the title, I'd classify it as mathematical logic rather than PL. Remember, there were no computers when Alonzo Church invented the lambda calculus in the 1920s.
Yes, Harper's book that I linked above is good for that, imho. It looks like the PDF that is there now is a subset of the chapters of the 2nd edition. In an older version of the page, there was a complete pdf of the 1st edition and that's the one that I used. You might be able to find it in Wayback if you go back to 2015 or so.
Quoting Pierce’s book Types and Programming Languages (which I’m reading now): “Barendregt’s article (1992) is for the more mathematically inclined.” That’s in a book designed for a graduate course.
Programming language theory has been successfully taken over by functional programming, while programming languages in practice, those used by people, are largely not functional languages.
This means that we users, and for me specifically compiler developers, rarely get to use the fruits of such theory and research, making tools a bit janky in practice.
I'm not convinced "takeover" is a fair description. It's true that functional languages are over-represented in the more theory-laden subfields, but this is in no small part because functional languages are easier to reason about. 'Reason' here is not intended in any prescriptive sense, just in the sense that they are more mathematically circumscribed. Proving anything about an imperative language is going to mean proving it in the purely functional fragment of that language first and then drawing the rest of the owl, so when someone wants to try something new they're probably going to try it in mice first.
Programming languages in practice borrow heavily from functional programming. Polymorphic types, closures, generators, immutability, various monadic things, etc. all came from there.
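The borrowing is easy to see in any mainstream language. A small Python sketch (the names are mine, purely illustrative) of closures, generators, immutability, and polymorphic (generic) types:

```python
from dataclasses import dataclass
from typing import Iterator, TypeVar

T = TypeVar("T")

def identity(x: T) -> T:
    """Polymorphic (generic) type: works uniformly for any T."""
    return x

def make_counter():
    """Closure: bump() captures `count` from its defining scope."""
    count = 0
    def bump() -> int:
        nonlocal count
        count += 1
        return count
    return bump

def squares(n: int) -> Iterator[int]:
    """Generator: a lazily produced sequence of values."""
    for i in range(n):
        yield i * i

@dataclass(frozen=True)
class Point:
    """Immutability: a frozen dataclass behaves like an FP record."""
    x: float
    y: float

counter = make_counter()
print(counter(), counter())   # 1 2
print(list(squares(4)))       # [0, 1, 4, 9]
```

None of this required the host language to be functional; the features migrated over piecemeal.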
"Standing on the toes of giants" is a great phrase.
Unfortunately a lot of CS theory really is of limited application. The stuff that is applicable then takes a very long time to make it into implementations, and then the implementation has to climb the adoption curve, facing entrenched users who don't like change at every step of the way. See the C++ -> Rust transition, for example.
Personally I like the way that C# has gradually introduced functional-flavor material into an OO imperative language.
A pet peeve of mine is theory-oriented texts that present code in a specific programming language, but replace the actual syntax with mathematical symbols. This is unhelpful for practitioners.
Another peeve is when authors present algorithm pseudo-code in terms of mathematical symbols, where it isn't at all obvious how to compute the value of the symbols in practice. You've shown that there is an algorithm, but not in a useful way.
I mean, I see the value of mathematical notation; I have a math degree. But still....
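To make the complaint concrete: a definition like x* = argmin_{x in S} f(x) tells you *what* the value is, but not how to obtain it. A hypothetical sketch of the missing computational reading, in Python:

```python
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")

def argmin(candidates: Iterable[T], f: Callable[[T], float]) -> Optional[T]:
    """Compute argmin_{x in S} f(x) by explicit enumeration --
    exactly the step that symbol-heavy pseudo-code leaves implicit."""
    best: Optional[T] = None
    best_val = float("inf")
    for x in candidates:
        v = f(x)
        if v < best_val:
            best, best_val = x, v
    return best

print(argmin([3, -1, 4, -2], lambda x: x * x))  # -1
```

The math notation quietly assumes S is enumerable and finite; the code has to say so.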
Here's how to justify it:
If you want to talk about something (a program), you want the right language. We've got powerful languages capable of everything any computer can do, but new ideas often call for new ways to express them.
Building a language is about accepting massive up-front investment to build something that hopefully meaningfully improves user experience. Or you can build to learn about the inherent beauty of computation and formal languages, but I find that a little less compelling.
I think there could be practical applications. The type systems of most commonly used languages are not very sophisticated.
The problem is the focus on lambda calculus, imo. I get that it's nice for mathematics and proofs, but for (almost all) practical applications it's useless.
Why don't they make it less "hard" by allowing the use of a good proof-oriented programming language, with importable packages of established proofs and techniques, instead of having to rewrite each proof from scratch?
The research community has failed the "ordinary" interested programmer wanting to learn PLT by not producing an overview/top-down book which brings together the various strands of needed knowledge into a coherent whole. Pierce/Harper/etc's books are simply overwhelming for a beginner without the requisite background knowledge.
PLT approaches the idea of computation from a formal-system (i.e. language) point of view, and that is what needs to be communicated, with the needed logical/mathematical knowledge inline. I have found the following useful in my study of PLT:
However, giving people with a software engineering background an easier on-ramp for understanding PLT would be nice, wouldn't it?
Pre-dates usable functional programming languages.
> immutability
Good point. Even if it doesn't have much to do much with functional programming, it did originate from FP languages.
> closures, generators, monads
These are really concepts needed to make up for the limitations of a pure functional layer of abstraction in an imperative world.
Yet they have been found to be much more widely useful. Closures as callbacks, generators as the basis for coroutines, and monads for error handling, not to mention things like C#’s Linq.
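A rough Python sketch of two of these (function names are mine, not from any library): a generator driven as a coroutine via send(), and a Maybe-style bind that short-circuits on failure, loosely the pattern behind monadic error handling:

```python
def accumulator():
    """Generator used as a coroutine: send() resumes it with a value."""
    total = 0
    while True:
        total += yield total

acc = accumulator()
next(acc)            # prime the coroutine
print(acc.send(5))   # 5
print(acc.send(3))   # 8

def bind(value, f):
    """Maybe-style bind: None propagates, skipping the rest of the chain."""
    return None if value is None else f(value)

def parse_int(s: str):
    try:
        return int(s)
    except ValueError:
        return None   # failure, not an exception

def reciprocal(n):
    return None if n == 0 else 1 / n

print(bind(bind("4", parse_int), reciprocal))     # 0.25
print(bind(bind("zero", parse_int), reciprocal))  # None
```

The same shape shows up in C#'s null-conditional chaining and LINQ's SelectMany, just with different spelling.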
Who's preventing this now? Take a look at things like Software Foundations and other works which are built on automated or interactive proof systems.
https://softwarefoundations.cis.upenn.edu/
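Software Foundations itself is built on Coq, but for flavor, here is roughly what reusing established proofs (rather than rewriting them from scratch) looks like in a system like Lean 4, leaning on library lemmas:

```lean
-- Reusing library lemmas instead of re-proving arithmetic from scratch:
example (n : Nat) : n + 0 = n := Nat.add_zero n
example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```

The "importable packages of established proofs" already exist in this world; Mathlib for Lean and Coq's ecosystem are exactly that.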
Formal Syntax and Semantics of Programming Languages: A Laboratory Based Approach by Ken Slonneger and Barry Kurtz - https://homepage.divms.uiowa.edu/~slonnegr/ and https://homepage.divms.uiowa.edu/~slonnegr/plf/Book/
Understanding Formal Methods by Jean-Francois Monin.