Also, being a lurker, I haven't talked about this in depth with anyone, so please take it with a grain of salt and correct me if I'm wrong.
I have been looking at lots of arguments about PL, and I want to point out a common pattern among the wrong ones.
Those arguments might be very true if we take into account the time/place/situation in which they were made.
I will point out why you should still care later on.
They raise a giant red flag every time I see them, and they should do the same for you.
This argument has been used against high-level languages (COBOL), where people thought a machine can't compile stuff because a machine can't think and thus can't program.
Chess AI is also quite a close analogue.
People also use this for 'OO is fundamentally stateful/bad for distributed systems' and 'imperative languages are not good for distributed systems',
while ignoring Simula/Emerald/Verdi/probabilistic Hoare logic.
Also, 0 and negative numbers apparently caused a ton of controversy in their day too.
Do you know what else was called impractical?
Assembly: von Neumann once got mad at his PhD student for wasting CPU cycles on an assembler. Personal computing. The Internet.
And predicting the future is hard.
People love to compare languages to C for speed. But C's features used to be slow: especially algebraic expressions like (a+b)*(c+d) (a 10x slowdown back then), and function calls (see 'Lambda: The Ultimate GOTO').
And the thing is, if people can hand-compile high-level code to assembly by reasoning about it, why can't a machine?
What is slow/quick is dictated only by compiler technology.
Focusing on how quickly your code runs on a single thread, while ignoring the above and what the language enables, is somewhat bikeshedding.
This claim is also ignorant of computer architecture, where assembly itself provides one of the greatest abstractions in CS.
Taken seriously, the claim suggests we should ditch algebraic expressions in C.
And in hardware, mutation is expressed as flip-flops (read: recursion!), so if anything this suggests we should use recursion, and model assignment with it.
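To make that last point concrete, here is a toy sketch (my own example, not from any of the work mentioned above) of modeling an assignment loop with recursion: each "mutable" variable becomes an argument of a tail call.

```haskell
-- Toy example: the imperative loop
--   i = 0; acc = 0;
--   while (i < n) { i = i + 1; acc = acc + i; }
--   return acc;
-- modeled with recursion: every assignment becomes the argument of the
-- next recursive call (the next clock tick, if you like).
sumTo :: Int -> Int
sumTo n = go 0 0
  where
    go i acc
      | i < n     = go (i + 1) (acc + i + 1)  -- i := i + 1; acc := acc + i
      | otherwise = acc

-- sumTo 5 == 15
```

This is exactly the transformation a compiler performs in the other direction: tail calls compile back down to jumps and register updates.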
The extreme example: back when people didn't have optimizing compilers for lots of stuff that is common today, all but a few high-level languages (Fortran among them) were slow.
So people hard-wired a fucking ALGOL interpreter (and interpreters for other languages) into the CPU. It couldn't take machine code, couldn't take assembly. It only took human-readable ALGOL.
By all means this is praise. Structured programming/ALGOL/Lisp/Simula/Pascal/Logo/Scheme (and, if those are not practical enough, C++ and BASIC) came from researchers,
while 'real programmers' used FORTRAN with goto.
Sure, people are doing theorem proving and category theory and program calculation in Haskell, but Reynolds and Dijkstra also did all of that in ALGOL.
Another thing: every reason given for math-phobia seems to apply just as well to a large Java codebase (or one in any other widely used language). But Java people never say their codebase is too mathy.
Some words in programming are very ill-defined, having different meanings that contradict each other.
And people use them in a Humpty Dumpty sense: "When I use a word, it means just what I choose it to mean, neither more nor less."
For those who haven't read McCarthy's paper: Lisp of that era didn't have closures. It also didn't have macros. Let that sink in.
Meanings range along the lines of "completely no effects, not even the State monad, do everything by program calculation" all the way to "IO!".
And if you have effects via State/ST, you need Hoare logic/separation logic one way or the other (Dijkstra monads).
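For the record, "effects via State" need not involve any real mutation at all. Here is a minimal hand-rolled sketch of the State monad (the real one lives in mtl's `Control.Monad.State`; this toy version is just to show the idea): the "effect" is nothing but pure threading of a function `s -> (a, s)`.

```haskell
-- A tiny State monad, written from scratch: a stateful computation is a
-- pure function from an initial state to (result, final state).
newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f (State g) = State $ \s -> let (a, s') = g s in (f a, s')

instance Applicative (State s) where
  pure a = State $ \s -> (a, s)
  State f <*> State g = State $ \s ->
    let (h, s')  = f s
        (a, s'') = g s'
    in (h a, s'')

instance Monad (State s) where
  State g >>= k = State $ \s -> let (a, s') = g s in runState (k a) s'

get :: State s s
get = State $ \s -> (s, s)

put :: s -> State s ()
put s = State $ \_ -> ((), s)

-- "n := n + 1", returning the old value -- no mutation anywhere.
tick :: State Int Int
tick = do
  n <- get
  put (n + 1)
  pure n
```

Running `runState (tick >> tick) 0` yields `(1, 2)`: the second tick's result and the final state, all computed by ordinary function composition.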
The word "Lisp" also should not exist. IMHO it is as absurd as calling every language with curly braces "C*".
And I don't know whether I should watch with awe, with horror, or with amusement as people try to reason about the inherent properties of "Lisp".
I guess the downside of not using the word is that we'd stop fooling ourselves with pretty words and would sound so boring.
Typing Dynamic Typing! Well-Typed Programs Can't Be Blamed! Lazy Functional State Threads! Extensible Effects! A Correspondence between ALGOL 60 and Church's Lambda-Notation! The Essence of ALGOL!
If-expressions went from Lisp in 1959 (not yet published) into ALGOL 60 (McCarthy was one of the big influences on ALGOL).
ALGOL had lexical scoping, which later got into Scheme.
Scheme was inspired by the Actor model.
Smalltalk was inspired by Lisp's fexprs, and also by Logo, which is sort of a Lisp dialect.
Denotational semantics is used on imperative languages a lot.
Subtyping mostly uses lambda calculus as its base model, and classes are modeled as fixpoints.
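"Classes as fixpoints" can be sketched in a few lines. This is my own toy rendition of the standard objects-as-records encoding (in the spirit of Cook's work on inheritance): a class is a function from self to a record of methods, and instantiation ties the knot with `fix`.

```haskell
import Data.Function (fix)

-- An object is a record of methods; a "class" is a function from self
-- to that record (open recursion). Instantiation takes the fixpoint.
data Counter = Counter { value :: Int, inc :: Counter }

counterClass :: (Int -> Counter) -> (Int -> Counter)
counterClass self n = Counter { value = n, inc = self (n + 1) }

newCounter :: Int -> Counter
newCounter = fix counterClass

-- "Inheritance" overrides a method before the knot is tied, so the
-- override is visible through every self-reference.
doubleClass :: (Int -> Counter) -> (Int -> Counter)
doubleClass self n = (counterClass self n) { inc = self (n + 2) }

newDouble :: Int -> Counter
newDouble = fix doubleClass
```

So `value (inc (newCounter 0))` is 1 while `value (inc (newDouble 0))` is 2, and the only machinery involved is lambda and a fixpoint.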
John Backus built Fortran and wrote "Can Programming Be Liberated from the von Neumann Style?".
All of the above is mixing 'OOP'/'FP'/'imperative', and I shouldn't even use the word "mixing", because mixing means some boundary is crossed, but really the boundary was artificial all along!
So these people don't really fit into an 'OOP' camp, an 'FP' camp, or an 'imperative' camp.
Beware of big words without further justification. They are the equivalent of "the proof is trivial", only with a higher chance of being wrong.
Beware of Humpty Dumpty.
Beware of Polarization.
"If I restrict myself to the current state of things, why can't I still make these claims?"
You can. All I am saying is that when we make such claims, we should be certain whether we are arguing about the current state or about the inherent nature.
Or we will end up fooling others.
It takes away our intellectual modesty and gives false comfort.
So (if you allow me to make a wild generalization which I can't back up) you will have less chance to break ground, or to follow the groundbreakers and be an early adopter.
Even if you don't do PL, this will probably affect you.
Watching out for inherent/natural arguments and polarization has become my second 'nature'.
Perhaps it is also why some other big names in PL contribute to other fields.
(And yes, I am suggesting Dijkstra was intellectually modest. I mean toward the research subject, not toward other people.)