I am a bit down these days, and there's some Twitter argument about PL stuff, so I should write a rant. It's somewhat long.
Also, being a lurker, I haven't talked about this in depth with anyone, so please take it with a grain of salt and correct me if I'm wrong.
I am not taking any particular claim, or any particular argument for that claim, and saying it is wrong.
I am looking at lots of arguments about PL, and pointing out the common patterns among the wrong ones.
Also, I am arguing against these on intellectual/inherent grounds, not demographic/statistical ones.
Those arguments might be quite true if we take into account the time/place/situation in which they were made.
I will point out why you should still care later on.
You might also use those arguments correctly, but as far as I know it is rarely done.
They raise a giant red flag every time I see them, and they should do the same for you.
0: It is inherently/fundamentally x/y/impossible.
This argument has been used against high level languages (COBOL), where people thought a machine can't compile stuff because a machine can't think, and thus can't program.
It was used on Feynman diagrams and their 'inherent' complexity.
Chess AI is also a close cousin.
People also use it for 'OO is fundamentally stateful/bad for distributed systems' and 'imperative languages are not good for distributed systems',
while ignoring Simula/Emerald/Verdi/probabilistic Hoare logic.
1: It is unnatural. Mostly used as 'imperative programming is natural', or in conjunction with inherent/fundamental.
Really? The red bars in the chart I attached stand for teen programmers who got them wrong.
And numbers: we take them for granted, but they took centuries to invent. So much for naturalness.
(That point is from Bret Victor.)
For 'natural' numbers, please see journals.sagepub.com/doi/pdf/10.117…
Also, 0 and negative numbers apparently caused a ton of controversy too.
2: It is impractical.
Do you know what else is impractical?
Assembly. Von Neumann once got mad at his PhD student for wasting CPU cycles on an assembler. Personal computing. The Internet.
By using this argument, the speaker assumes he is perfect at credit assignment and can see what will work and what won't in the future.
And predicting the future is hard.
3: It is slow.
People love to compare languages to C for speed. But C's features used to be slow: especially algebraic expressions like (a+b)*(c+d) (a 10x slowdown back then), and function calls (see 'LAMBDA: The Ultimate GOTO').
That is why the father of Fortran, John Backus, got a Turing Award. Before Fortran, anything above assembler meant significantly slower.
It got faster because we got better at compiling stuff.
And the thing is, if people can hand-compile high level code to assembly by reasoning about it, why can't a machine?
What is slow and what is quick is dictated by nothing but compiler technology.
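To make that concrete, here is a minimal sketch (my own toy, assuming a made-up three-address instruction set) of the mechanical bookkeeping a human doing hand-compilation would perform on an algebraic expression:

```haskell
-- A toy expression compiler: the register allocation a human would do
-- by hand, done mechanically. (Illustrative sketch, not a real backend.)
data Expr = Var String | Add Expr Expr | Mul Expr Expr

data Instr
  = Load Int String     -- load a variable into a register
  | AddR Int Int Int    -- dst := src1 + src2
  | MulR Int Int Int    -- dst := src1 * src2
  deriving Show

-- compile e r: emit code using registers r and up; return the result register.
compile :: Expr -> Int -> (Int, [Instr])
compile (Var x)   r = (r, [Load r x])
compile (Add a b) r = binop AddR a b r
compile (Mul a b) r = binop MulR a b r

binop :: (Int -> Int -> Int -> Instr) -> Expr -> Expr -> Int -> (Int, [Instr])
binop op a b r =
  let (ra, ca) = compile a r
      (rb, cb) = compile b (ra + 1)
  in (rb + 1, ca ++ cb ++ [op (rb + 1) ra rb])

-- snd (compile (Mul (Add (Var "a") (Var "b")) (Add (Var "c") (Var "d"))) 0)
-- => [Load 0 "a", Load 1 "b", AddR 2 0 1,
--     Load 3 "c", Load 4 "d", AddR 5 3 4, MulR 6 2 5]
```

Nothing in it needs human insight; it is exactly the kind of reasoning a machine can carry out.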
Also, by choosing to focus on contemporary benchmarks, we ignore all the 10x-or-more gains: staging, GPUs, ASICs, parallelism, distributed computing.
Focusing on how quickly your code runs on a single thread, while ignoring the above and which languages enable them, is somewhat bikeshedding.
4: It is close to hardware, so it is more natural, and good. Mostly used in favor of assignment and against TCO.
This claim is ignorant of computer architecture, where assembly provides one of the greatest abstractions in CS.
Also, this claim suggests we should ditch algebraic expressions in C.
A CPU isn't a thing where stuff gets executed sequentially (VLIW, superscalar, speculative execution, pipelining).
And mutation is expressed as flip-flops (read: recursion!), so if anything this suggests we should use recursion and model assignment with it.
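For example (a trivial sketch of my own, nothing from the hardware literature): the 'mutable cells' of an imperative loop are just arguments threaded through a tail-recursive function, fed back into the next step the way a flip-flop feeds its output back into its input:

```haskell
-- Summing 1..n: the imperative loop's mutable cells `acc` and `i`
-- become arguments that each recursive call "assigns" by passing anew.
sumTo :: Int -> Int
sumTo n = go 0 1
  where
    go acc i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)  -- "acc := acc + i; i := i + 1"
```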
Worth mentioning: it also assumes all hardware architectures are pretty much the same, but in fact this is very far from the truth.
The extreme example: back in the days when people couldn't write optimizing compilers for lots of stuff that is common today, all but a few high level languages (Fortran) were slow.
Maybe even C++ (I have read about Fortran experts laughing at C++, but that is personal anecdote anyway).
So people hard-wired a fucking ALGOL interpreter (and interpreters for other languages) into the CPU. It couldn't take machine code, couldn't take assembly. It only took human-readable ALGOL.
And if you wanted to use asm/machine code on those machines, you needed to compile/interpret the assembly! So much for 'directness' and 'one-to-one correspondence with asm'.
5: It is academic.
By all means, this is praise. Structured programming, ALGOL, Lisp, Simula, Pascal, Logo, and Scheme (and, if those are not practical enough, C++ and BASIC) all came from researchers,
while 'real programmers' used FORTRAN with goto.
So did CAD (Sketchpad) and timesharing. Given that researchers are far fewer in number than other programmers, if anything this is an indicator of success.
6: Another thing that often goes with this is 'too mathy', mostly used against functional programming of all sorts.
Sure, people are doing theorem proving and category theory and program calculation in Haskell, but Reynolds and Dijkstra also did all of those with ALGOL.
The thing is, just because some people can treat them mathematically doesn't mean you must too.
Another thing: every reason behind math-phobia seems to also apply to a large Java codebase (or a large codebase in any other widely used language). But Java people never say their codebase is too mathy.
That is a completely different issue and I am not ready to talk much about it, though. I won't go into it and don't expect you to agree. Anyway, back to the topic.
7: Having double standards, whether intentional or not.
Some words in programming are very ill-defined, having different meanings that contradict each other.
And people use them in a Humpty Dumpty sense: 'When I use a word, it means just what I choose it to mean, neither more nor less.'
It gets much worse when people switch between different meanings within the same sentence, e.g. 'Lisp is from 1959 and is still state of the art'.
For those who haven't read John McCarthy's paper: the thing is, Lisp of that era didn't have closures. It didn't have macros either. Let that sink in.
Functional programming suffers the same problem: people have different definitions of FP,
along the lines of 'completely no effects, not even the State monad, and do stuff by program calculation' versus 'IO!'.
The thing is, if you use program calculation you will have a hard time reasoning inside IO,
and if you have effects via State/ST you need Hoare logic/separation logic one way or the other (Dijkstra monads).
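A hedged illustration of the first half (my own toy example, nothing more): program calculation leans on substituting a name for its definition, which is valid for pure code but duplicates effects once IO is involved:

```haskell
-- Pure code: substitution is an equation we can calculate with.
twicePure :: Int
twicePure = let x = 1 + 2 in x + x   -- equals (1 + 2) + (1 + 2); inline freely

-- IO code: "inlining" an action duplicates its effect, so these are
-- two different programs, and the substitution step is gone.
readOnce, readTwice :: IO String
readOnce  = do { x <- getLine; pure (x ++ x) }                 -- one read
readTwice = do { x <- getLine; y <- getLine; pure (x ++ y) }   -- two reads
```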
It would be much nicer if we ditched the word FP and replaced it with program calculation/effect control/higher order functions.
The word "Lisp" also should not exists. IMHO it is as absurd as calling languages with curly braces with C*.
There are extremely little common property between CLisp, Scheme, Racket, JMC Lisp, Lisp1.5, Lisp with M-Expression, Black, MiniKanren.
And I dont know whether I should watch with awe, with horror or with amusement as people try to reason the inherent property of Lisp.
OOP also has different definition (Simula/Smalltalk/Lambda Calculus with Subtyping/Object Calculus) but I dont study them alot so I will not talk much about it.

I guess the downside of not using these words is that we stop fooling ourselves with pretty words and sound so boring.
Bonus: 'thinking machine' is also a horrible term. "The question of whether a computer can think is no more interesting than the question of whether a submarine can swim." - Dijkstra
8: Polarization: 'us' or 'them'.
Typing Dynamic Typing! Well-Typed Programs Can't Be Blamed! Lazy Functional State Threads! Extensible Effects! A Correspondence between ALGOL 60 and the Lambda Calculus! The Essence of ALGOL!
Haskell's overlooked object system! Finally Tagless for extensibility! Subtyping and module systems! Lambda the Ultimate XXX! The list can go on, but I think that is enough counterexamples.
Throughout history, ideas have also flowed from languages in one 'camp' into the other 'camp'.
The if expression went from Lisp (1959, not yet published) to ALGOL 60 (John McCarthy was one of the big influencers on ALGOL).
ALGOL had lexical scoping, which later got into Scheme.
Scheme was inspired by Actors.
Simula was a faithful ALGOL dialect.
Smalltalk was inspired by FEXPRs. Also by Logo, which is sort of a Lisp dialect.
Denotational semantics is used on imperative languages a lot.
Subtyping mostly uses the lambda calculus as a base model, and classes are modeled as fixpoints of stuff (see the sketch below).
Luca Cardelli (an influential man in OOP) helped with ML modules.
John Backus built Fortran and wrote 'Can Programming Be Liberated from the von Neumann Style?'.
The Essence of ALGOL is worth a special mention, since it has subtyping, effect control, and is ALGOL.
This is mixing 'OOP'/'FP'/'imperative', and I shouldn't even use the word 'mixing', because mixing means some boundary is crossed, but it is more like the boundary was artificial all along!
And sure, features often don't play well with each other (subtyping vs. inference), and maybe it makes sense to split into a 'subtyping' camp and an 'inference' camp.
But not really into an 'OOP' camp, an 'FP' camp, and an 'imperative' camp.
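As for 'classes are modeled as fixpoints of stuff', here is a minimal sketch of that standard toy encoding (the names are my own invention): a class is a function from a not-yet-tied 'self' to a record of methods, and constructing an object takes its fixpoint, which is what gives you late binding:

```haskell
-- An "object" is a record of methods.
data Shape = Shape
  { area     :: Double
  , describe :: String
  }

-- A "class" defines methods against a not-yet-fixed self,
-- so methods can call each other through late binding.
type Class = Shape -> Shape

circleClass :: Double -> Class
circleClass r self = Shape
  { area     = pi * r * r
  , describe = "a shape of area " ++ show (area self)  -- call via self
  }

-- "Inheritance": override one method, delegate the rest to the parent.
unitCircleClass :: Class
unitCircleClass self =
  (circleClass 1 self) { describe = "the unit circle" }

-- Instantiation: tie the knot by taking the fixpoint of the class.
new :: Class -> Shape
new c = let self = c self in self

-- describe (new (circleClass 2))  ==> "a shape of area 12.56..."
```

The record update before tying the knot is exactly what overriding in inheritance does in this encoding.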
As a side note, 0/7/8 often appear together, as if people are trying to get combo points by chaining up spells.
Takeaway:
Beware of big words without further justification. They are the equivalent of 'the proof is trivial', only with a higher chance of being wrong.
Beware of Humpty Dumpty.
Beware of Polarization.
Going back to 0/1/2/3:
'If I restrict myself to the current state of things, why can't I still make these claims?'

You can. All I am saying is that, when we make such a claim, we should be certain whether we are arguing about the current or about the inherent.
Otherwise we will end up fooling others.
Even worse, we will be fooling ourselves, by convincing ourselves that current programming languages/methodologies are great and only need a bit of patching here and there.
That takes away our intellectual modesty and gives false comfort.
The ground is so good, so established, that breaking it is unneeded and impossible.
So (if you allow me to make a wild generalization which I can't back up) you will have less chance to break ground, or to follow the ground breakers and be an early adopter.
Until CS revolves, as CS does every 5/10/20 years depending on how you count, and you are the late adopter, and as you move on you forget the ground was ever broken.

Even if you don't do PL, this will probably affect you.
I think if one is intellectually modest in one way, he/she will probably continue being so, whether in other fields or not.
Watching out for inherent/natural arguments and polarization is my second 'nature'.
Maybe this is why Backus, after inventing Fortran, didn't believe 'real programmers use Fortran' and gave his immortal speech.
Or why some other big names in PL also contributed to other fields:
Alan Kay to UI. John McCarthy to timesharing. Dijkstra to that hell of a long list on his Wikipedia page.
(And yes, I am suggesting Dijkstra was intellectually modest. I mean toward research subjects, not toward other people.)
Last sentence: I recommend 'The Dream Machine', the best history book I have read.