Here's what I remember about that time.
A new JVM language would come out every few years and then slowly fade away.
If you think web developers are treated as 2nd class today, back then they were 4th class.
All the dominant languages at the time had grown up in the heyday of linear Moore's law. Computers had one processor and it just kept getting faster. As a result, high-level languages had done less than zero performance work.
Not to be outdone, Ruby was even slower.
* Performance doesn't matter cause we "optimize for developer time which is more expensive" 👈 not exaggerating, this was the dominant view of the Rails community.
* Any concurrency effort that doesn't deliver processing parallelism is futile.
IO concurrency and processing concurrency were lumped together as effectively the same thing.
Without IO concurrency, these applications were very expensive and slow. For many of us, the need was clear.
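To make that distinction concrete, here's a minimal Rust sketch (assuming the tokio crate with its timer and macro features enabled; the function names are mine, purely illustrative). The async half shows IO concurrency: many in-flight waits overlapping on a single thread. The threaded half shows processing parallelism: extra cores for CPU-bound work. They are different problems.

```rust
use std::time::{Duration, Instant};

// Stand-in for a request that spends its time waiting on the network or a
// database, not on the CPU.
async fn fake_io_request(id: u32) {
    tokio::time::sleep(Duration::from_millis(100)).await; // waiting, not computing
    println!("request {id} done");
}

#[tokio::main(flavor = "current_thread")] // a single OS thread is enough
async fn main() {
    let start = Instant::now();

    // IO concurrency: 100 in-flight "requests" overlap their waits on ONE thread.
    let tasks: Vec<_> = (0..100u32)
        .map(|id| tokio::spawn(fake_io_request(id)))
        .collect();
    for task in tasks {
        task.await.unwrap();
    }
    // Finishes in roughly 100ms total, not 100 * 100ms, with no extra cores.
    println!("IO-bound work finished in {:?}", start.elapsed());

    // Processing parallelism is the separate problem: using more cores for
    // CPU-bound work. That takes threads (or similar), not an event loop.
    let handles: Vec<_> = (0..4)
        .map(|_| std::thread::spawn(|| (0..10_000_000u64).sum::<u64>()))
        .collect();
    let total: u64 = handles.into_iter().map(|h| h.join().unwrap()).sum();
    println!("CPU-bound total across 4 threads: {total}");
}
```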
What I don't see yet is what all of the dominant platforms are getting wrong that these new platforms could fix.
But what are *all* of them getting wrong?
What is so painful to do in *any* platform that it creates space for a new one?
I think the question that will matter is: how well does this language compile to WebAssembly?
JS is the market leader by every metric (active users, developers, packages, etc.), yet it doesn't compile cleanly to WebAssembly, and that certainly creates space for a new language/platform that does conform well to WebAssembly's constraints.
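As a rough sketch of what "conforms well to WebAssembly's constraints" means in practice: plain typed functions over numbers and linear memory, with no GC or language runtime needed at the boundary. This assumes a Rust library crate with `crate-type = ["cdylib"]` in Cargo.toml and the `wasm32-unknown-unknown` target installed; the `add` function is just an illustration.

```rust
// lib.rs — a minimal function exported to WebAssembly.
// Build (after `rustup target add wasm32-unknown-unknown`):
//   cargo build --release --target wasm32-unknown-unknown

// No GC, no runtime, no dynamic types at the boundary: numbers in, numbers
// out, which is exactly the shape core WebAssembly wants. The unmangled name
// lets a Wasm host (a browser, a serverless runtime, etc.) call it directly.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}
```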
Being good at cryptography and content-addressed data structures will matter.
Also, Rust is already quite good at all of this. It compiles well to WebAssembly, and its internal data structures share many of the constraints that content-addressed data imposes.
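To show what I mean by content-addressed data structures, here's a toy sketch (assuming the sha2 crate; the `ContentStore` type and its methods are hypothetical, not any particular library's API). The address of a blob is the hash of its bytes, so identical content dedupes itself and every lookup can be re-verified.

```rust
use sha2::{Digest, Sha256};
use std::collections::HashMap;

/// A toy content-addressed store: the address of a blob *is* the SHA-256
/// hash of its bytes.
struct ContentStore {
    blobs: HashMap<Vec<u8>, Vec<u8>>,
}

impl ContentStore {
    fn new() -> Self {
        Self { blobs: HashMap::new() }
    }

    /// Store a blob and return its address (the hash of its contents).
    fn put(&mut self, data: &[u8]) -> Vec<u8> {
        let addr = Sha256::digest(data).to_vec();
        self.blobs.insert(addr.clone(), data.to_vec());
        addr
    }

    /// Fetch a blob by address, verifying it still matches its hash.
    fn get(&self, addr: &[u8]) -> Option<&[u8]> {
        self.blobs
            .get(addr)
            .filter(|data| Sha256::digest(data)[..] == *addr)
            .map(|data| data.as_slice())
    }
}

fn main() {
    let mut store = ContentStore::new();
    let addr = store.put(b"hello, content addressing");
    assert_eq!(store.get(&addr), Some(&b"hello, content addressing"[..]));
    // Storing identical bytes yields the identical address: dedup for free.
    assert_eq!(store.put(b"hello, content addressing"), addr);
    println!("address: {:x?}", addr);
}
```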
Rust, not JS, may be setting the bar.