@nim_lang just did a quick (& stupidly simple) benchmark comparing asyncHttpServer with a plain @nodejs http server. The #Nodejs server seems ~3.5x faster than the #nim one... I expected the opposite result! Am I doing something stupid or is this expected?
After enabling the @nim_lang "-d:release" flag and changing the @nodejs implementation to use the cluster module, here are some new results.
@nim_lang is winning (as expected) but only slightly (62k vs 51k req/sec). Again, this is a silly benchmark, but nonetheless a good demonstration of @nodejs' raw performance with async workflows!
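The benchmark code itself isn't in this thread, but the Node.js side described above (a plain http server forked across all cores with the cluster module) would look roughly like this — the port and response body are made up:

```js
// Rough sketch of a clustered Node.js http server (not the exact benchmark code).
import cluster from 'node:cluster';
import http from 'node:http';
import os from 'node:os';

if (cluster.isPrimary) {
  // Fork one worker per CPU core; all workers share the same listening port.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  http.createServer((req, res) => {
    res.end('Hello World\n');
  }).listen(8080);
}
```

Requests/sec figures like the ones above typically come from an HTTP load-testing tool such as wrk or autocannon pointed at both servers under the same conditions.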
Hey @nim_lang, is there any way (and maybe an example) to respond using "Transfer-Encoding: chunked" and avoid sending the "Content-Length" header? I spent a couple of hours on this today but I wasn't able to figure out a decent solution :(
How many times have you had to store settings (creds & other preferences)? Where do you save the conf file? Which format do you use? How do you load and update the file?
Conf takes care of all of this (and more!) with an extremely simple API
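Assuming this refers to the conf package on npm, a minimal sketch of what its API looks like — the project name and keys below are made up:

```js
// Minimal sketch of the `conf` npm package (project name and keys are made up).
import Conf from 'conf';

const config = new Conf({ projectName: 'my-app' });

config.set('apiToken', 'abc123');     // persisted as JSON in the OS-specific config dir
config.set('editor.theme', 'dark');   // dot notation for nested keys

console.log(config.get('apiToken'));  // 'abc123'
console.log(config.path);             // where the config file lives on disk
```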
"Yo, why are #JavaScript and its ecosystem so messy?!" 😡
Well, I am glad you asked... Let me tell you a story! 🤓
🧵👇
For starters... #JavaScript was not designed to be the language that it is today!
JS was created in 1995 by @BrendanEich at Netscape, the browser company that was trying to come up with a language to make the web more interactive
#JS wasn't related to #Java, so why did they call it Java-Script?! Duh! 😳
Java was trendy! It was possible to build interactive sites by embedding Java apps in pages (applets). So it was mostly a #mktg move: "JS: the lightweight Java alternative", or something like that, I guess.
This is easy and it works! But the implementation is very specific to our struct.
The rest of the codebase doesn't really know that this type can be converted to a String, so you cannot build abstractions on top of it... 🤨
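The snippet those two tweets refer to isn't part of this excerpt, so here is the same idea sketched in JavaScript with made-up names: an ad-hoc conversion method only helps code that knows about it, while hooking into a shared contract (here, toString()) lets generic code use the type without knowing anything about it.

```js
// Illustration only (made-up names): ad-hoc conversion vs. a shared contract.
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  // Ad-hoc method: only code that knows about asText() can use it.
  asText() { return `(${this.x}, ${this.y})`; }
}

console.log(`point: ${new Point(1, 2)}`);
// "point: [object Object]" – generic string handling ignores asText()

class BetterPoint extends Point {
  // Shared contract: anything that stringifies values can now use this type.
  toString() { return `(${this.x}, ${this.y})`; }
}

console.log(`point: ${new BetterPoint(3, 4)}`); // "point: (3, 4)"
```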
1️⃣ You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.
2️⃣ Measure. Don't tune for speed until you've measured, and even then don't unless one part of the code overwhelms the rest.
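In that spirit, measuring can be as simple as timing the suspected hot path before touching it — a made-up example using Node's built-in perf_hooks:

```js
// Measure first: time the suspected hot path before optimizing anything.
// The workload below is made up purely for illustration.
import { performance } from 'node:perf_hooks';

function buildReport(items) {
  return items.map((n) => n * 2).join(',');
}

const data = Array.from({ length: 1_000_000 }, (_, i) => i);

const start = performance.now();
buildReport(data);
console.log(`buildReport took ${(performance.now() - start).toFixed(1)} ms`);
```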