I've been creating videos on my YouTube channel that you'll rarely see anywhere else on the internet 🤩
There you'll find subjects like recreating @nodejs from scratch, Web APIs, and recreating web protocols such as WebSocket using JS with no frameworks, etc
/2
And other amazing experiments, such as recreating a code coverage tool from scratch and processing terabytes of data using JavaScript
If you searched for those subjects you'd end up on my videos, so why not have them as blog posts as well?
/3
I was producing a big lesson for my Node.js Streams course (in English), teaching how to parallelize file processing using Node.js
The idea is to spin up a process for each file, and each process filters the users whose email is on the gmail domain
/2
But I automated the validation to verify that every item was processed and sent to an output file
So first I used `grep` to filter the file's text and `wc -l` to get the line count.
/3
The secret to processing anything using JavaScript is to handle data on demand.
Imagine you wanna migrate data from a SQL database to a NoSQL DB. You'd need to apply some business rules, clean up fields, filter records, and then write them to the final output.
/2
You might know that you can block Node.js (and the data source you're consuming) if you hold too much data in memory at once
The best practice, then, is to limit results, send items one by one through a stream pipeline, and then ask for more data until you've consumed it all.
/3