We are slow to catch up with the dropping Punk floor, hence we show the 57-60 ETH listings as good value. The newest iteration of the model will adjust... but I do think the top three Punks we show are good value
* Zombie listed 1.1KΞ
* Beanie 285Ξ
* Bandana 60Ξ
The Beanie has green (ask) and orange (bid) dots approaching each other. When this happens, the pieces always sell.
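A minimal sketch of that bid/ask convergence signal, assuming a simple threshold on the spread; the Listing fields, the 5% cutoff, and the bid numbers are illustrative, not the actual model:

```python
# Sketch (not the real model): flag listings where the best bid (orange dot)
# is closing in on the ask (green dot). All numbers below are illustrative.
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    ask_eth: float   # green dot: current listing price
    bid_eth: float   # orange dot: best standing offer

def spread_is_closing(l: Listing, threshold: float = 0.05) -> bool:
    """True when the bid is within `threshold` (as a fraction of ask) of the ask."""
    return (l.ask_eth - l.bid_eth) / l.ask_eth <= threshold

listings = [
    Listing("Zombie", ask_eth=1100, bid_eth=1080),
    Listing("Beanie", ask_eth=285, bid_eth=275),
    Listing("Bandana", ask_eth=60, bid_eth=45),
]

for l in listings:
    if spread_is_closing(l):
        print(f"{l.name}: bid and ask converging -- likely to trade")
```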
Knowing a lot about applied machine learning means every day you're
* explaining basic things to other so-called experts (try to be nice)
* having VP types ask you to work for "their team"
* having them ask you to send candidates "their way" since you're not available
For all the hype and anti-hype on web3, ML, deep learning etc -- machine logic isn't going anywhere, and there's strong demand for this skill set. Especially when paired with common sense, product focus, attention to detail and a little patience
If you're young and think you have a mind for these things, I would encourage you to pursue applied machine learning. We will *never* run out of things to do. And people will be grateful to have you. Including those tech VPs -- a lot of whom are pretty cool tbh
I am pro vaccine (see previous posts). You should most likely get it.
This isn't about the freedom-of-choice issue, or inflatioooon, etc. It's just obstinate to think some combination of shots means no more covid.
At a high level there are two viable options: 1) an intense system of measures, like China, NZ, Israel for a while -- expensive, intrusive, AFAIK temporary 2) provide a ton of mitigation measures (including but not only vaccines), protect the vulnerable, learn to live with it
This account has become mostly CT, but I still care (deeply) about deep learning and large language models.
While models have gotten bigger and better, it seems this is having surprisingly little effect on downstream applications...
🧵 (1/n)
The growth in parameter counts has been extraordinary. I had a tiny part to play; my friends and teammates have been at the forefront, in the lab, taking state-of-the-art language models from 300M params to 8B-11B (when I was there), to 1/2T params
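A rough back-of-envelope on what those counts mean just for raw weight storage, assuming fp16 (2 bytes per param) and ignoring optimizer state and activations:

```python
# Back-of-envelope: fp16 weight storage for the parameter counts above.
# Real training memory (optimizer state, activations) is several times larger.
def weight_bytes(n_params: float, bytes_per_param: int = 2) -> float:
    return n_params * bytes_per_param

for label, n in [("300M", 300e6), ("8B", 8e9), ("11B", 11e9), ("0.5T", 0.5e12)]:
    gb = weight_bytes(n) / 1e9
    print(f"{label:>5} params ~= {gb:,.1f} GB of fp16 weights")
```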
(2/n) developer.nvidia.com/blog/using-dee…
Work from Nvidia, MSFT, OpenAI, Google and FB research has transformed NLP into a large-scale deep learning field. It's amazing you can encode that many params, go through that many documents, in 100+ languages. Even handle everything as bytes...
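For the "everything as bytes" point, a tiny sketch of the idea: skip a learned word/subword vocabulary entirely and feed the model UTF-8 byte values (0-255), which works the same in any language. This is just the generic concept, not any particular model's tokenizer:

```python
# Byte-level text handling: map text in any language to UTF-8 byte ids.
def to_byte_ids(text: str) -> list[int]:
    return list(text.encode("utf-8"))

def from_byte_ids(ids: list[int]) -> str:
    return bytes(ids).decode("utf-8")

ids = to_byte_ids("héllo, 世界")
print(ids)                 # every id is in [0, 255], regardless of language
print(from_byte_ids(ids))  # round-trips exactly
```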