I am pro vaccine (see previous posts). You should most likely get it.
This isn't about the freedom-of-choice issue, or inflatioooon, etc. It's just obstinate to think some combination of shots means no more covid.
At a high level there are two viable options: 1) an intense system of measures, like China, NZ, Israel for a while -- expensive, intrusive, AFAIK temporary; 2) provide a ton of mitigation measures (including but not only vaccines), protect the vulnerable, and learn to live with it
Maybe there will be a monthly shot (or pill) everyone can get that will make #1 not necessary for eradication, at some point in the future. But we don't live in that world now.
Booster universalism is wishful thinking. And looking for a population to blame is exactly what it sounds like.
Knowing a lot about applied machine learning means every day you're
* explaining basic things to other so-called experts (try to be nice)
* having VP types ask you to work for "their team"
* having them ask you to send candidates "their way" since you're not available
for all the hype and anti-hype on web3, ML, deep learning etc -- machine logic isn't going anywhere, and there's strong demand for this skill set. Especially when paired with common sense, product focus, attention to detail and a little patience
if you're young and think you have a mind for these things, I would encourage you to pursue applied machine learning. We will *never* run out of things to do. And people will be grateful to have you. Including those tech VPs -- a lot of whom are pretty cool tbh
This account has become mostly CT, but I still care (deeply) about deep learning and large language models.
While models have gotten bigger and better, it seems this is having surprisingly little effect on downstream applications...
🧵 (1/n)
The growth in parameter counts has been extraordinary. I had a tiny part to play; my friends and team-mates have been at the forefront, in the lab, taking state-of-the-art language models from 300M params to 8B-11B (when I was there), to 1/2 T params
(2/n) developer.nvidia.com/blog/using-dee…
Work from Nvidia, MSFT, OpenAI, Google, and FB research has transformed NLP into a large-scale deep learning field. It's amazing you can encode that many params and go through that many documents, in 100+ languages. Even handle everything as bytes...
I didn’t know: apparently all epidemiological models used by governments assume a form of homogeneous mixing -- ie people infect each other randomly, either by region or by age group.
Of course it’s not remotely true. IRL social graphs are very sparse!
There’s plenty of theory and software on modeling sparse interactions. But the health officials aren’t that good at math…
So they use a model with an underlying assumption that the population is N sub-populations -- children, pensioners, Floridians -- who all infect each other randomly within a group, and between groups at a lower rate.
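The stratified-mixing assumption above can be sketched as a tiny SIR model with a mixing matrix: a high transmission rate on the diagonal (within each group) and a lower one off-diagonal (between groups). All rates, group labels, and numbers here are illustrative assumptions, not from any actual government model.

```python
import numpy as np

def step_sir(S, I, R, beta, gamma=0.1, dt=1.0):
    """One Euler step of a stratified SIR model.

    S, I, R: per-group susceptible/infected/recovered fractions.
    beta: mixing matrix; beta[i, j] = transmission rate from group j to group i.
    gamma: recovery rate (assumed value, for illustration only).
    """
    force = beta @ I                # force of infection on each group
    new_inf = S * force * dt        # S -> I
    new_rec = gamma * I * dt        # I -> R
    return S - new_inf, I + new_inf - new_rec, R + new_rec

# Three groups, e.g. children / adults / pensioners (labels illustrative).
# Random mixing within a group is much stronger than between groups.
within, between = 0.3, 0.05
beta = np.full((3, 3), between)
np.fill_diagonal(beta, within)

S = np.array([0.99, 0.99, 0.99])
I = np.array([0.01, 0.01, 0.01])
R = np.zeros(3)
for _ in range(100):
    S, I, R = step_sir(S, I, R, beta)
```

A sparse real-world contact graph would replace `beta` with something far from this dense two-value matrix, which is exactly the thread's complaint.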
Updated the Punks model.
* fixed how we decay bids over time
* Hoodies & Beanies with better prices (median around 300Ξ)
* About 300 Punks in that 250-400Ξ range
* Bottom value 100.7Ξ, median 113.6Ξ -- has been steady
* Market cap 1.69mΞ ~ 6.5B USD
In the model, we obviously want to fit to bids (as a minimum) and offers (as a maximum), as well as predicting tomorrow's sales.
Realistically, this means decaying the weight on bids over time. Say a bid from the past 3 days counts at 100%... do you still care about a bid (or sale) from a month ago? Yes.
The question: how quickly to decay the data.
We now keep bids around for up to 120 days (!) -- seems high, but no Beanie has sold for two months.
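A minimal sketch of the decay idea: full weight for very recent bids, a decaying weight after that, and a hard cutoff at 120 days. The thread only gives the ~3-day full-weight window and the 120-day retention; the exponential shape and the 30-day half-life are assumptions for illustration.

```python
from datetime import date, timedelta

FULL_WEIGHT_DAYS = 3    # recent bids count at 100% (from the thread)
MAX_AGE_DAYS = 120      # keep data up to 120 days (from the thread)
HALF_LIFE_DAYS = 30     # assumed exponential half-life (illustrative)

def bid_weight(bid_date: date, today: date) -> float:
    """Weight of a bid when fitting the price model, by age in days."""
    age = (today - bid_date).days
    if age <= FULL_WEIGHT_DAYS:
        return 1.0
    if age > MAX_AGE_DAYS:
        return 0.0
    # exponential decay past the full-weight window
    return 0.5 ** ((age - FULL_WEIGHT_DAYS) / HALF_LIFE_DAYS)

today = date(2022, 1, 1)
print(bid_weight(today - timedelta(days=2), today))    # 1.0
print(bid_weight(today - timedelta(days=33), today))   # 0.5
print(bid_weight(today - timedelta(days=200), today))  # 0.0
```

With a 120-day window, a thin market like Beanies (no sale in two months) still contributes data points, just at a heavily discounted weight.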
No evidence that prices have dropped -- many have relative to the median/floor, but not in absolute terms.