One thing I really liked about it was that it suggests/summarizes actionable ways to check your own thinking, which I found useful even when it was discussing a way of thinking I've had since I was a kid, because I still mess up all the time and having concrete checks helps.
Another is that it does a really good job of laying out the case for various ways of thinking. There are 6 blog posts that were on my to-do list that I think I don't need to write anymore since the book describes what I wanted to describe, but better than I would've done it.
For example, a counter-argument to the idea that one needs to be hyper-optimistic and not acknowledge realistic probabilities of failure to be successful.
Julia's chapter on this uses a broader array of examples than I would've and is written to appeal to a wider audience, but, in appealing to that broader audience, the chapter doesn't sacrifice intellectual rigor.
When that's possible, I view it as basically strictly better than how I write, and I'd write that way if I knew how to do it in a time-efficient way.
One thing I find amusing is that a lot of the negative reviews on Amazon seem to conflate having a difficult-to-read style with rigor, so they knock the book for being unscientific or shallow because it's easy to read.
Despite increased centralization over the past 20 years, the internet feels a lot more like the wild west to me in many ways, e.g., the Google index hasn't kept up with the size of the internet, so an increasingly large fraction of the web is undiscoverable via search.
Even 10 years ago, I could basically always find old blog posts I'd read with Google.
Now, an exact string match search with site:[URL] frequently doesn't turn up the result and I have to wget the page and grep for what I'm looking for.
If the site's too large to wget and it doesn't have a custom index, I frequently can't find the page. Large commercial sites, like Twitter, sometimes build complete indexes, but it's a non-trivial effort to index something that's even 1/100th the size of Twitter, so most don't.
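For what it's worth, the by-hand fallback is simple enough that a minimal sketch fits in a few lines; here's roughly what it looks like in Python with only the standard library (the URL and search phrase are placeholders, not a real example):

```python
from urllib.request import urlopen

def page_contains(url: str, needle: str) -> bool:
    # Fetch the page and do an exact substring match, roughly the
    # by-hand equivalent of wget followed by grep -F.
    with urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return needle in html

if __name__ == "__main__":
    # Both values are placeholders.
    print(page_contains("https://example.com/some-old-post",
                        "exact phrase I remember reading"))
```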
I feel like "regretted attrition" is a curiously bad stat to track considering how widely used it is.
For one thing, it undercounts "attrition we shouldn't have had" by ignoring second-order effects that cause people's departures to be classified as "non-regretted".
When I've worked in orgs or companies that have low total attrition (~5%), non-regretted attrition has been something like 1% or sometimes as high as 2%.
When regretted attrition is ~15%, non-regretted will be 5% to 10%. Most of that 5% to 10% wouldn't have been non-regretted in a good org.
The same things that cause regretted attrition also cause people to burn out and do work that allows the company to call the attrition "non-regretted", but it's only non-regretted if you want to operate a company that sets people up to burn out and lose motivation.
One thing I've wondered about for a long time is why I fail interviews at such a high rate, e.g., see danluu.com/algorithms-int….
People who've mock interviewed me have a variety of theories, but I don't think any of them are really credible, so I'm going to wildly speculate.
The most commonly suggested reason is that I get nervous and that's the problem, which people think because I do fine in their mock interviews.
That's a contributing factor, but I only get nervous because I've failed so many interviews, and I didn't use to get nervous, so there must be at least one other cause.
Another explanation that's consistent with the evidence is that when I say something "stupid-sounding", people who mock interview me (and know me) assume it isn't stupid, whereas interviewers assume it is stupid.
Is there anyone who's writing about different problem-solving approaches / styles? An example of the kind of thing I mean (incomplete, because it would be nice to see more than two approaches to a problem and I'm only going to discuss two for this example):
Once, at a meetup Matt Singer was hosting, Brendan Gregg asked me what I was working on, and I mentioned that I'd recently written a little (5kLOC) parser to parse every line of every dmesg we had in our datacenters to audit machine health issues.
Of course, Brendan had done a vaguely analogous thing for Netflix and he showed me what he'd done, which was so much in his style that I think that if you saw the result without knowing who did it, you'd say "wow, this looks like something Brendan Gregg would make".
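To give a sense of what I mean by auditing machine health from dmesg, here's a toy sketch of the general idea (nothing like the actual 5kLOC parser, and not Brendan's tool; the patterns are illustrative, not what either of us matched on):

```python
# Toy sketch: scan dmesg output for lines that hint at machine health
# problems and count them by category. The pattern list is illustrative.
import re
import sys
from collections import Counter

HEALTH_PATTERNS = {
    "mce":       re.compile(r"machine check", re.IGNORECASE),
    "ecc/edac":  re.compile(r"\bEDAC\b"),
    "io_error":  re.compile(r"I/O error", re.IGNORECASE),
    "oom":       re.compile(r"Out of memory", re.IGNORECASE),
    "segfault":  re.compile(r"segfault"),
}

def audit(lines):
    # Count lines matching each health-related pattern.
    counts = Counter()
    for line in lines:
        for name, pattern in HEALTH_PATTERNS.items():
            if pattern.search(line):
                counts[name] += 1
    return counts

if __name__ == "__main__":
    # Usage: dmesg | python audit_dmesg.py
    for name, count in audit(sys.stdin).most_common():
        print(f"{name}\t{count}")
```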
Is there anyone doing in-depth interviews on various aspects of why the world is the way it is?
Some examples of interviews I'd like to hear are below. I'm looking for interviews because I don't think any one person could have the breadth & depth to regularly answer these kinds of questions.
How is it that, for decades, Michelin has generally had either the best-in-class tire or close to it in every class of tire they make?
Perhaps this isn't inherently more mysterious than the effectiveness of Apple's CPU design group, but I don't know who I could ask about Michelin.
Why has non-OC canoe tech stagnated relative to kayak tech?
There's the obvious answer that there's more money in kayaks, but I want to know why specific innovations that seem like they should be portable stay super niche, e.g., the stuff Nick Adnitt is doing, or GRB's curved-blade paddle.
If I want to fully support myself from my blog, is Substack basically the only reasonable game in town? I'd like that to not be the case, but it seems like it might be?
From numbers people have posted, Substack has a much higher conversion rate for writing than Patreon, GitHub Sponsors, etc.
It seems like 10% isn't an uncommon conversion rate, which seems incredibly high if you compute what the equivalent number would be for a blog that's supported via Patreon or GH sponsors.
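To make that concrete, here's a rough back-of-the-envelope comparison (made-up audience size, price, and Patreon-style conversion rate; only the 10% figure comes from the numbers people have posted):

```python
# Back-of-the-envelope revenue comparison with made-up numbers; only the
# 10% conversion rate comes from the text above.
def monthly_revenue(readers: int, conversion_rate: float, price: float) -> float:
    # Revenue per month = paying readers * price per month.
    return readers * conversion_rate * price

readers = 10_000   # hypothetical audience size
price = 10.0       # hypothetical $/month, same on both platforms

substack = monthly_revenue(readers, 0.10, price)  # 10% conversion, per the numbers above
patreon = monthly_revenue(readers, 0.01, price)   # 1% conversion, purely hypothetical

print(f"Substack-style: ${substack:,.0f}/mo")
print(f"Patreon-style:  ${patreon:,.0f}/mo")
print(f"Gap to make up: {substack / patreon:.0f}x")
```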
You can try to make up the difference by adding higher tiers, like Andy Matuschak has, but Substack also supports tiers and, to make up the difference in conversion, you'd need very high tiers, like Evan has for vue.js support.
Evan does get sponsors for the high tiers, but they're corporate supporters, which isn't something you can expect for a programming blog.