One thing I've wondered about for a long time is why I fail interviews at such a high rate, e.g., see danluu.com/algorithms-int….
People who've mock interviewed me have a variety of theories, but I don't think any of them are really credible, so I'm going to wildly speculate.
The most commonly suggested reason is that I get nervous and that's the problem, which people believe because I do fine in their mock interviews.
That's a contributing factor, but I only get nervous because I've failed so many interviews and I didn't use to get nervous, so
there must be at least one other cause.
Another explanation that's consistent with the evidence is that when I say something "stupid sounding", people who mock interview me (who know me) assume it isn't stupid, whereas interviewers assume it is, e.g.,
My feeling is that this comes from my being an experienced engineer with a non-traditional background, which means there's a lot of stuff that people expect "all programmers" know that I don't know, which makes me sound like a fake programmer, as in danluu.com/culture/#appen…, e.g.,
I've never written production code that talks to a DB nor have I ever written production code that issues an RPC or calls an external API.
This kind of thing elicits the strongest reactions in design interviews. Once, I asked someone what MVC was, which didn't go over well (I'd
heard of MVC before, but I've never worked on anything where that would be necessary knowledge).
Something like that comes up in every "design a webapp" interview, which has been the vast majority of my design interviews.
I think I've only had three design interviews that didn't leave the interviewer with a negative impression. The three interviews were "evaluate a fundamental new disk technology and how that impacts the datacenter", "design an ad exchange", and "design the chat UI for X"
At the time I was asked the first question, I'd never done work in a datacenter, but evaluating performance has been something that's been in my wheelhouse since I was in high school, so that one was an easy one.
The other two questions were fine because, in both cases, the interviewer was prepared to interview someone with no relevant knowledge (non-ads and non-frontend/UX folks) and didn't assume the candidate having a gap in their knowledge w.r.t. the topic meant they were an idiot.
I think this is very unusual; when people say "design interviews let me see how someone thinks" what they unknowingly mean is "design interviews let me see how much relevant knowledge someone has, and if someone has the relevant knowledge, I'll evaluate how they think".
Although people often say they're looking for reasons to pass a candidate rather than reasons to fail one, many people (without realizing it) treat missing some "standard" knowledge as a reason to fail someone and, in fact, are looking for many reasons to fail someone
BTW, I think design is one of the areas where I add outsized value at work.
When asked to review a design, I frequently find issues that would cripple or kill a project if unaddressed (I also frequently don't find anything, another side effect of my background, I think).
A few times, I've been unable to convince the implementing team that an issue I thought would be fatal was actually serious and they went forward with their design and the issue has been fatal every time, which gives me some confidence I might be right when I make that call.
Another component that goes badly is whiteboard coding, but probably not in the way most people would expect.
I'm extremely out of practice at this now but, in 2013, I did some competitive programming problems for fun.
Competitive programming problems are so much harder than what you get in interviews that, if you can do semi-easy competitive programming problems, a standard whiteboard algorithms question should be solvable at the speed at which you can write down the solution.
That didn't appear to really help me, though. At Palantir, the interviewer literally ran out of questions (IIRC, he had 5 or 6) and seemed embarrassed.
Palantir walks you out of the interview early in the day if they fail you early and they walked me out after that interview.
Another failure mode was that people would accuse me of having seen and memorized the question because I answered it too quickly. That wasn't the case (I was very up front in cases where I'd heard the question before), but people literally didn't believe me.
Many folks advised me to pretend to struggle with the question and then have an a-ha moment since, when people say that "interviews show you how someone thinks", that's what they're actually looking for, not how you really think
But I don't like the idea of lying, so I didn't
I actually seemed to do a lot better in algorithms interviews a couple years later, when I was moderately out of practice and would struggle a bit instead of just being able to write out answers at the speed I can write on a whiteboard.
The last time I had coding interviews (4 years ago), I was out of practice and failed a number of interviews because I struggled too much.
It's much harder to actually struggle the right amount than to fake struggling the right amount, which is why so many of my friends fake it.
Another thing that I believe counts against me is that I'm candid and honest in behavioral interviews. I know what you're supposed to say but I refuse to say that instead of the truth on general principles, which I suspect has cost me a number of jobs.
I've been on the other side of the table and seen how hiring managers and other folks react to real candor to their questions instead of saying the thing "everyone" knows you're supposed to say.
I've tried to stand up for this but have generally been outvoted.
A concrete example is that a hiring manager panned a candidate because, when asked, the candidate said some negative things about their employer, which allegedly indicated that the person was negative, could be difficult to work with, didn't have a growth mindset, etc.
I pointed out that I was on the loop because I'm considered easy to talk to & work with and I answered the same question v. negatively. The HM said I must've shown a positive, growth oriented, mindset and my manager (also on the loop) slacked me with, basically "you did not".
I think that biasing towards hiring people who game their answers and say what "you're supposed to say" and away from people who are honest is a poor hiring strategy but, by revealed preference, we can see that most hiring managers want to hire people who are dishonest.
Two people who know me independently pointed out that the remaining 20% of this might be why I fail interviews at such a high rate, calling out the same specific mannerisms. I wish I'd had these friends 20 years ago, but
I'm not sure it's worth the effort to do anything about it at this point.
The remaining things would be fairly high-effort to change and, at this stage of my career, the majority of companies I talk to will give me an offer without putting me through a normal interview.
I think this is multi-causal, but this bit would explain why, e.g., when I smoke an algorithms question, people are often utterly convinced that I memorized the question, even though competitive programmers also smoke algos questions and are generally believed to be genuine.
I feel like this is true for lots of kinds of conversations and not just tech interviews.
People are correctly pointing out that, if you dig into the logic of basically anything, it falls apart, but that's also generally true of actual humans, even experts.
That may sound ridiculous, but have you tried asking an expert coach on almost any topic why you should do X?
E.g., try listening to one of the top paddling coaches in the world explain *why* (the what is good, the why is nonsense)
Why do you let the boat "run" between strokes? The explanation is because the boat is moving at top speed when you remove the paddle from the water, so putting the paddle back in slows you down.
But of course this is backwards. The boat decelerates when you're not pulling, so by that logic you'd want to get the paddle back in the water as quickly as possible.
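To make the counterargument concrete, here's a toy model (all numbers made up, not from any real boat) of a hull gliding under quadratic drag. The longer you let the boat "run", the more speed you lose, so "the boat is at top speed when the paddle exits" can't by itself explain why gliding helps:

```python
# Toy model with made-up constants: a hull coasting under quadratic drag.
# dv/dt = -k * v**2 has the closed-form solution v(t) = v0 / (1 + k * v0 * t).
def glide_speed(v0: float, k: float, t: float) -> float:
    """Speed (m/s) after gliding for t seconds from initial speed v0."""
    return v0 / (1 + k * v0 * t)

v0 = 5.0   # assumed boat speed when the paddle exits the water, m/s
k = 0.05   # assumed drag constant, 1/m

for t in (0.0, 0.2, 0.5, 1.0):
    print(f"after {t:.1f}s of glide: {glide_speed(v0, k, t):.2f} m/s")
```

Speed falls monotonically during the glide, which is exactly why the coach's stated "why" is backwards: whatever the real benefit of letting the boat run is, it isn't that the boat is fast at that moment.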
What are examples of items/categories where you're really getting your money's worth at the high end, not necessarily in terms of utility, but in terms of the difficulty of producing the item more cheaply?
I find the contrast between these vs. "brand" items fascinating.
An example of a category that doesn't qualify but where some items qualify would be high-end fashion, where you're quite often mostly paying for the brand (e.g., an expensive Theory shirt) but there are plenty of items where you're paying for the item (e.g., a $5k Kiton suit).
An example of a category that qualifies would be high-performance cars (with the notable exception of a few very niche brands like Ferrari, which are famous for having very high margins).
Even if you look at brands that laypeople consider to be "brand" purchases, like BMW, the margins are thin enough that you're mostly paying for the car, not the name.
Lots of people in my mentions saying things like "Elon is cleaning house! Lazy bums are getting what they deserve!", as if Twitter employees are getting a much deserved comeuppance.
Since people don't seem to understand what the bums at Twitter are actually getting, here's a short primer:
If you look at the people most responsible for Twitter's state, leadership, they had golden parachutes worth tens of millions of dollars
We can debate whether or not they deserve the money, but if you think someone is a lazy bum, cursing them to receive a $10M+ payout seems odd
If we're talking about engineers, Twitter has historically underpaid long-tenured employees relative to BigCo market rate.
The median raise the staff+ people I'm talking to are getting in their new offers is six figures.
One of the things that I think is sad about the decimation of Twitter eng is that Twitter was doing a lot of interesting (and high ROI) engineering work that, at younger companies, is mostly outsourced to "the cloud" or open source projects
The now gutted HWENG group was so good at designing low power servers that, in a meeting with Intel folks, discussing reference designs vs. what Twitter was doing, the Intel folks couldn't believe the power envelope Twitter achieved.
Twitter was operating long before gRPC existed, so they built Finagle. kostyukov.net/posts/finagle-… has some nice explanations and there's been a lot of innovation in Finagle since then.
Twitter still gets a lot of mileage out of owning its RPC layer
Nice thread about the misconception that major tech companies run systems that can run without intervention because they're automated
The example comes from Google, which is more automated than most major companies (MS, etc.), but still quite manual in an absolute sense t.co/diqwJ3RHZH
One thing that's been interesting about recent events is seeing how people imagine big companies operate, e.g., people saying that Twitter is uniquely bad for not having a good cold boot procedure.
Multiple $1T companies didn't or don't have a real cold boot procedure.
One of them is one of the most respected eng orgs on the planet and SREs there wonder if it would take weeks to come back up or months.
As someone who thinks a lot about risks, this isn't how I want the world to be, but it is how the world actually is.
An interesting thing about this claim is that not only is the implication wrong, Twitter probably has better evidence of its wrongness than any other company in its size class could have.
There are very few companies that have a better distributed tracing setup w.r.t. getting actionable insights on the backend and the ones that have a better setup are much larger (Google, FB, etc.)
Twitter client tracing also punches above its weight.
Of course, the key people who did that work left or got laid off, but it's clear from the data that, if you're looking at why Twitter is so slow in, e.g., India, Uganda, etc., esp. on slow devices, tail latency comes from the network due to unreasonably large payloads + client.
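The payload point is just arithmetic. Here's a back-of-envelope sketch (the payload sizes and link speeds are illustrative assumptions, not Twitter's actual numbers) showing how response size dominates latency on slow links:

```python
# Illustrative numbers only: how long a response payload takes to cross
# links typical of slow mobile networks, ignoring latency and retransmits.
def transfer_seconds(payload_bytes: int, link_bits_per_sec: int) -> float:
    """Idealized transfer time for a payload over a link of the given speed."""
    return payload_bytes * 8 / link_bits_per_sec

payloads = {"lean response (50 KB)": 50_000, "bloated response (2 MB)": 2_000_000}
links = {"2G-ish (~50 kbps)": 50_000, "3G-ish (~1 Mbps)": 1_000_000}

for pname, pbytes in payloads.items():
    for lname, bps in links.items():
        print(f"{pname} over {lname}: {transfer_seconds(pbytes, bps):.1f}s")
```

On a fast connection the payload size barely registers, but on a slow link an oversized payload alone can push response time from seconds into minutes, which is how tail latency ends up dominated by the network plus the client rather than the backend.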