To better understand technical hiring pipelines, I analyzed 15,897 interview reviews of 27 major tech companies on Glassdoor. I focused on interviews for software engineering roles, at both junior and senior levels. These are some of the main findings. (1/n)
Each review consists of:
- result (no offer/accept offer/decline offer)
- difficulty (easy/medium/hard)
- experience (positive/neutral/negative)
- review (application/process/questions)
The largest SWE employers are Google, Amazon, Facebook, and Microsoft.
There's a strong correlation between onsite-to-offer rate and offer yield rate (% of candidates who accept their offers). The more selective a company is, the less likely a candidate is to accept its offer. Candidates who pass interviews at FAANG are likely to have other attractive offers.
To read the graph: 18.83% of onsite candidates at Google get offers, and of all those with offers, 70% accept. Due to the biases of online reviews, the actual numbers are much lower. The most selective companies are Yelp, Google, Snap, and Airbnb.
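A quick sketch of how the two rates quoted above combine, using the Google numbers as an example (the multiplication is the only assumption here):

```python
# Onsite-to-offer rate and offer yield combine multiplicatively
# into an overall onsite-to-hire rate.
onsite_to_offer = 0.1883  # 18.83% of onsite candidates get offers
offer_yield = 0.70        # 70% of those offers are accepted

onsite_to_hire = onsite_to_offer * offer_yield
print(f"{onsite_to_hire:.2%} of onsite candidates end up hired")  # → 13.18%
```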
Referrals matter, a lot. For junior roles, about 10-20% of candidates who get to onsites are referred, with Uber leading the chart at almost 30%. For senior roles, those numbers are higher: Salesforce, Uber, and Cisco all have ~30% of their senior onsite candidates referred.
For junior roles, the biggest source of onsite candidates is campus recruiting. Microsoft & Oracle have >50% of their interviewees recruited through campus events. Google, Facebook, and Airbnb rely less on campus recruiting, but it still accounts for ~20-30% of their onsites.
This means big tech companies concentrate their recruiting efforts on a handful of popular engineering schools. Students recruited from those schools then refer their classmates, who in turn refer even more classmates, turning those major tech cos into a Tech Ivy alumni mixer.
Everyone complains that the interview process is broken. That's not entirely true, at least from the perspective of the candidates who get interviews: 60% of candidates report a positive interview experience.
Candidates with offers are more likely to have a positive experience (correlation 0.75). Companies that give the best candidate experiences are Salesforce, Intel, and Adobe.
The more negative an experience a candidate has, the less likely they are to accept the offer. A candidate who receives an offer after a positive experience accepts it with probability 87%. If that candidate has a negative interview experience, the yield rate drops to 1/3.
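To see how much experience matters for a company's overall yield, here's a sketch blending the two conditional rates above; the split between positive and negative experiences among offer-holders is made up for illustration:

```python
# Conditional yield rates from the data above.
yield_given_positive = 0.87
yield_given_negative = 1 / 3

# Hypothetical split of experiences among offer-holders.
share_positive, share_negative = 0.8, 0.2

# Overall yield is the experience-weighted average of the conditional yields.
overall_yield = (share_positive * yield_given_positive
                 + share_negative * yield_given_negative)
print(f"blended yield: {overall_yield:.1%}")  # → 76.3%
```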
Senior candidates are harder to please than junior candidates. This might explain the abysmal Netflix interview experience: while all other companies keep their share of senior interviews under one third, Netflix hires exclusively for senior positions.
Companies with the hardest interviews (as perceived by candidates) are Google, Airbnb, and Amazon.
When talking to people who haven't deployed ML models, I keep hearing a lot of misconceptions about ML models in production. Here are a few of them.
(1/6)
1. Deploying ML models is hard
Deploying a model for friends to play with is easy. Export the trained model, create an endpoint, build a simple app. 30 minutes.
Deploying it reliably is hard. Serving 1000s of requests with ms latency is hard. Keeping it up all the time is hard.
(2/6)
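The "easy" version above can be sketched in a few lines of stdlib Python. The model here is a hypothetical stand-in; in practice you'd load your exported model instead:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def model_predict(features):
    # Stand-in for a real model; in practice you'd load an exported model here.
    return sum(features)

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse the JSON request body, run the model, return the prediction.
        length = int(self.headers["Content-Length"])
        features = json.loads(self.rfile.read(length))["features"]
        body = json.dumps({"prediction": model_predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve the demo (fine for friends, not for production traffic):
# HTTPServer(("localhost", 8000), PredictHandler).serve_forever()
```

The gap between this and production is exactly the second tweet: this single-threaded server has no batching, no monitoring, no failover, and falls over at real traffic.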
2. You only have a few ML models in production
Booking.com and eBay have 100s of models in prod. Google has 10,000s. An app has multiple features, and each might have one or multiple models for different data slices.
You can also serve combinations of several models' outputs, like an ensemble.
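A minimal sketch of that idea, with three hypothetical stand-in models combined by simple averaging (one of many ways to ensemble):

```python
# Three stand-in "models"; in practice these would be independently
# trained models serving the same prediction task.
def model_a(x): return 0.2 * x
def model_b(x): return 0.3 * x
def model_c(x): return 0.1 * x

def ensemble(x, models=(model_a, model_b, model_c)):
    # Serve the average of the individual model outputs.
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

print(ensemble(10))  # → 2.0, the average of 2.0, 3.0, and 1.0
```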
I've been talking to a lot of people looking to join or having joined startups, and I'm flabbergasted by how often people think joining a startup is a get-rich-quick scheme. Here's the math behind why it doesn't work and what to look for when joining startups. (1/n)
Equity: anywhere from 0.001% to 10%. A friend recently joined a 15-person seed-stage startup that offered 4% over 4 years + a lot of $. He'd be the ML engineer; they need him to raise their series A. It looks good on paper, but do you want a company where you're clearly the best at what you want to learn? (2/n)
For startups with product-market fit, star founders, and top VCs (think Asana, Zoom), if you're the ~15th engineer, expect equity << 0.1% over 4 years. After subsequent rounds, it's diluted to < 0.05%. If the startup is sold for $1B, which is rare, you'd make < $0.5M over 4 years. (3/n)
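Back-of-the-envelope version of the dilution math above, taking the upper bounds from the tweet as the inputs:

```python
initial_equity = 0.001               # 0.1% grant, vesting over 4 years
after_dilution = initial_equity / 2  # diluted to ~0.05% by later rounds
exit_value = 1_000_000_000           # a (rare) $1B acquisition

payout = after_dilution * exit_value
print(f"${payout:,.0f} over 4 years")  # → $500,000 over 4 years
```

And these are the optimistic bounds: a smaller grant, heavier dilution, or a more typical exit all push the payout well below this.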
To learn how to design machine learning systems, I find it really helpful to read case studies to see how great teams deal with different deployment requirements and constraints. Here are some of my favorite case studies.
Topics covered: lifetime value, ML project workflow, feature engineering, model selection, prototyping, and moving prototypes to production. It's complete with lessons learned and a look ahead!
Netflix streams to over 117M members worldwide, half of them living outside the US. The company uses machine learning to predict network quality, detect device anomalies, and handle predictive caching. medium.com/netflix-techbl…
This thread is a compilation of 10 free online courses on machine learning that I find the most helpful. They should be taken in order.
1. Probability and Statistics by Stanford Online
This self-paced course covers basic concepts in probability and statistics, spanning four fundamental aspects of machine learning: exploratory data analysis, producing data, probability, and inference. online.stanford.edu/courses/gse-yp…
2. Linear Algebra by MIT
Hands down the best linear algebra course I’ve seen, taught by the legendary professor Gilbert Strang. ocw.mit.edu/courses/mathem…
I'm working on a book on machine learning interviews so I've been spending the last few months talking to companies about their hiring process for ML roles. This thread is a summary of what I've learned. It will be updated as the book progresses. (1/n)
The average interviewer gets very little training. You start your full-time job. You shadow a few interviews. Then you're on your own. As a result, interviews are wildly different even within the same company.
I ask interviewers to give me their most frequently asked questions, and I show them questions that other interviewers ask. I've noticed a pattern: if an interviewer doesn't know the answer to a question, they immediately flag it as bad.