It was mainly the worst of times, when a comedy of errors left me on a prolonged job search, alternating between the spring of hope and the winter of despair. But there is lemonade to be made from the almost three dozen lemons (the interview loops I went through), and this article is that lemonade. Because, while astronomical amounts of money go into recruiting, and even though we live in the age of wisdom, the interviewing process is sorely broken at the majority of companies; and so we also live in the age of foolishness.
From my experience, most Big Tech companies more or less copy each other, and because they are giant ships that are impossible to turn quickly, change comes slowly. Many startups and smaller companies then look to the giants for inspiration, out of some combination of familiarity (because they used to work in Big Tech), imitation (“we’re also Big Tech!”), and just assuming that anything Big Tech does must be good, because one hundred billion dollars can’t be wrong.
And so we end up with the typical tech interview loop:
- Sourcing: usually from a referral, but could be a recruiter reaching out after a LinkedIn search or, rarely, actually responding to an unsolicited application
- Recruiter screen: to do a sanity check and vet logistics (salary range and other expectations, like remote work)
- A general aptitude screen: coding or system design or leadership chat or some combination, depending on the role
- A panel, or “virtual on-site”: somewhere around 5 interviews scheduled all at once — but maybe not happening on the same day — with a mix of roles, from engineers to product managers to engineering managers to designers to QA
- If that went well: a final interview with a senior leader, and then an offer
Some variant of this describes almost all of the ones I’ve come across, but: the devil is absolutely in the details, and those details are the difference between a great process and a terrible one. So let’s look at two hypothetical loops at those extremes.
The Bad Loop
If you’re hiring, and elements of the bad loop sound like your company, please think about improving them. If you’re looking, and you run into a loop that hits a little too close to this one: I’m sorry.
Sourcing
The bad interview loop tends to go wrong right from the start. It’s obvious these companies don’t invest in candidate experience up front, and it rarely gets better further down the process.
There are a few ways to screw up sourcing, and the most common is just never getting back to candidates. No rejection on merit, no timeout rejection, no insulting GIF — just silence. This happens a lot.
Marginally better are companies that do reply, but after way too long. The worst of these for me was a startup that got back to my application — which was a referral — after 47 days. They set up a recruiter screen for the next day, which went well, and then an interview with the VPE for the following week. Unfortunately, that got rescheduled for the next week… and again, for the next week, and then again for the next week, and when it got to the 4th time, I decided to bow out of the process. After 84 days total.
The third common way to turn people away at this step is to send clearly automated emails or LinkedIn messages that are supposed to sound tailored to you. Like the recruiter is channeling a horoscope writer.
Hi Gabriel,
Your profile looks great. It looks like you are thriving at Hillrom, so I know it is a long shot that you’ll be interested, but hope springs eternal!
[Recruiting company] has been retained to recruit an Embedded Systems Manager at [hiring company]. This person will lead the embedded systems team comprised of electrical and software engineers while also serving as a software lead on connected care and digital health projects.
I’d welcome the opportunity to talk with you. Please let me know if you are interested to learn more. Thanks in advance and I look forward to your response.
An actual email I got. I’ve never worked with embedded systems.
In short, the sourcing is bad if they:
- Don’t value candidates, and
- Don’t value the company’s image
Recruiter Screen
This step is actually pretty hard to do badly, but when it is, the failure generally involves the recruiter being out of their depth and/or not caring about their craft. I’ve spoken to both (a) recruiters who had no idea there was a difference between Java and JavaScript, and (b) ones who did, but treated our conversation like the most annoying part of their day — like their Zoom background might as well have had a giant “I’d rather be doing literally anything else” banner. And of course, the worst is (a) and (b) combined:
Recruiter: yeah hey, I’m in a rush, but this won’t take long
Me: okay…
Them: so I saw your resume, I don’t have it in front of me right now, but can you just answer a few questions?
Me: sure
Them: ok so this is a management role, you have 5+ years of experience managing, recruiting, and mentoring a team of diverse engineers?
Me, realizing they have, in fact, not seen my resume: yeah
Them: do you have 8 years of Java?
Me: uh… [trying hard to make it past the syntax errors] … yeah
Them: do you thrive in a fast-paced environment?
Me: sure
Them: can you speak good English?
Me: I think you said the quiet part out loud.
General Aptitude Screen
After the boilerplate kinds of checks comes the first true test of the candidate’s mettle: do they actually have a notion of how to do this job, or are they a complete fraud? For engineers, this is almost always a somewhat basic (sadly, often leetcode-y) technical interview. For managers, it’s one of three things:
- A purely technical interview, because the thinking is managers should also know how to leetcode
- A leadership interview
- A bit of both
The bad versions of this are the ones that don’t test the aptitude actually required for the job. We’re hiring a staff engineer for their background in designing APIs and scalable architectures, but first: let’s make sure they can figure out the trick to finding the longest palindromic substring. Because that’s the thing they’ll constantly trip over in their day-to-day job: figuring out solutions to neat little puzzles that can be done in 20 minutes.
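To make that concrete, here’s the kind of 20-minute puzzle I mean: a minimal sketch (my own illustration, not any particular company’s question) of the usual expand-around-center solution to the longest palindromic substring problem, in Python.

```python
def longest_palindromic_substring(s: str) -> str:
    """Classic puzzle: expand around every center; O(n^2) time, O(1) extra space."""
    if not s:
        return ""

    def expand(left, right):
        # Grow the window outward while it remains a palindrome.
        while left >= 0 and right < len(s) and s[left] == s[right]:
            left -= 1
            right += 1
        return left + 1, right  # bounds of the last valid window

    start, end = 0, 1
    for i in range(len(s)):
        # Try both an odd-length center (i, i) and an even-length center (i, i + 1).
        for lo, hi in (expand(i, i), expand(i, i + 1)):
            if hi - lo > end - start:
                start, end = lo, hi
    return s[start:end]


print(longest_palindromic_substring("babad"))  # "bab" ("aba" would also be a valid answer)
```

Cute, self-contained, solvable in one sitting, and almost entirely unrelated to designing APIs or scalable architectures.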
And that’s really the crux of the problem, because this kind of exercise filters out candidates with a few traits that, in my experience, a lot of good engineers share:
- Not performing well while being watched
- Not having done algorithmic puzzles in years
- Not particularly caring to spend dozens of hours preparing for interviews
- Not having slept well the night before the interview
The arguments for leetcode interviews don’t hold water either:
- It’s fair: everyone from the new bootcamp grad to the Sr. Fellow gets the same questions. Is that fair though? It’s fair for the new grad, because they have no work experience and it’s all they can be tested on. And early in their career, they should still remember this stuff. But it’s not fair to someone with a decade of experience who (correctly) hasn’t thought about sorting anything beyond typing my_list.sort() in almost exactly as long.
- It shows motivation: the thinking is that if you’re willing to put in the hours to dust off long-forgotten skills, that means you’re a go-getter. Either that, or you have a ton of free time. Or are willing to work hard for 3 months so you can rest and vest.
- It’s a great weeding-out tool: “here at Hooli, we get far too many yahoos applying and we need a simple and consistent way to halve the middle of the funnel.” This I can’t argue with. If you have too many applicants and just need an efficient way to politely send scores of them away, while making it look like meritocracy, then this loop design does in fact meet those requirements.
Panel
The next, and sometimes final step of the process tends to be an interview panel. It’s kind of like a death panel, except it’s very much real, and only decides your professional and financial future. It’s a holdover from the days of in-person interviews, which you can tell because it’s often called a “virtual on-site”.
Back in the before times, companies used to fly people over to their headquarters, even across the country, for a gauntlet of interviews held over a full day. You’d talk to 3-7 people across ~4 interviews — because sometimes they’d double up, or you might even have several people in one interview. Like much of in-office culture, most companies have lifted and shifted this process onto the Internet, and now offer the same experience, but worse, because it’s over Zoom. Also better, though, because you don’t have to fly to every interview.
The idea of having several people interview a candidate is obviously fine. The problems here are more around how the interviews are structured, and how much freedom the candidate has to tailor this step so they can make a good impression. Because to a lot of people, if not most, this step is very stressful. Some would rather do it in one day and get it over with (even if they’d perform better otherwise), and some would rather split it into two or three days. Companies that don’t care also don’t care about candidates’ preferences here.
The more general problem though, that many companies are guilty of, is not adapting this step to the modern world. It’s not great for the company to sign up for more than ten people-hours of time (adding in the prep, write-up, and debrief time) just based on the one aptitude screen. Especially if it’s a low-signal one, like leetcode. Data is king here, but I would bet the panel pass rate at most quality companies is not that high, and so that adds up to a lot of waste all around: effort and stress for the candidate, plus time and energy for the interviewers.
The last thing to mention here is companies not investing in quality tools, especially for the technical interviews: having the candidate write code in something like Google Docs instead of a purpose-built tool, like CoderPad or HackerRank. The same goes for whiteboarding, for things like system design exercises. And using Microsoft Teams.
Final Interview
This step, mirroring the recruiter screen, tends to be more of a formality or a sales tool. By now, the candidate has passed the panel gauntlet, and this final hurdle gives good candidates the opportunity to meet with a senior leader in the organization, while also giving that leader the chance to meet promising hires before the offer goes out.
Where this is done poorly, it’s either not done at all, or it’s a very rigorous interview. Not doing one at all can be fine, but — especially in a more talent-driven job market — having this step can sway a candidate with multiple options. It’s also great, from an organizational health perspective, for the most senior leader to either be part of the interview loop, or schedule a 1:1 shortly after the candidate starts; doing both is even better.
This interview should be real, of course, with questions that give the interviewer insight into the candidate; but if it’s much more than a formality, it can put off the candidate (who has already been through the gauntlet), while also causing an internal problem if the candidate fails a hard interview after having already passed the panel. Can the more senior leader just override the panel? Because that doesn’t sound like a healthy org.
Surprise Step!
Sometimes, panels are split. Lots of soft thumbs up and down. And the job req has been open for months, so they don’t want to pass on a potentially good candidate. And so they decide to tack on another interview — or maybe repurpose the Final Interview above — in order to break the tie. This is a smell of a poor interview process. One that is not designed to produce strong yeses and nos. Or one that is, and has failed. Most well-run processes will explicitly forbid this, and for good reason: an 8th person won’t add that much value to the chorus.
The Good Loop
Now, for the fun one! Sadly, few of the interview loops I’ve experienced have been good, but the good ones provided so much signal that it would’ve been worth taking a lower offer. A lot of the anxiety of committing to a role comes from the company’s culture, and a great culture usually comes through in the interview loop.
Sourcing
Great sourcing is genuine. It still comes down to the things I said earlier, about valuing the candidate and the company’s image, but great places to work do it with care. The messages are personal, not form- or AI-generated. They show that the sourcer has actually read and understood your resume, as well as the job description. And they make a good case for why they think you’d be a good fit, and why you should choose to go through their particular interview gauntlet.
But good sourcing is hard. Doing it well obviously takes a lot more time, and therefore people, and therefore money. So it’s not a bad signal if a company is just okay at this. BUT, if the sourcing really is great, that’s a great signal: it shows the company has committed actual dollars to candidate experience, because they care about attracting the best employees into a great environment.
Recruiter Screen
A great recruiter screen gives the candidate a positive glimpse into what the rest of the process will be like. The recruiter is prepared, understands what the candidate’s resume or LinkedIn says, and has good follow-up questions. The chat is friendly but organized and the run-down is provided up-front: “first I’ll give you an overview of the company, talk a little more about the role and give you a chance to ask questions, and then talk about your experience and how it might be a good fit.”
It’s mostly pro forma, but value comes out of it: the recruiter gains more clarity than the resume provides and makes sure the logistics are lined up, and the candidate gets to ask questions about the team/org, responsibilities, tech, etc. that aren’t answered in the job description. Even if it turns out that something doesn’t line up, like salary expectations, everyone leaves feeling like it was a productive call.
General Aptitude Screen
An effective aptitude screen tests just that: how well this person is suited for this role. This means that the defining characteristics of the role have been thought through and distilled into discoverable skills, and then questions have been created which are good at discovering those skills. But an important piece of the puzzle is where the confidence that those questions illuminate the right things comes from. In most places, it’s just someone’s gut feeling.
We make social networks here at Hooli, so we need to make sure people understand Dijkstra’s algorithm. We’ll ask a question where the obvious solution is the algorithm, and pass people who either use it in the solution, or mention it, have a good reason for not using it, and still come up with a good solution.
Hypothetical, but I’m sure someone somewhere has written interview questions like this.
Some problems with the above:
- Do successful employees really need to understand Dijkstra? Is there a correlation between that and job performance? Is it important for this particular role and experience level?
- To most candidates that are familiar with Dijkstra, is it obvious that it’s the right solution here? Does the wording and the problem statement lead them there? Or is it just obvious to the people that came up with the question?
- Can the question actually be favorably solved in another way? Are we too narrowly targeting a particular way of thinking?
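For reference, the textbook answer such a question is fishing for looks roughly like the sketch below (an assumed illustration in Python, not pulled from anyone’s actual question bank):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source, for a graph given as
    {node: [(neighbor, weight), ...]} with non-negative edge weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry; a shorter path to this node was already found
        for neighbor, weight in graph.get(node, []):
            new_dist = d + weight
            if new_dist < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_dist
                heapq.heappush(heap, (new_dist, neighbor))
    return dist


print(dijkstra({"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}, "a"))
# {'a': 0, 'b': 2, 'c': 3}
```

Whether being able to reproduce that from memory, while being watched, predicts anything about building social networks is exactly what the questions above are asking.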
Good questions have been purposefully designed all the way through. The skills they target have been validated, with data, as good predictors of employee performance. The questions themselves have been vetted by trying them out on actual people, getting their feedback, and looking at false positives and false negatives. Ideally, those people aren’t just employees — though this, of course, is difficult.
The above applies to questions asked as part of the panel below. The essence here is that a great aptitude test is effective at predicting job performance.
Panel
I’m not sure what percentage of rejections in the interview process are false negatives — meaning a perfectly good employee has been turned away — but I would bet a small amount of money that it’s significant. As in, more than a quarter. And I’d also bet that a lot of that has to do with the candidate being nervous. And I’m not alone.
Chris Parnin of NCSU has a great paper in ACM on the matter, pointing out two things we all already know, intuitively:
- An estimated 40% of adults suffer from performance anxiety, and
- The typical software interview “has an uncanny resemblance to the Trier social stress test”
For those of you who haven’t clicked on that link: the stress test was created to induce changes in physiologically measurable stress markers, like heart rate and cortisol levels. The test makes those markers increase, in most people, by anywhere from 30-700%. And how does it do that? It’s a simple, three-part process:
- You have 5 minutes to prepare a 5 minute presentation for a job interview
- Ok, it’s time for the presentation, but: surprise! Your presentation materials have been taken away. Also, the judges will be silent and maintain only neutral expressions the whole time.
- Count down from 1022, by 13s. If you make a mistake, start again from 1022.
“Uncanny resemblance” indeed, right? It’s basically the same thing as a software interview, which gives the candidate a complex problem — much more so than counting down by 13s — except they don’t even get time to prepare! And the typical interviewer is cold and silent. (As an aside, please don’t be that interviewer. Read my Art of Interviewing, if you are.)
What’s more, he notes that “scientific evidence finds that women experience disproportionately more negative effects from test and performance”, and while I’m not saying that this is the only reason tech might be so heavily male-dominated, I wouldn’t be at all surprised if it turns out to be a big factor.
So what’s a company to do? The above paper has one suggestion — and I think it’s a special case of a more general solution — which is the “private interview”. That is, rather than give the candidate a problem and watch them solve it, give them the problem and go away, letting them solve it in private. When they ran this experiment, the results were undeniable:
In the public setting 61.5% of the participants failed the task compared with 36.3% in the private setting. […]
Interestingly, a post-hoc analysis revealed that in the public setting, no women (n = 5) successfully solved their task; however, in the private setting, all women (n = 4) successfully solved their task—even providing the most optimal solution in two cases. This may suggest that asking an interview candidate to publicly solve a problem and think-aloud can degrade problem-solving ability.
from “Does Stress Impact Technical Interview Performance?”, published by the Association for Computing Machinery
The private interview is a great idea, and an improvement on the “take home” challenge, which suffers from the fact that some people will spend an hour on it, and others will spend ten, with two friends helping. But, there are problems with it. One that the paper mentions is that some people were frustrated by not having access to the proctor, to ask clarifying questions and be kept from drifting. So it’s not a one-size-fits-all solution.
But what’s the philosophy behind the private interview? That most people get too stressed out by a stress test (sorry, an interview), and doing it in private helps. Most people, though: not all people. And that, I think, is the key insight: what the paper calls “provide accessible alternatives”.
Great companies will tailor their process to help the candidate succeed. They will allow the panel to be split up over as many days as there are interviews. They will allow the candidate to pick the programming language. They will have a “live” option and a “take home” option — I have yet to see the “private interview” option, though it’s by far my favorite. They will train their interviewers to be kind, to be expressive, and to focus on the person, not the question.
In short, as with all good science, they will make sure they have adjusted the process to eliminate as many confounding factors as they can, in order to minimize false negatives and false positives, and give everyone a fair chance.
Final Interview
There’s not much left to say about the final interview other than that a good leader will leverage it to motivate a promising candidate. So instead, I’ll summarize the two loops:
- bad: disinterested people going through the motions of a process that will usually produce some result
- good: thoughtful, kind, well-trained people assessing how well this candidate might fit into this role