The science of interviewing developers
I’m just about out of patience with LinkedIn posts where a freshly-minted tech CEO says “instead of a week-long interview process and several assessments, I just have a conversation with the candidate. I can tell in about five minutes if they’re right for the job.”
No, they can’t. Although shorter interview processes are a good idea (excessively long ones will scare away the most sought-after candidates), the “conversation over coffee” interview approach is no better than choosing candidates at random. And that’s pretty bad.
What we’re witnessing from LinkedIn CEOs is not extraordinary intuition or social skills. It’s the Dunning-Kruger effect: people tend to overestimate their own competence at a task, and the worse they are at it the more they overestimate. Interviewing is one of the most well-studied examples. There’s not a single person in the world who can assess someone’s job skills in an unscripted five-minute conversation—or, for that matter, a much longer one. Unscripted conversations aren’t interviewing; they’re speed dating. Yet employers stubbornly believe they’re the most effective way to select high-performing candidates.
Study after study after study has proven them wrong. Unstructured interviews increase multiple types of bias, open the door to interviewer idiosyncrasies, and reduce hiring accuracy by over half. They’re worse than useless, and not just by a little. And the impact of hiring mistakes is massive: the wrong candidate can cost up to a quarter of a million dollars and cause significant delays in the company’s projects.
Programming and evidence of ability
For programmers, bad interviews are especially frustrating. Programming is a task that produces self-evident results. You can’t assess the build quality of a bridge or the preparation of a meal just by looking at it, but with code there’s nothing under the surface: what you see is all there is. You can demonstrate your ability to write good code in real-time without any complex setup or tangible risk. Yet companies go out of their way to either avoid such a demonstration or ask for one that’s completely tangential to the job they’re hiring for.
Most of us have been there and done that. You’re interviewing for your first programming job and they want to know what element on the Periodic Table best describes your personality. You’re an experienced front-end developer but the interview is an hour-long “gotcha” quiz on closures and hoisting. You’ve built several .NET applications from scratch but they want you to invert a binary tree. You’ve contributed to the Linux kernel but now you have to guess the number of ping-pong balls that would fit in the interview room. The whole thing has been memed to death.
This is especially bewildering in the context of the job market. Programmers are worth their weight in gold. Salaries are skyrocketing throughout the industry. Remote work has become table stakes. Tech startups are offering four-day work weeks to make themselves more appealing. Recruiters are second only to car warranty scammers in their efforts to reach us. But it all goes to waste if the candidate who gets hired isn’t able to do the job. So why aren’t companies investing more at the tail end of their hiring process?
An optimized interview process
Let’s talk about what an ideal coding interview could look like. If someone’s job is to write Java, the process might go like this:
- Disclose the salary range and benefits in the very first contact.
- Review their resume ahead of time for professional or open source experience writing Java (any other programming language would be relevant; Groovy and Kotlin would be equivalent experience).
- Watch them write Java for an hour or so in the most realistic environment possible. Grade them on predetermined, job-relevant attributes, e.g. problem-solving, null safety, error handling, readability, naming conventions, and encapsulation.
- Give a scripted, completely standardized assessment to ensure they understand the JVM and basic command-line tools.
- Let them know when to expect a followup call.
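To make the coding hour concrete, here is a sketch of what such a standardized exercise could look like. The task and names below are invented for illustration, not taken from any real assessment; the idea is simply that every candidate implements the same fully specified method, and graders score the same rubric for everyone:

```java
import java.util.Objects;

// Hypothetical exercise: sum a comma-separated list of integers.
// Every candidate gets this same task, graded on the same rubric:
// null safety, error handling, naming, and readability.
public class CsvSum {
    public static int sumCsv(String input) {
        Objects.requireNonNull(input, "input must not be null"); // null safety
        String trimmed = input.trim();
        if (trimmed.isEmpty()) {
            throw new IllegalArgumentException("input must not be blank"); // error handling
        }
        int sum = 0;
        for (String part : trimmed.split(",")) {
            try {
                sum += Integer.parseInt(part.trim());
            } catch (NumberFormatException e) {
                throw new IllegalArgumentException("malformed entry: " + part.trim(), e);
            }
        }
        return sum;
    }
}
```

The specific task matters far less than the consistency: because everyone attempts the same problem against the same criteria, candidates can be compared on the same metrics.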
That’s it. No unscripted “culture interview,” no free-for-all “group quiz,” no “compiler optimization test,” no asking the candidate to build a 15-hour project from scratch in their spare time. You assess their job-specific skills and you’re done.
It’s not that I think coding ability (or any one skill) is the only important thing about a candidate. Of course you want to hire people who are reliable, smart, honest, and kind. But you have to restrict yourself to what is knowable. And in the context of a job interview, there isn’t much. Most candidates act fake in job interviews and a significant percentage of them lie. Even putting honesty aside, 9 out of 10 people report having interview anxiety. If you want to know how someone behaves at work, you simply can’t rely on the interview. That’s not the real them. Obviously if you see any major red flags—being rude to the receptionist, say—you can throw out their resume. But those are rare.
That’s part of the inherent risk of hiring: interviews just aren’t as powerful as we’d like them to be. Many employers try valiantly to devise interviewing methods that predict soft skills and personality traits. But anything outside the limits of practical, measurable work is usually an illusion, albeit a compelling one—a 2013 study found that interviewers form highly confident impressions of candidates even when those candidates’ responses are randomly selected! People tend to trust their first impressions regardless of how accurate or inaccurate they are, which leads to mistakes in the hiring process. Any time you ask humans to intuit facts about other humans, you decrease the accuracy of your process. The only thing you can measure with any accuracy at all is task competence.
Methods for testing coding ability
So, if it’s the best we can do, what’s the ideal way to measure programming skills? Several commercial products offer a way to test candidates with pre-written, auto-graded tasks in a code playground. Although they’re likely much better than the norm, I’m not here to evangelize any of them. Why would you pay for another company’s general-purpose assessment when your entire development team does specific, real-world “assessments” all day long? The last time I was asked to devise a coding test for an interview I looked over my recent work, picked a section of code that went well beyond “glue code” but didn’t get into anything proprietary, tweaked it so it would stand on its own, and copied the function signature into LINQPad (a code playground). The interviewee’s task was to implement the function according to nearly the same requirements I’d needed to meet. And when they were finished, I knew if they could do the job because they had just done it. It’s hard to get a more realistic assessment than that.
On the other side of the table, by far the best interview process I’ve ever gone through was with a startup who put me on a one-week contract (an “audition” as some companies call it) to work with their team. They assigned me a task; I cloned their repo, wrote code over the course of a couple evenings, and submitted a PR; the rest of the team gave feedback; I did some more work to bring the PR up to their standards; and they merged it and paid me at my regular rate. Even though I didn’t ultimately accept their job offer, everyone walked away happy—they got some extra work done and I got a paycheck. If I had accepted, there would have been no uncertainty about my skills or ability to work with the team. That kind of organizational competence is hard to come by. Years later, they’re still on my shortlist if I’m ever on the hunt for a job.
Maybe that doesn’t fit your definition of “interview.” I say it’s something much better than an interview. Can you imagine trading that experience for 15 minutes shooting the breeze with their CEO? It’s a ridiculous proposition. Their CEO doesn’t even code.
For security or policy reasons, some teams may not be able to offer such a highly practical interview process. In those situations, consistency is a close second to realism. If you can’t make the interview look like the job, at least make one interview look like the next—give yourself the ability to compare candidates on the same metrics by following the same script every time.
Helping every candidate be at their best
A consistent, scientific interview process doesn’t have to be completely rigid. You may actually hurt yourself by trying to standardize things that aren’t relevant to what you’re measuring; every candidate has different needs. For example, immunocompromised candidates need to have the option to interview remotely to protect their health. Candidates with physical or neurological conditions may fare better if the interview is split into short segments with frequent breaks rather than a three-hour marathon. Some candidates with interview anxiety may appreciate a moment or two before the interview to establish a friendly rapport with the interviewer; for others, small talk is difficult and they’d prefer to skip that part. You should always be willing to adjust the environment or cadence of the interview to the candidate’s needs. Accommodations like these are a win-win because they allow each candidate the best opportunity to demonstrate their skills. If you want to hire the best candidate, you need to see the best of each candidate.
Also, interviews are a two-way street. You’re being measured just as surely as the candidate is. Neglecting to answer their questions because “it isn’t in the script” would be a mistake.
How do you reconcile your need for objective, comparable data with your candidates’ needs for reasonable accommodation and reciprocity? The simplest answer comes from the robustness principle, a famous rule of software design: “be conservative in what you do, be liberal in what you accept from others.” To apply it here, you should assess candidates conservatively (narrowly) on the parts of the interview that are standardized and focused on relevant skills—but you should be liberal (flexible) about whatever else they may need. You’ll still get the data you want and your candidate pool won’t be restricted or misrepresented by unnecessary rules.
Structured, realistic, job-focused interviews are the gold standard according to several decades of research. So when you decide how much of the process should be unplanned or how much of the evaluation criteria should be unquantifiable, the only question you should ask yourself is “how often do I want to select the wrong candidate?”
Of course, sometimes the wrong candidate will slip through. That’s not totally preventable. The best interviewing methods are only mostly accurate. But that shouldn’t scare you into an intuition-based interview process any more than the occasional computer failure scares you into doing everything on paper.
Hiring wisely is perhaps the most powerful advantage a company can have over its competitors, and the best way to do it is to focus on what you can measure.
I appreciate your description of an “optimized interview process”, especially the first step, regarding full disclosure of salary ranges and benefits. That cuts down on interview stress right away.
The part about “watch them write for an hour or so” strikes me as impractical for most organizations. It sounds good, but I have never seen it done effectively; in fact, I have only seen it attempted a few times. The interviewer usually does not have the time and may not have the expertise to assess the results.
Giving a “scripted, completely standardized assessment (test)” sounds plausible. I have seen that done effectively and more than a few times. It is still uncommon.
The traditional 90-day probationary hire or 30-day contract-hire is the most effective and reliable method I have seen in practice. A good candidate is obvious, and a bad candidate does not stay around long enough to do serious damage. This method is very uncommon.
The horrible truth is that once you eliminate the obvious crazies and misfits, you still cannot reliably tell who can do the job before you see them doing it. It is basically a roll of the dice at hiring time.
And some managers cannot tell who can do the job, even after they see them doing it. But that is a different problem.
The main thing that confuses me about your comment is the implication that the person performing the interview is someone who doesn’t code. Is that a thing? I’ve never seen it happen. Of course a non-coder can’t assess a coder. If HR professionals are running competency interviews for programmers, that’s gonna be an issue.
Regardless of what’s common or uncommon, it’s a competitive advantage to do things right.
Most of the jobs I’ve had (including my most recent) involved interviews where only management and HR were present: people who do not code for their work and may not have coded in 20+ years. So far I have only had to code or answer technical questions once in an interview.
I would consider them pretty great jobs though. I actually never had a job where developer skill was the problem; it’s always been poor product management, poor project management, short-term thinking in general, or organizational issues (higher-ups at the company didn’t understand projects, etc.). We’ve always found a way through any technical problem (within reason, we haven’t “cured cancer” yet).
In one project that I inherited there was a lot of buggy code with very bad UX practices; rumor has it that the dev was under an extreme time crunch, so I don’t know if you can say that was a “skill” problem. It is easy for you, me, and his coworkers to say “X was not very good” (and the remaining coworker did), but the reality is that an extremely complicated product was made very fast, sold, and helped the customer. The user experience was hokey and pretty much a minefield of bugs, but once you set it up it was rather reliable.
I went through and improved the UX, fixed the bugs, added new features. Our product made money, and beat out our competition. We all pushed everyone we could locally to “sell” the product to the higher ups.
We could not break through the silo, because the person we needed “on our side” bought a competitor’s product that they did not understand. They were 100% sure our work was unnecessary now because everyone could just use that thing they bought. But that person was 1,000 miles away from our customers, and they would not listen to us that the thing they bought did not meet our customers’ needs. The higher-up needed to make the case that the thing they bought “solves every problem,” because that was the only thing that justified the massive amount of money they spent on it.
Now the project code collects dust, our customers went back to buying the competition (because the purchased product was not viable, as predicted), I moved on, and it’s a good thing too, because that whole team was removed soon after I left.
Re: “interviews where only management and HR were present”:
– Prepare a secure laptop for the candidate
– Prepare a (pool of) coding task(s) for the candidate. Consider explaining not just the task, but how it will be graded
– Allow ~1 hour for the candidate to code
– Get one of your expert developers to review the candidate’s code. It’s a good idea to prepare a scoring scale in advance
– If you’re kind, explain the score to the candidate
– Set up a CI/CD pipeline running unit tests, code quality tools, etc (directly on the laptop)
– Gamify it: “Come pass our interview tests” contest. Run your developers through this first!
Requires an initial investment, and some maintenance – but it will improve interview quality and reduce your TCO. Avoiding just one “bad” developer hire will cover your costs!
Guess we should tell Warren Buffett to stop his unscripted interviews then. Perhaps this post itself suffers from the phenomenon that it describes.
The move towards a more pedantic structure is only going to make it clear that bureaucracy is a driving force at the company. Often this implies a leadership structure from a financial background, as opposed to a science background. Having an accountant run a tech company is a recipe for disaster.
If Warren Buffett ignores decades of research and hires based on his gut (which I don’t necessarily believe, but benefit of the doubt on that one), that’s not evidence against the research; it’s cargo culting.
Structured interviews don’t happen primarily or only at financial firms, so I’m not at all certain how that’s relevant.
Warren Buffett does not really run a financial firm, though. He isn’t hiring CPAs personally, or onboarding new futures-market analysts. He is often purchasing entire companies. In that process, he needs to assess at a very high level which, if any, of the C-suite employees he should retain. As a result, those employees need to interview for their own jobs.
During this process, he uses an in-person interview which has no defined structure, as he is not simply hiring for the same position every time. To note, regardless of which industry it is, the research cited in this article is not specific to developers, so there is no reason to discount the style that Warren Buffett uses simply because he is not *always* hiring developers (he does purchase tech firms).
What Buffett is after is mentality and commitment. This actually falls firmly within the behavioral analysis that is cited in some of these studies as being critical to successfully predicting candidate strength. Just because Warren Buffett uses an unstructured interview to assess the quality of a candidate’s behavior, or what one study calls GMA (General Mental Ability), that does not mean that the unstructured interview is the issue.
What is integral to interviews actually has nothing to do with their predefined structure and everything to do with the qualities being analyzed. That certain people use different styles to get to the analysis shouldn’t be demonized, especially considering the most successful man in the industry at hiring CEOs uses what you unfairly call “speed dating.”
As an accountant, I couldn’t agree more.
I like it, but I still wouldn’t pass up the “culture” part. That part of the interview is also meant for the interviewee: here’s a couple of team members, chat it up for a bit, you will be working together for the foreseeable future if all else goes well so here’s your chance to ask questions of us, too. “Are you happy working here? How often do you work overtime or weekends?” Maybe a lot less important if you’re working remotely but if it’s a job over some expensive proprietary hardware that only exists at the office and must be shared then you’ll be seeing a lot of these people and “work culture” is a real thing.
I did mention in the article that it’s important to allow the interviewee to ask questions of their own. I suppose a bit of chatting is fine, so long as it’s not part of the evaluation criteria. People (whether they realize it or not) overwhelmingly decide their first impression of a person based on things like “looks like me,” “talks like me,” “fits my stereotype of the role,” and so on—it just gets worse from there. The idea of cultural fit is mostly a smokescreen for hidden biases.
That said, candidates may request an unsupervised conversation with a rank-and-file employee, someone who has no say in the hiring decision—I’ve certainly done that in order to ask the exact questions you’ve listed. And I think it’s more than reasonable to accommodate that request.
I can’t imagine you’ve had to fire someone. People get hired because they can (ostensibly) do the job – that’s what your approach (which is good, but not sufficient on its own) provides. But they get fired, 9 times out of 10, for their personality. I’ve had to fire people who were good at their job; intelligent, bold, well-intentioned – but just impossible to work with (and that was their co-workers’ words, not mine).
Best engineering manager I’ve seen gets one of his engineers to take the candidate through the paces (as you’ve described) but then he looks for “the vibe”. Doesn’t matter how competent the person is, if they don’t have the vibe, they don’t get the job. As a result, he has a great team who support each other and enjoy working together (and the rest of the company enjoys working with them too).
You’re right, a quick 5 minutes isn’t enough to see if the person CAN do the job – but it’s enough to know you don’t want to work with them, and I think that is what the CEOs you’re quoting are (mostly) meaning.
What’s the purpose of commenting here? The author of the article is just going to push back. This comment of course is going to get deleted, but hopefully the author gets the message.
The purpose of commenting is to give your (hopefully interesting) opinion on what the author is saying in their article. The author has every right to push back if they disagree with your opinion, that is what is called a discussion.
Maybe you could try posting something useful instead of playing the whiny victim.
“For security or policy reasons, some teams may not be able to offer such a highly practical interview process. In those situations, consistency is a close second to realism. If you can’t make the interview look like the job, at least make one interview look like the next—give yourself the ability to compare candidates on the same metrics by following the same script every time.”
There’s some validity to this, but the problem with it is that if every interview is conducted in exactly the same way, it gives the interviewers very little opportunity to follow up on answers a candidate might give, or to pursue oddities they notice in the data about the candidate. I recently interviewed for a position where this advice was very clearly and exactly followed. In a sane interview process, the interviewers would have read my resume, noticed I didn’t spend very much time in my previous position, and asked me about it. In this process, it didn’t even come up. Of course I had a completely legit answer for the question, but my point is a sane interview process would have asked me, and this one didn’t.
As practical as possible is best. I’ve read a lot of strange theories on hiring, general advice like “we may have made a biased system, even a horrifically biased system, but we would prefer to rule some good people out than think about how somebody actually gets work done [we’ve got a whole lineup of good people ready to go, next].”
Honestly I’m quite happy when companies post blogs like that, makes it so I don’t have to waste my time applying to those places.
Big corporates have become hugely risk averse and it’s near impossible to directly hire anyone.
First you have to go through HR, second candidates are all funneled through recruiters, third there will be third parties doing expensive background checks that slow the whole process down even if you have an urgent need.
By the time the new hire finally arrives the team may have moved on or been subject to some random restructure.
Mind you my experience is in the City of London at a global bank so the craziness was “normal”.
I don’t think there’s any getting by the randomness of it. The mistake I think most organizations make is that they are so afraid to hire the wrong person (because it’s so emotionally painful to get rid of the wrong person) that they just keep interviewing and interviewing and interviewing …
If you can’t find somebody you already know can do the job, you’re just going to have to take a risk on somebody you don’t know and see how they do.
As a former HR manager of 10+ years and developer of 18+ years, I would say both HR and software engineering have a tendency toward fads whose supporters claim all sorts of benefits but which often fade out of popularity as the next fad washes in. These days candidates are pre-screened by algorithms which search resumes for keywords rather than by a skilled person. So right off the bat you are actually screening for a candidate’s ability to pack their resume with the right keywords rather than their ability to do the job. Then, as has been pointed out, the actual interview process can be anywhere on the map in terms of what the candidate is asked or asked to do. Again, you are selecting by how well a candidate fits the mold of your interviewing style. I have known several extremely talented developers who would interview poorly but perform exceptionally. To me the bottom line is this: you cannot know how well a person will perform until you give them an opportunity to show you. It is a process of trial and error.
A pretty good article overall, but with one serious factual error:
> You can’t assess the build quality of a bridge or the preparation of a meal just by looking at it, but with code there’s nothing under the surface: what you see is all there is.
50 years ago, when everything was written in assembly language, this was true. Today there’s *tons* of complexity hidden under the surface of modern-day code, to the point where it’s becoming not only a drag on competence (see Joel’s famous article “The Law of Leaky Abstractions”) but also a security concern (supply-chain attacks!)
You’re on the right track, but you didn’t take it quite far enough.
Sitting in a random place, when dealing with interview anxiety, put on a strange computer, usually in an unfamiliar environment (LINQPad? Never heard of it.), exactly how are you expecting someone to be able to code in a reasonable way? That has *nothing* to do with how you’re going to code in real life in the organisation, and I don’t care that you started with code you’d written.
I agree that the coding test solutions are rubbish, but not because they’re “generic”, but because they all try to predict using a completely alien situation.
Here’s my counter-proposal:
1) Have standardised behavioural questions, asked of all candidates. Compare who did what in their stories, what the impact was, and what lessons they learned. The intent is to eliminate bias as much as possible.
Their stories have to relate in some degree to the role you want to hire them for.
2) Have them bring their own code, and go over it in a code-review. This checks if they can deal with criticism of their code, and how well they can communicate the thought process they went through when coding it.
3) If they shine the best of all candidates after that, start their probation.
Oh, one other thing: “probation” needs to really mean probation, with a pre-designed evaluation at the end to see whether they have accomplished the requirements of the role in the 1-, 2-, or 3-month probation period.
Engineers are brilliant (of course, I’d say that, being an engineer). However, most engineers don’t understand just how bad we are at interviewing, evaluating a candidate, or determining future success of that candidate in an organisation.
The candidate might be brilliant on paper, but not be able to speak at the interview. They might *ace* the interview, but fight like cats and dogs with their manager. They could be fantastic at programming by themselves, but be toxic in a team setting. Your method simply isn’t going to discover any of that.
Quit wasting time, get people in a chair in the office, and see how it works out. Be willing to say “you’re not for us,” or to hear that you’re not for the candidate.
Exactly my sentiments. That approach is just as fraught with issues as the others. It is slightly better than whiteboard coding in some ways, but much worse in some others. Throw me into an unfamiliar IDE and only give me an hour? I’m probably going to struggle. Throw another candidate in with Vim and Bash, where I would shine, and they will struggle — and heck, without my Vim configuration, I’m going to stumble a lot trying to use my custom key bindings.
Moreover, it filters out candidates I don’t want to filter out. I don’t care if a candidate can write LINQ and C# in Visual Studio off-hand if they have 10 years of C++ expertise on Linux. I’m going to assume they can figure that part out, because *coding* in our immediate environment isn’t what makes them a good hire. In fact, if they couldn’t pivot into a new environment and execute, I’d consider them a bad hire! But that means I can’t use this sort of test unless I let them choose the parameters and/or go with a whiteboard test and accept a little pseudocode.
Frankly, I think the vast majority of coding test interviews are equally broken. I think people are just *generally* bad at interviewing, including me a lot of the time, and I’m considered a good interviewer for software candidates. No, I can’t figure out if someone is a crack coder or writes sloppy code in a chat-style interview. But what I can figure out is whether or not they actually contributed to any of the projects on their resume. If someone is technical enough and a good enough liar to convince someone with an MS and 15 years of experience that they know what the heck they are talking about when it comes to the technical aspects of their last project, then they’re either fairly technical or a hell of an actor, and odds are on the former. At some point, we just have to let a little trust into the process. I can also find out if they blame everyone but themselves about their past failures, or if they are an insufferable jerk.
As someone with actual ability, I can also suss out over-claiming of ability level. If you claim to be an expert in Java, C++, or Python, I’m going to find out if you really are or just think you are. For example, if you say you are an **expert** in Java, you had better be able to answer something as basic as why you can overload foo(int) and foo(double), but not foo(List&lt;Integer&gt;) and foo(List&lt;Double&gt;) (the answer is that Java generics are implemented via type erasure — I learned this within 6 weeks of my first real job using Java). You had better be aware of RAII and some things that were added in modern C++ (11/14/etc.) or the Python GIL. These are not trivia like the “Diamond of Death” multiple inheritance problem, they are central ideas to the languages.
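The erasure behind that overload question can be seen directly at runtime. A minimal sketch (illustrative, not an interview answer key): the type parameters exist only for the compiler, so a List&lt;Integer&gt; and a List&lt;Double&gt; share a single runtime class, which is exactly why the two foo overloads would collapse into the same signature.

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<>();
        List<Double> doubles = new ArrayList<>();
        // The <Integer> and <Double> parameters are erased at compile time;
        // at runtime both objects are plain ArrayLists of the same class.
        System.out.println(ints.getClass() == doubles.getClass()); // prints: true
        // Consequently, foo(List<Integer>) and foo(List<Double>) would both
        // erase to foo(List); declaring both in one class is rejected by
        // javac with a "have the same erasure" error.
    }
}
```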
I’m not trying to hire the “perfect” candidate. I’m just trying to filter out the definite bad-fit candidates, so we can move on to the probationary period. I’m also trying to give them an opportunity to find out if *we* are a good fit for what they want to do. I probably spend more time talking about us than grilling the candidate on anything, and I often start out by letting them ask as many questions as they want. This is why Isaac’s example of the best process he ever experienced is actually the best bit of this article — it was basically a super-short probation period for *both* sides! That’s really the only way to know for sure, and it doesn’t filter on the wrong things.
I’ll give you yet another reason that both styles are bad: entry-level candidates. Hiring entry level is the absolute hardest. They have no professional experiences to talk about, except maybe internships in some cases. So all I can try to gauge is whether they absorbed anything in school, i.e. ask some basic theory questions and see if they can carry on a conversation about it. I’m not looking for perfection, just recognition of the ideas. I often ask them what their favorite class in the field was, then ask them some trivia in that bit of the field that is central enough to be taught in every college course on the subject. You picked language theory as your favorite because you just passed it with an ‘A’? Well, that’s on you, man; I hope you really did absorb that stuff about automata and state machines — e.g. why can’t regular expressions deal with matching parentheses? If not, you learned something in the interview, like not claiming language theory as your favorite subject if it’s not.
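The parentheses question has a tidy answer: matching nested parentheses requires counting to arbitrary depth, and a finite automaton (and hence a classic regular expression) has no unbounded memory, while a single counter suffices. A sketch of the non-regex check (illustrative code; the class name is invented here):

```java
public class ParenCheck {
    // A balanced-parentheses check needs a counter (the degenerate case of a
    // pushdown automaton's stack). A regular expression, being equivalent to
    // a finite automaton, cannot track arbitrary nesting depth, so no regex
    // decides this language for all inputs.
    public static boolean balanced(String s) {
        int depth = 0;
        for (char c : s.toCharArray()) {
            if (c == '(') depth++;
            else if (c == ')' && --depth < 0) return false; // closed too early
        }
        return depth == 0; // every open paren must have been closed
    }
}
```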
On the other hand, if I’m trying to hire for a very specific need, the right candidate will stick out like a neon sign. Hopefully, our job posting was specific enough to attract that FORTRAN 77 expert to program the Cray supercomputer or the .NET technical lead to take over a 15-year-old legacy C# project and port it to Windows for ARM. If the perfect candidate for that doesn’t surface, and we don’t have someone internally to grow into it, we’re still going to be looking for someone with an awfully impressive resume to start with. It costs a lot more to not have anyone in the role than to bring the wrong person in for a couple months to find out they aren’t the one.
The irony that my Double and Int types were erased from the foo(List) example is so apropos! I guess the SO blog comments don’t like angle brackets.
I recommend reading Daniel Kahneman’s book “Thinking, Fast and Slow”; he deals with the human-resources selection problem in the army and in companies.
He found some effective, though not perfect, ways to select people. He tells very interesting things in his book, and you can extrapolate from it to recruiting engineers and programmers.
“On the other side of the table, by far the best interview process I’ve ever gone through was with a startup who put me on a one-week contract (an “audition” as some companies call it) to work with their team.”
The company needs to take the risk when hiring. Apart from some basic personal interviews (more psychological), I wouldn’t use those rigid technical assessment tests and questions. Put the dev to work coding and let their teammates decide.
For a candidate with less than 5 or 6 years of experience, I guess this makes sense. But for someone with more than 10 years of experience and a respectable record of working for reputable companies, I don’t think a coding assessment is warranted. I prefer discussing architectural preferences and gauging whether the candidate has kept their skills up to date by asking when they adopted or researched various technologies/trends. Samples of recent work can help too.
Given the title, this article was pretty light on science.
Big Tech companies actually have science teams running experiments and optimizing their interview process. When hiring at such scale, the stakes are high, so they try to improve. Granted, the process is still very conservative and improvements are incremental, so they get stuck in a local optimum. It’s hard to convince leadership to radically overhaul the hiring process and try something new.
Talking of hiring at scale, the “trial interview” format sounds great, but it’s impractical for anything other than a startup. If 1,600 people apply for an opening on my team, how do I select the one who gets to come and try? Congrats, you’re back at step 1.
The “give candidates an actual piece of your code” advice is also common. I tried it, but you usually need several passes to make the code workable. Once you strip out the tech-specific and domain-specific details, the dependencies on other parts of the system, etc., you end up with a generic coding exercise not that far from a leetcode problem.
After 400+ developer interviews, my favorite way to judge candidates is by asking a “clean code” question. This doesn’t refer to Uncle Bob’s *Clean Code*, but to code that is simple, maintainable, and extensible. I give candidates a simple and slightly ambiguous problem statement, usually revolving around “write a library that does X”. I expect candidates to ask questions and clarify the ambiguities, then proceed to define the APIs, and finally write the code. The implementation is straightforward, with no tricks or logical puzzles, and requires only simple structures such as lists, hashmaps, and loops. Then I ask one or two follow-up questions with more requirements, such that they need to modify or extend their code. Depending on how their solution is organized, this might be trivial or very complicated.
I feel this format is the closest to on-the-job work and gives me a good sense of what it would be like to work with these people. It also offers a lot of freedom and lets me peek inside the candidate’s mindset. How do they deal with ambiguity? How do they approach API design? How do they handle incorrect values? Do they care about corner cases? It is also mostly devoid of what developers hate most, e.g. trick questions and obscure algorithms.
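As a rough illustration of what such a “clean code” exercise and its follow-up might look like (the problem statement, class names, and eviction rule here are hypothetical, invented for this sketch, not from the commenter): the first pass answers “write a small key-value cache library”, and the follow-up adds a new requirement that a clean design absorbs as an extension rather than a rewrite.

```python
class Cache:
    """First pass: a minimal key-value cache behind a small, explicit API.
    Only simple structures (a dict), no tricks."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)


class BoundedCache(Cache):
    """Follow-up requirement: 'now limit the cache to N entries.'
    Because the first design kept state and API separate, this is an
    extension (evict the oldest entry on overflow), not a rewrite --
    which is exactly what the follow-up question probes."""

    def __init__(self, max_size):
        super().__init__()
        self._max_size = max_size
        self._order = []  # insertion order; a plain list is enough here

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self._max_size:
            oldest = self._order.pop(0)   # evict the oldest key
            del self._data[oldest]
        if key not in self._data:
            self._order.append(key)
        super().put(key, value)


c = BoundedCache(max_size=2)
c.put("a", 1)
c.put("b", 2)
c.put("c", 3)          # evicts "a"
print(c.get("a"))      # None
print(c.get("c"))      # 3
```

The interesting signal isn’t the dict manipulation; it’s whether the candidate asks “evict by what policy?” before coding, and whether their first-pass structure survives the second requirement.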