People often suggest that interviewing.io should create a certification that our users can post on their LinkedIn profile, e.g., something like “Top 10% performer on interviewing.io”. Presumably, these certifications would signal to recruiters that this person is a good engineer and worth reaching out to and should carry more signal than where said person went to school or worked previously.
I think certifications are a terrible idea, and I’ve resisted building them. Simply put, the incentives they create for engineers and recruiters are all wrong. To explain what I mean, let’s split engineers into two distinct personas:
The engineer who looks good on paper. This person will simply not list the certification on their profile – they have no reason to! They’re already getting contacted by at least 10 recruiters a day. Chances are, they’re actively ignoring LinkedIn unless they’ve decided they’re looking, at which point they might respond to a handful of the hundreds of InMails they’ve already gotten, just to give themselves more optionality during their job search.
The engineer who doesn’t look good on paper. This person will likely add a certification to their profile. It’s rational – anything that helps them stand out and legitimize their profile, in the absence of traditional pedigree, is a win. That sounds great, right? Unfortunately, here’s the rub. Unless your certification is well-established and respected by recruiters, they will not take it seriously… because it runs counter to everything they’ve been tuned to look for. If your profile doesn’t have a reputable school or a top company on it, an unknown certification won’t save you, and over time, listing it will do harm. Because the certification will almost always go hand in hand with a lack of pedigree (as you saw above, people who look good on paper have no reason to list it on their profiles), recruiters will start to develop negative associations with it. And because most recruiters will not engage with these candidates long enough to find out whether they’re good, this will happen even if the certification actually carries a positive signal. In other words, recruiters will learn, over time, to treat the presence of the certification as a “no-go” because it will only appear on profiles they’ve already been trained not to reach out to.
That’s the theory of why certifications are bad. They’re bad for the individuals listing them, and they’re bad for the industry as a whole because, ironically, they make it harder to find good candidates. But what happens when you look at the data?
Engineers use interviewing.io for anonymous mock interviews. If things go well, they can skip straight to technical interviews at real companies (those are fully anonymous, too). We started interviewing.io because resumes suck and because we believe that anyone, regardless of how they look on paper, should have the opportunity to prove their mettle.
At this point, we’ve hosted over 100k technical interviews, split between mocks and real ones.
Regardless of the interview type, when an interviewer and an interviewee match on our platform, they join a collaborative coding environment with voice, text chat, and a whiteboard, and jump right into a technical interview. After each interview, both parties leave feedback, and once they’ve both submitted, each one can see what the other person said and how they were rated.
Here’s what the feedback form that interviewers fill out looks like:
In this post, we aggregated scores from these interviews for each interviewee and then cross-referenced how many certifications they listed on their LinkedIn profiles.
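Concretely, that aggregation step looks something like the sketch below. All of the records, names, and counts here are hypothetical stand-ins, not our real schema or data:

```python
from collections import defaultdict

# Hypothetical records: (interviewee_id, passed) per interview, plus a
# lookup of how many certifications each interviewee lists on LinkedIn.
interviews = [
    ("alice", True), ("alice", False), ("alice", True),
    ("bob", False), ("bob", False),
]
certifications = {"alice": 0, "bob": 2}

def pass_rates_by_cert_status(interviews, certifications):
    """Aggregate per-interviewee pass rates, then split interviewees by
    whether they list any certification."""
    outcomes = defaultdict(list)
    for person, passed in interviews:
        outcomes[person].append(passed)
    buckets = {"certified": [], "uncertified": []}
    for person, results in outcomes.items():
        rate = sum(results) / len(results)
        key = "certified" if certifications.get(person, 0) > 0 else "uncertified"
        buckets[key].append(rate)
    # Average the per-person pass rates within each bucket.
    return {k: sum(v) / len(v) if v else None for k, v in buckets.items()}

print(pass_rates_by_cert_status(interviews, certifications))
```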
You might say that an engineer’s performance in interviews on our platform isn’t the canonical source of truth for their engineering ability, and you’d certainly be right. In the absence of holistic performance review data about our users, which is pretty much impossible to get, we decided running this study was still worthwhile. For what it’s worth, I have pretty high conviction that performance in interviews on our platform correlates strongly with performance in real-world interviews – our candidates pass real interviews at roughly 3X the rate of candidates from other sources. Fully closing that loop with on-the-job performance data is the holy grail of any recruiting enterprise, and I hope we can do it one day.
Caveats aside, before doing the analysis, our hypothesis was that having one or more certifications on your profile would have a strong negative correlation with your interview performance. Why?
We’ve seen in the past that though having attended a top school doesn’t correlate with interview performance, having worked at a top employer does. We saw above that people who look good on paper aren’t going to be incentivized to put certifications on their profile, which means that we’ve just cut a lot of top performers from the pool.
This leaves people who don’t look good on paper. Because top performers aren’t evenly distributed between pedigreed and unpedigreed candidates, there will be fewer top performers in this pool. But even if they were evenly distributed, by definition, only a small fraction of any pool will be top performers. Therefore, most of the people in this pool will not be top performers, which means that most of the people who are incentivized to list certifications on their profile will not be top performers.
Basically, people who look good on paper are disincentivized from listing certifications. People who don’t look good on paper are incentivized to list them, but most of them are not top performers. Therefore, most of the people listing certifications are not top performers.
If we’re right, creating certifications isn’t just useless – it will also have unfortunate side effects: it will make recruiters dig in their heels about the importance of pedigree, it will reduce the marginal utility of any new certifications over time, and it will ultimately harm attempts at making technical recruiting more fair and meritocratic.
In this analysis, we took a list of our users for whom we had interview data and, where possible, analyzed their LinkedIn profiles. We ended up analyzing about 20K LinkedIn profiles, 28% of which had some kind of certification. We then pulled out the top 10 most frequent certification authorities, so we could break them out and do some more granular analysis. These were (in order of frequency, i.e. Coursera had the most hits):
Because people usually do multiple interviews on our platform, we ended up with about 40k observations (i.e., interviews) in each regression.
Our first result is that people with certifications do worse in interviews, as shown in the bar chart below. People with certifications on their LinkedIn profiles pass interviews on our platform about 53% of the time versus 57% of the time for people without certifications, a highly statistically significant difference (p < 0.00001). Remember that these interviews are completely anonymous. The interviewer isn’t basing their ratings on the person’s LinkedIn, just on their interview performance.
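A gap like this can be sanity-checked with a standard pooled two-proportion z-test. The counts below are illustrative, back-derived from the rounded figures above (roughly 40k observations, 28% certified) rather than taken from our raw data:

```python
import math

def two_proportion_z(pass1, n1, pass2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = pass1 / n1, pass2 / n2
    pooled = (pass1 + pass2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Illustrative counts: 53% of ~11,200 certified observations pass,
# vs. 57% of ~28,800 uncertified observations.
z, p = two_proportion_z(5936, 11200, 16416, 28800)
print(z, p)  # |z| well above 4, so p is far below 0.00001
```

With samples this large, even a 4-percentage-point gap is many standard errors wide, which is why the p-value comes out so small.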
Notably, the certification “penalty” is equally large whether people had the certifications up in 2021 or 2023. So there’s no sign that today’s depressed labor market changed the nature of the signal.
How much of this difference is due to the fact that people with certifications do different kinds of interviews? We next adjusted for the language (e.g., Python, Java) and focus (e.g., frontend, machine learning) of the interview, comparing only Java coders to Java coders, Python to Python, and so on. This control ensures that the results aren’t driven by broad patterns at the group level: it asks whether certifications predict performance relative only to candidates coding in the same language. If anything, the “Interview language & focus controls” bar shows that this makes certifications look slightly worse. When you compare certified people to non-certified people within the narrow types of interviews they typically choose, they lag a bit further behind.
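One simple way to implement this kind of control is a stratified comparison: compute the certified-vs-uncertified pass-rate gap within each (language, focus) cell, then take a size-weighted average across cells. This is a simplified stand-in for the regression we actually ran, and the rows below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical rows: (language, focus, certified, passed).
rows = [
    ("python", "backend", True, False), ("python", "backend", False, True),
    ("python", "backend", False, True), ("java", "frontend", True, True),
    ("java", "frontend", True, False), ("java", "frontend", False, True),
]

def within_stratum_gap(rows):
    """Average the certified-minus-uncertified pass-rate gap within each
    (language, focus) stratum, weighting strata by size, so certified
    people are only compared to others doing the same kind of interview."""
    strata = defaultdict(lambda: {True: [], False: []})
    for lang, focus, certified, passed in rows:
        strata[(lang, focus)][certified].append(passed)
    total, weighted = 0, 0.0
    for cell in strata.values():
        if cell[True] and cell[False]:  # need both groups to compare
            gap = (sum(cell[True]) / len(cell[True])
                   - sum(cell[False]) / len(cell[False]))
            n = len(cell[True]) + len(cell[False])
            weighted += gap * n
            total += n
    return weighted / total if total else None

print(within_stratum_gap(rows))  # negative: certified lag within strata
```

A regression with language and focus dummies does the same job while also producing standard errors, which is why we used one in practice.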
Next, we wanted to know how much of this difference is explainable by the attributes of the person. For example, people who seek out certifications may have a non-quantitative background. Perhaps they majored in communications rather than computer science. Or they are a paralegal trying to switch career paths. Indeed, we observe this pattern in the data: people with non-traditional backgrounds are about 30% more likely to have a certification.
To account for this selection, we compared people only to others with similar backgrounds. For instance, does a Harvard graduate with a certification do better or worse than a Harvard graduate without one? This correction shrinks the gap by about 40 percent (see the “Pedigree controls” bar).
Note: All differences between the certified and uncertified users are statistically significant (p < .01 or smaller).
So, LinkedIn certifications are indeed a negative tag for candidates on our platform. This isn’t explained by the kinds of interviews they do. But we can show that part of it is due to the fact that certified people tend to have non-traditional backgrounds. The remainder of the gap is probably due to similar dynamics: you get certified if you have something to prove.
This analysis treats certifications as binary. Either you have one or you don’t. But there is a range of authorities out there that give certifications: is any of them a positive tag?
We did similar regression analysis for the top ten certifiers in the data. The results are below:
The one standout is Triplebyte. Their graduates are 6 percentage points more likely to pass interviews, a serious boost, albeit not enough to dispel the negative signal that all the others carry. The worst is the Cisco badge, with a 10 percentage point drop in performance.
When we dug into the data, we saw that people from non-traditional backgrounds do indeed list certifications on their LinkedIn profiles more often than their well-pedigreed counterparts. People whose most recent schooling is a web development certificate or an associate’s degree are about 30% more likely to display a certification.
We also saw that, generally speaking, certifications carry a negative signal and that these results hold up even in the increasingly employer-favorable 2023 job market (in other words, good candidates haven’t suddenly started listing certifications on their profiles to get noticed).
As we expected, these realities create an unfortunate feedback loop. Recruiters tend to value pedigree above all else, which means they’re less likely to talk to non-traditional candidates. When they see non-traditional candidate profiles with certifications, because they weren’t going to talk to them anyway, over time they’ll develop a negative association with those certifications.
Furthermore, given that people who list certifications tend to perform worse in interviews, when recruiters do choose pedigreed candidates who have a certification and those candidates underperform, that negative association will only be strengthened.
Because of these mechanics, certifications get reinforced as bad in recruiters’ minds, and listing them on your profile ends up being a counterproductive strategy for diamonds in the rough, the very candidates whom certifications were supposed to help in the first place.
Interview prep and job hunting are chaos and pain. We can help. Really.