At interviewing.io, we’ve coached hundreds of people through salary negotiation. We’re good at it — our average user gets $50k more in cash, and we have a 94% success rate.
Having done this a lot, we’ve seen our users make the same two mistakes, over and over, BEFORE they start working with us. These mistakes are costly and make it harder for us to do our jobs. Both involve how you talk to recruiters at the start of your job search, way before there’s an offer. Even if you never use our service, you should absolutely avoid these mistakes.
My name is Kevin. I am not and have never been a software engineer. I have never written or tested a single line of code, and I have never even worked as a PM. Despite that, I was able to pass a Google system design interview.
I had just finished working on a system design interview guide, and I learned a LOT from doing that — along with a few tricks. If these tricks helped me pass, then imagine what you’ll be able to do with them.
The recent exciting (and somewhat horrifying) inflection point in AI capability is what finally tipped me into writing this blog post.
**I simply don't believe that AI can do hiring.** My argument isn't about bias (though bias is a real problem), nor is it that AI-driven hiring is technologically impossible. It's that the training data simply isn't available.
Most people believe that if you can somehow combine what's available on LinkedIn, GitHub, and the social graph (who follows whom on Twitter, etc.), you'll be able to find the good engineers who are actively looking and figure out what they want. This is wrong. None of those three sources is actually useful. At the end of the day, you can’t use AI for hiring if you don’t have the data. And if you have the data, you don’t strictly need AI.
If you’re a software engineer who’s on the market, should you list yourself as OpenToWork? Does doing so carry a negative signal? And with the recent deluge of layoffs at tech companies, has the meaning of OpenToWork changed?
TL;DR: It used to be bad. Now, it's not. Moreover, it's clear to us that the people who were laid off in the second half of 2022 and in 2023 so far are great... and that, by and large, these layoffs were indeed NOT based on performance.
People often suggest that interviewing.io should create a certification that our users can post on their LinkedIn profile, e.g., “Top 10% performer on interviewing.io”. Presumably, these certifications would signal to recruiters that this person is a good engineer worth reaching out to, and would carry more signal than where said person went to school or worked previously.
I've always thought certifications were a terrible idea, and I’ve resisted building them. Now, we've finally dug into the data to see if my hatred of them holds water. TL;DR: it does.
interviewing.io is an anonymous mock interview platform and eng hiring marketplace. We make money in two ways: engineers pay us for mock interviews, and employers pay us for access to the best performers. This means that we live and die by the quality of our interviewers in a way that no single employer does – if we don’t have really well-calibrated interviewers, who also create great candidate experience, we don’t get paid.
In a recent post, we shared how, over time, we came up with two metrics that, together, tell a complete and compelling story about interviewer quality: the candidate experience metric and the calibration metric. In this post, we’ll talk about how to apply our learnings about interviewer quality to your own process. We’ve made a bunch of mistakes so you don’t have to! It boils down to choosing the right people, tracking those two metrics diligently, rewarding good behavior, and committing to providing feedback to your candidates.
Hi, I’m Lior. I spent close to five years at Meta as a software engineer and engineering manager. During my time there I conducted more than 150 behavioral interviews. In this post, I’ll be sharing what Meta looks for in a behavioral interview, and how we evaluated candidates.
Giving feedback will not only make candidates you want today more likely to join your team, but it’s also crucial to hiring the candidates you might want down the road. Technical interview outcomes are erratic, and according to our data, only about 25% of candidates perform consistently from interview to interview.
I recently ran a Twitter poll asking my followers to estimate how many engineers had been laid off from US-based startups and tech companies in 2022 and 2023 so far. Most people overestimated the number by an order of magnitude. Here's what we did to get to the actual number.
interviewing.io is an anonymous mock interview platform and eng hiring marketplace. Engineers use us for mock interviews, and we use the data from those interviews to surface top performers, in a much fairer and more predictive way than a resume. If you’re a top performer on interviewing.io, we fast-track you at the world’s best companies.
We make money in two ways: engineers pay us for mock interviews, and employers pay us for access to the best performers. To keep our engineer customers happy, we have to make sure that our interviewers deliver value to them by conducting realistic mock interviews and giving useful, actionable feedback afterwards. To keep our employer customers happy, we have to make sure that the engineers we send them are way better than the ones they’re getting without us. Otherwise, it’s just not worth it for them.
This means that we live and die by the quality of our interviewers, in a way that no single employer does, no matter how much they say they care about people analytics or interviewer metrics or training. If we don’t have really well-calibrated interviewers, who also create great candidate experience, we don’t get paid.
In this post, we’ll explain exactly how we compute and use these metrics to get the best work out of our interviewers.
Interview prep and job hunting are chaos and pain. We can help. Really.