Interview questions tend to be the kind you'd encounter in a phone screen for a back-end software engineering role at a top company, and interviewers typically come from a mix of larger companies like Google, Facebook, and Twitter, as well as engineering-focused startups like Asana, Mattermark, KeepSafe, and more.
In this blog’s heroic maiden voyage, we’ll be tackling people’s surprising inability to gauge their own interview performance and the very real implications this finding has for hiring.
As you can see, in addition to one direct yes/no question, we also ask about a few different aspects of interview performance using a 1-4 scale.
You might notice right away that there's a bit of disparity, but things get really interesting when you plot perceived vs. actual performance for each interview.
You can hover over each square to see the exact interview count.
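To make the plot concrete, here's a minimal sketch of how that perceived-vs-actual grid could be tallied. The rating pairs below are made up for illustration; in practice they'd come from the post-interview feedback forms described above.

```python
from collections import Counter

# Hypothetical (perceived, actual) rating pairs on the 1-4 scale.
# Real data would come from the post-interview feedback forms.
ratings = [(3, 2), (4, 4), (2, 3), (3, 3), (1, 2), (4, 3), (3, 2)]

# Tally how many interviews fall into each (perceived, actual) cell --
# this count is what hovering over a square in the plot reveals.
counts = Counter(ratings)

# Lay the tallies out as a 4x4 grid: rows = perceived score,
# columns = actual score, both running from 1 to 4.
grid = [[counts[(p, a)] for a in range(1, 5)] for p in range(1, 5)]
for row in grid:
    print(row)
```

If perceived and actual performance agreed perfectly, all the mass would sit on the grid's diagonal; the off-diagonal cells are where the miscalibration shows up.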
Gayle Laakmann McDowell of Cracking the Coding Interview fame has written quite a bit about how bad people are at gauging their own interview performance. It's something I had noticed anecdotally when I was doing recruiting, so it was nice to see some empirical data on that front.
As you'll recall, during the feedback step that happens after each interview, we ask interviewees whether they'd want to work with their interviewer.