The worst interview I ever had lasted 8 hours. It started with a morning quiz on development methods and technology stacks. After that I went to lunch with 15 strangers and was asked what I do outside of work for fun, and why I wanted to work there. The second half of the day was spent testing their software while my interviewers seemed to be working on their own projects. My last interaction that day was an hour-long introduction to their API and some questioning about how I might test it. After the full day, I received a "thanks, but no thanks" email with no feedback on their decision.
I think interviews should be, and can be, much better than that. I want to talk a little about the failings of the average interview for software testers, describe how I like to interview, and offer a little strategy to help put each candidate in the best light.
The Typical Tester & QA Interview
Not very many people study software testing (especially not deeply), so interviews tend to go after the same superficial topics. Nearly every interview I have been to started with a series of questions about words – what is regression testing, what is the difference between bug severity and priority, what is a smoke test, what is black box or white box testing. Quizzing a candidate on these surface definitions is damaging. There are no standard definitions in software testing, not really, which means the interview becomes a guessing game or a false-rejection/false-acceptance game. Answering those questions in a way that doesn’t align with what the interviewer thinks has gotten me into trouble more than once.
When I have to answer those questions now, I usually say something like “I have heard people use that term in different ways, but when I say it I mean … What do you mean when you use the term?” This changes the conversation from a quiz into an effort to understand each other and come to some sort of shared understanding.
The other pattern I see is interviewers focusing on things that don’t give much insight into how the person they are talking with will fit in a testing role. Several years ago, I interviewed at a company for the role of a senior tester (also see the related article Do We Still Need Dedicated Testers?). The company was trying to transition to agile, and most of the questions they asked me were themed around agile development and process. We spent about an hour talking about different scenarios – what would I do if I got push-back from developers on a bug, what was my view of testers in an agile context, when did I think testers should first be involved in a new feature.
Those scenario questions are certainly better than the definition game, and might be an interesting aside in a technical interview. They might be a good litmus test for culture fit/context fit. They can also be a distraction from discovering how good someone is at testing software. I think of these types of questions as interesting starting points, not the real meat of an interview. I want to go deeper and see how people actually test real software.
Using Testing Challenges
My favorite way to interview testers is to have them test software. That might be whatever product I am working on right now, but that can be difficult because there is a lot of background and context an outsider will be missing. Simple challenges, something like the palindrome test challenge I co-wrote with Paul Harju and Matt Heusser, tend to work well for interviews. I like to pose this as an open-ended challenge, and start the exercise by saying “test this!”. You can characterize the skill of a tester by how they respond to the challenge.
A junior tester might enter a few values at boundaries – something that is definitely a palindrome, something that is definitely not a palindrome, and maybe a couple of strings that have special and Unicode characters. They might find a bug in the webpage or offer some design advice to help make it more usable. When you ask if the product is ready to ship, they’ll give you a confident yes or no.
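To make those boundary-style inputs concrete, here is a minimal sketch in Python. The `is_palindrome` function is hypothetical (the real challenge is a web page, not this code); the point is the set of inputs a junior tester would reach for first, and the questions each one raises.

```python
# Hypothetical, naive implementation of the function under test.
# A real palindrome checker might normalize case, whitespace, or
# Unicode -- whether it should is exactly what a tester must ask.
def is_palindrome(text: str) -> bool:
    return text == text[::-1]

# Boundary-style inputs a junior tester might try, with the results
# this naive implementation produces:
cases = {
    "": True,             # empty string: is this a palindrome? the spec must say
    "a": True,            # single character, the smallest non-empty input
    "racecar": True,      # definitely a palindrome
    "palindrome": False,  # definitely not a palindrome
    "Racecar": False,     # case sensitivity: rejected here -- is that intended?
    "a b a": True,        # whitespace is preserved, not stripped
    "née en": False,      # Unicode characters participate in the comparison
}

for text, expected in cases.items():
    result = is_palindrome(text)
    assert result == expected, f"{text!r}: got {result}"
```

Each surprising result (the empty string, the mixed-case word) is less a bug than a question to bring back to whoever owns the product, which is the behavior the next section argues a senior tester shows.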
A senior tester should take the exercise meta and start asking deeper questions that look past the web browser. How much time do I have to test? Who is the customer of this product and what do they value? What are the development team and product manager concerned about? Are there any lingering aspects of this product that are still in development? This line of questioning shows that they are capable of framing and directing their work.
Problem zero in testing is always the question of what I should be working on at this minute, and these answers will help them zero in on that. If you ask this person whether we should ship this version of the product, you might get some hesitation. Rather than a confident yes or no, they will probably share their feelings on the product along with the problems they discovered, and ask more questions about what the customer needs. A senior tester usually doesn’t want to be the gatekeeper, but they can help the team fill in the missing information around the release decision.
Interview Style Bias
I have interviewed people who did great during the interview, and then either did not fit in or struggled with the work once they were on the job. I have also interviewed people who struggled during the interview – they couldn’t answer some questions and gave incomplete answers to others – but ended up being the backbone of a team after they were hired.
The most popular style of interviewing, the round-table where several people from the company sit around the one person interviewing for the job, is also the worst fit for software people, who are often introverted. Introverted people need time and space to think, and this style can turn into rapid-fire questioning or coding on a whiteboard as if it were performance art. Introverts who are fully capable of doing the work might get a less-than-shining review because there are too many people in the room asking too many questions in a short period of time.
Another pattern I have noticed in the past few years is the take-home challenge. Companies create some sort of testing or coding challenge for a person to do before the interview, and the results are reviewed in person when the candidate arrives. Companies that use this often end up with a staff of young people, probably recently graduated from university. Older people with responsibilities during evenings and weekends, such as caring for family and children, will see this as a barrier to entry. They will either not apply to the position, or attempt the challenge and not do well. The people who do well on these challenges sacrificed time to do so.
Each style of interviewing is biased to shine a favorable light on specific types of candidates. Creating a custom interview for every candidate might not be the solution; that would be a time-consuming venture and would make assessment very difficult. My solution is to be kind. If a candidate is struggling in a group, maybe have a couple of interviewers leave and come back later. If they aren’t doing well with a whiteboard testing challenge, maybe try pairing on real software.
Learning how to design better interviews and see past the bias will help create longer lasting teams (also see the related article 3 Critical Factors To Retain Your Best Software Testers).
Interviewing In Practice
Interviewing testers is challenging work; we don’t make tangible things that can easily be judged. I like a strategy that includes questions designed to see how the tester thinks, exercises that show how well they can test actual software, and a contingency plan based around personality types to help everyone show what they do best. Not everyone should be a tester, but testers can come from anywhere. This strategy can help find testers in the rough.
This is a guest post by Justin Rohrman. Justin has been a professional software tester in various capacities since 2005. In his current role, Justin is a consulting software tester and writer working with Excelon Development. Outside of work, he is currently serving as President on the Association for Software Testing Board of Directors, helping to facilitate and develop various projects.