This is a guest post by Justin Rohrman. Justin has been a professional software tester in various capacities since 2005. In his current role, he is a consulting software tester and writer working with Excelon Development. Outside of work, he serves as President of the Association for Software Testing's Board of Directors, helping to facilitate and develop various projects.
I was in a daily huddle for the testing group and the person who spoke before me said they were about to do some regression testing. We were in the middle of a sprint and they had found a bug in a new feature. They reported the bug and talked with the developer. A couple of days later, there was a new build with a fix.
When my colleague said regression testing, a few people spoke up. We weren't at the end of the release cycle, and feature development was still happening. Any regression testing done now would just have to be repeated once the code settled.
Right there, in the middle of a perfectly normal status meeting, we discovered that we were using the same words but meaning different things. Before that, we were talking past each other and just didn’t realize it.
There is no single dictionary for software testing, so we constantly find ourselves debating the meaning of words. I want to share my ideas on common testing terms and why they do and don't matter at the same time.
Common Testing Terms
Regression testing might be one of the most used phrases in software testing. If you have worked through at least one release cycle, or sat through a strategy meeting, you have probably heard someone utter the phrase and insist that the related activity needs to happen faster, take less time, or maybe not happen at all. But we have to agree on what we mean by 'regression testing' before we can even have that conversation.
On another project, we were supposed to be working on a smoke test.
At the time we had unit tests that ran with each build. If the unit tests all passed, and we had a green bar in the Continuous Integration dashboard, the build was automatically installed on a special server along with a database upgrade and refresh. A suite of automated User Interface tests ran in this environment each time a new build was installed, acting as a sort of nightly regression suite. If software changes were made that day, and a problem manifested in an area our UI tests covered, the tests would fail and we would be notified via email.
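The nightly flow described above can be sketched as simple gate logic. This is an illustrative model only, not our actual CI scripts; every function name and the result shape are hypothetical stand-ins for the real unit-test gate, staging deploy, UI suite, and email notification:

```python
def run_pipeline(unit_tests_pass, ui_test_results):
    """Model the nightly build flow: green unit tests gate the deploy,
    then the UI suite runs and any failures trigger a notification.

    ui_test_results is a mapping of test name -> pass/fail boolean.
    """
    if not unit_tests_pass:
        # Red bar on the CI dashboard: the build never reaches staging.
        return {"deployed": False, "failed_ui_tests": []}

    # Build is installed on the staging server with a database
    # upgrade and refresh (deploy details elided in this sketch).
    failures = [name for name, passed in ui_test_results.items() if not passed]

    # In the real setup, a non-empty failure list produced an email.
    return {"deployed": True, "failed_ui_tests": failures}
```

For example, a green unit-test run followed by one failing UI scenario would deploy the build but flag that scenario: `run_pipeline(True, {"login": True, "checkout": False})` reports the `checkout` failure.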
Add in some testing done by people during the sprint, and some people working with our software through their web browsers, and those were our release testing layers.
Our manager wanted an additional line of defense; a smoke test. I interpreted this as a set of tests to catch any last minute problems in a release candidate. For our first try, we spent about an hour going through our existing set of test cases. We picked a set that we thought represented important experiments. The resulting set of tests covered a combination of areas that broke frequently, parts of the product with complicated setup procedures or usage, some new functionality, and some things that we thought customers absolutely had to have to use the product.
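The selection we did by hand can be pictured as a filter over tagged test cases. This is a toy sketch, not the tool we used; the tag names and test cases are made up to mirror the criteria in the paragraph above:

```python
# Hypothetical tags mirroring our selection criteria: areas that broke
# frequently, complicated setup or usage, new functionality, and
# must-have customer workflows.
CRITERIA = {"fragile", "complex-setup", "new", "must-have"}

def pick_smoke_set(test_cases):
    """Keep any test case tagged with at least one selection criterion.

    test_cases is a list of (name, tag_set) pairs.
    """
    return [name for name, tags in test_cases if tags & CRITERIA]
```

Running it over a small, invented catalog keeps only the cases matching a criterion: `pick_smoke_set([("login", {"must-have"}), ("tooltip_color", {"cosmetic"})])` returns `["login"]`.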
That wasn't quite what our manager had in mind. Unfortunately, our manager didn't know exactly what they had in mind either, but we didn't realize that at the time.
Next we had a series of conversations with other testers, with our manager, and with a couple of product managers. Each group of people had competing interests, and the real smoke test was at the intersection of all their desires. Our product managers wanted something that would represent the most important functionality in our product for a handful of highly profitable customers. These were scenarios that must work for our customers to be able to do their job. Within the test group, we wanted coverage that would find important bugs. These were severe failures that would reduce the value of our software for large groups of people. That might be a browser crash, or a data persistence problem, or just an image overlapping a submit button on a webpage.
What our manager really wanted was to avoid the question of “why wasn’t this tested?” while also having the smoke test performed quickly, using as few people for as little time as we could get away with.
What we ended up with was a set of 10 or so scenarios built as automated user interface tests. These ran against candidate builds toward the end of each release and took about 15 minutes to finish.
We had another, similar scenario, trying to figure out what regression testing meant for us. It started because we had the ubiquitous comment from management that regression testing was taking too long. They were right. For the past several releases, we would gather up all of the test cases from previous releases, add them to the documentation for the current release, and run everything. Testing took longer every single release. We ended up redefining what regression testing meant for our group, and what the strategy looked like as a result.
Testing is a young field; it hasn’t existed long enough for widely accepted definitions to emerge the way they have in other branches of technology. Luckily, there are some tricks we can use to get to the point much faster.
Tools for Understanding
George Bernard Shaw is often credited with saying that the single biggest problem in communication is the illusion that it has taken place. We want to get past illusions. I see a couple of ways to reach understanding faster: culture, and explanation through example.
Culture is what develops when people spend time together working and talking. Most companies have a cadence or pattern for releasing software. At one company, we released software every two weeks, alternating between a new-feature release and a bug fix release. They were trying to be agile, but the new-feature releases really took the shape of a compressed waterfall: the first week was for development, and the second week was for testing, reporting, and bug fix cycles. In the bug fix releases, changes flowed out as they were ready. There was no Visio diagram outlining our weird development workflow, yet each team member knew what happened, when it happened, and had particular names for the activities, because they had all worked together like that for months or years.
It took time for me to assimilate when I started at this company. They used words that I was unfamiliar with casually and without explanation. Each time they said a new word, I had to ask them to stop and tell me what they meant. This made for slow conversations, and sometimes they had a hard time explaining and would have to demonstrate in the software or walk me through a process as it was happening. This is great when you are a member of that culture, because a single word said in a second replaces a paragraph said in a minute. Repeat that a dozen times a day per person, and you save a great deal of time. For new people, it can be jarring, slow things down, and cause information loss when the terms are explained poorly.
The other option, and one I use often, is to lead with an explanation. For example, if I were talking with someone about regression testing, I might say, “We need to perform some regression testing, and what I mean by that is testing that will expose problems introduced by this change in areas that were previously expected to be working fine.” The person I’m talking with might not agree with that definition, and that’s fine with me; at that point, they at least have some idea of what I mean. They can either tell me I’m wrong and explain what they mean when they use the phrase ‘regression testing’, or they can disagree on the terms while still understanding what needs to be done.
Get to Understanding Faster
My personal preference is a combination of explanation and demonstration, and developing an internal culture. These two things feed into each other. Explaining what you mean, especially through example, reinforces the idea each time people hear it. Over time, the group you work with will have a better understanding of what you mean by a particular word such as regression testing, smoke testing, bug, or quality. And over time their definitions will sink in for you as well. Eventually, language will normalize and explanations get shorter unless there is someone new in the room.
More importantly though, don’t assume people understand what you mean with industry jargon, and don’t be afraid to explain what you mean by it.