Software Tester's Diary: Follow Your Nose, It Always Knows

This is a guest posting by Carol Brands. Carol is a Software Tester at DNV GL Software. Originally from New Orleans, she is now based in Oregon and has lived there for about 13 years. Carol is also a volunteer at the Association for Software Testing.

Reviewing completed user stories before they go into the test queue is a pretty normal part of my day. We look at the user story together with the completed work, and I ask questions; some of those questions reveal missed acceptance criteria or plain old defects. The problems get fixed, and the stories get put into the Ready to Test list. Today I had a review that was a little different.

I was called over to the developer's desk, and we looked at the story he was working on. This story was unfamiliar to me. It had recently been requested by a stakeholder, based on beta-testing feedback that came in while I was out of the office. The stakeholder asked for a way to map a date from an outside source so that the age of a piece of equipment can be calculated without anyone typing in the date. We already had a screen where statistical information about the lifetime of equipment is handled, so it seemed natural to enter the 'birthday' for the equipment on that same screen.

When they showed me the newly updated screen, I knew something was off, but it was hard to put my finger on the problem. That's not an uncommon feeling for me. I worry whenever we react quickly to a customer request by making a change. Quick changes like these make me a little nervous until I've had a chance to absorb all the available information and clear the fear of the unknown from my head.


Trust Your Feelings

During this review, I had an uncomfortable feeling that I just couldn't shake. As they explained the request, showed me the acceptance criteria they had created, and demonstrated how they met those criteria, I felt like something was 'wrong'. I just couldn't point to an example that would make the problem clear. So, I started asking questions.

When the mapped data is updated, do we expect the age to update as well? Yes, we believe it will. OK, I can add that to my list of claims to test. The statistical data that was already on this screen was required before other workflows could be completed.

Is the age mapping required as well? No, it should not be required. Another claim to test. To relieve my concerns, they mention that other data will be added to this screen as well. I ask, do you mean the work that Betsy is doing right now? Yes, and as a matter of fact she hasn't seen this work or the request yet; maybe we'd better call her over!

Betsy is a little surprised to see what's been done on this screen. We don't think it will interfere with her work, but some of it intersects with the changes here, and her merge will probably be affected. We get that figured out. It's a nice find, but I still can't escape the feeling that there's a breaking change on this screen.

Keep Asking Questions

One thing I've learned is that if you can't put your finger on something, it's because you're worried about what you don't know. The solution is almost always to keep asking questions, even if it feels like you've covered everything. I go back to one of the answers the developers gave me: entering statistical information on this screen is required, but entering a date mapping is not. Right now, the date is unmapped but the statistical information is entered. I ask to look at the main screen, the one we see before we enter lifetime information.

Aha! On this screen, once the lifetime statistics have been entered, we're usually able to start editing, adding new fields, and so on. Right now, the add buttons are hidden. That's what used to happen only when the statistical information hadn't been entered yet.

So, it looks like we are treating the date mapping as a requirement before we can start editing this screen. That’s a potential problem, because we don’t expect customers to have loaded everything they need to map dates before they get to this screen. Based on the conversation we’re having, it sounds like the product owner didn’t examine whether we should require date mapping and how it intersects with the requirement for statistical information.


Don’t Give Up

There were a few minutes during the conversation when I almost gave up. I thought, well, they've been working on this while I was away. I'm sure they have it all figured out. I probably just don't understand the change. After we went over the acceptance criteria, the developers assumed we were done. I could have given up then, let the story come to me for testing, and figured it out that way. But if I'd done that, Betsy's merge would have been a surprise. I might not have gotten to this story right away, and a beta tester would probably have found the problem before I did. And there's a chance we would have been pressed for time and forced to use a quick solution instead of a good solution.

I'm glad I've learned to trust my instincts enough to keep pressing with questions. Rarely do a few minutes of extra questioning result in wasted time. It almost always gives me a better understanding of what I need to do. And once in a while, it pays off big time.
