Tester’s Diary: Training Up and Reinforcing Practices

This is a guest post by Carol Brands.

Does adding a new team member speed things up or slow things down?

I think most people recognize that it does both. Initially, it slows things down: The new team member needs to be trained, and that takes time away from the usual daily activities of whoever is responsible for the training. But as time goes on and the new team member becomes independent, their contributions help the team be more productive in the same amount of time.

At least, that’s what I expected to happen when we added an intern to our team. Instead, I learned that you can start seeing benefits right away.


Onboarding at a Busy Time


I found out our new intern, Trisha, would be joining us the day before her start date. My boss had been out of the office for personal reasons, so I would be responsible for guiding Trisha through her first days of corporate onboarding, as well as figuring out a curriculum of training.

Trisha came to us with no professional training, but before she applied for the internship, we talked to determine whether a career in software testing might be a good fit for her. I guided her through a few resources to help her get a feeling for what a testing career might look like, but otherwise she was a complete testing novice.

Trisha joined us just as we were entering a very busy phase of our current project. The test team was on the last stretch of completing data import and scenario testing, while our developers were testing the last defects to be fixed before release. Our deadline was approaching, and there was little time to waste. Luckily, there was a self-guided corporate onboarding system waiting for her when she arrived.

Pairing with the Development Team


The morning was lost to setting up her computer and resolving some networking issues, but once she was set up she was able to spend the day working on self-guided activities. Thank goodness, because I had no idea what her training was going to look like yet.

By the next day, I had a half-baked idea of how we could begin Trisha’s training. I had always assumed that our next team member would be trained via pairing with other testers. However, in this phase of the project, I was manually checking a subset of values across databases. The other tester is an introvert, so it seemed unfair to force him to pair up. Since our developers were testing defects and are used to pairing, I asked around for volunteers to let Trisha be a rubber duck as they tested defects. Half of the team offered to help! This was a great way for Trisha to both become familiar with the team and see how developers think when they are testing.


Observations, Oracles and Defect Reports


The next week I finally finished checking data and was ready to start doing some testing worth pairing on, so I had Trisha begin pairing with me exclusively. As we worked, I asked her to think about differences between the way I was doing things and the way the developers did things. I asked her to write a few notes about it so I could learn from the developers and so she could learn about how we think.

Initially, her observations were surface-level: The developers used hotkeys and other shortcuts more often, and they were more likely to look at the code to see what the behavior was. As we continued working, we talked about other ways we work differently.

Since our project involves comparing our legacy product to an equivalent new product, the temptation the developers often fell for was comparing the new product to what they thought the legacy product did, rather than using the legacy product itself to determine the expected behavior. In our pre-hiring conversation, Trisha and I had talked about how an oracle is what tells us whether behavior is right or wrong. In this case, the legacy product was the oracle.
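To make the idea concrete, here is a minimal sketch in Python. All of the names and behaviors below are hypothetical (our real comparison was between two database-backed products); the point is only that the expected value comes from running the legacy product, never from what anyone remembers it doing.

```python
def compare_against_legacy(legacy_fn, new_fn, inputs):
    """Use the legacy product as the oracle: run both products on the same
    inputs and collect every place where the new product differs."""
    mismatches = []
    for item in inputs:
        expected = legacy_fn(item)  # the oracle: what the legacy product actually does
        actual = new_fn(item)       # the behavior of the new product under test
        if actual != expected:
            mismatches.append((item, expected, actual))
    return mismatches


# Stand-in behaviors purely for illustration: a volume discount with an
# off-by-one difference in the new product.
legacy_discount = lambda qty: 0.10 if qty >= 10 else 0.0
new_discount = lambda qty: 0.10 if qty > 10 else 0.0

print(compare_against_legacy(legacy_discount, new_discount, range(1, 15)))
# [(10, 0.1, 0.0)] -- the oracle catches the boundary difference
```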

Occasionally during our testing, we would encounter a defect. We created the following guidelines for writing our defect reports (a sketched example follows the list):

  1. Make sure the title is clear and succinct. This is the first piece of information the product manager will use to determine whether the bug needs to be fixed.
  2. Clearly describe the problem in the body of the report. Make sure to describe why it’s a problem (i.e., identify your oracles).
  3. Include any possible workarounds that might let the product manager defer fixing the problem.
  4. Try to identify and make clear the risks presented by the defect. Is there data loss? Is this a display issue that can be fixed post-release, or a data issue that requires a change to our data import process?
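To show what following those guidelines might look like, here is a made-up report sketched as a plain Python dictionary. The defect, the product details, and the wording are all hypothetical; only the four-part shape comes from the guidelines above.

```python
# A hypothetical defect report shaped by the four guidelines above.
defect_report = {
    # 1. Clear, succinct title -- the first thing the product manager reads.
    "title": "Imported invoice totals lose cents on amounts over $1,000",
    # 2. Describe the problem and name the oracle that says it is wrong.
    "description": (
        "After data import, invoices over $1,000 display whole-dollar totals. "
        "Oracle: the legacy product shows the same invoices with cents intact."
    ),
    # 3. A workaround that might let the fix be deferred.
    "workaround": "Totals can be corrected manually on the invoice edit page.",
    # 4. Make the risk explicit: display-only, or a data problem?
    "risk": (
        "Not just a display issue: the rounded value is written during import, "
        "so deferring the fix would mean re-importing the affected data later."
    ),
}
```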

By Trisha’s third week, we had moved on to testing fixed defects. Many of these defect reports had been written by our developers during a previous phase of testing. As we read the defects, we also critiqued them, compared them to our guidelines, and considered what we would change to make them better. We also read the defect reports and resolutions deeply to find cases where the developer who reported the defect and the developer who fixed the defect seemed to be talking about different things — such as a defect that referenced the work history list on an equipment detail page with a resolution that referenced a work history detail page.

A Boost in Productivity


As an unexpected benefit, whenever I was working with Trisha, I could focus on testing instead of all the non-testing activities accruing on my plate. I was able to skip a few meetings and delay questions until the end of the day when Trisha left, so I was getting more done. I learned that I should have been working this way for months, and I came away with some time management skills. Overall, adding Trisha to the team has increased my productivity, and on a team as small as ours, that means team productivity also went up.

Before we hired Trisha, I was worried that adding someone at this point in the project would be a huge setback. But now that Trisha is on the team, I’m getting more done, and I’m feeling great about it because I’m helping someone learn a new craft. I’m motivated and it is helping me keep the team motivated. Even the developers have commented that they enjoy hearing me explain things to Trisha, because they learn something too. Having a new member on the team is exciting. I can only hope that onboarding the next team member goes this well!

This is a guest post by Carol Brands. Carol is a Software Tester at DNV GL Software. Originally from New Orleans, she is now based in Oregon and has lived there for about 13 years. Carol is also a volunteer at the Association for Software Testing.
