Tester’s Diary: The Long, Slow Road to DevOps


This is a guest post by Carol Brands.

Recently, our team has been moving toward a more agile way of working. We now have testers in the development team room, and our recent projects have included developers in our exploratory test strategy.

Our next team improvement is a move toward DevOps. It’s going to be a long haul for us.

One of the biggest impediments for us is that we don’t currently use automated checks as part of our testing strategy. Instead, as we test each story, we explore the surrounding features and interactions.

This allows us to effectively regression test as we move through each feature and user story, but in many ways, it also leaves our testing strategy subject to the pitfalls of waterfall development. The developers write unit tests and integration tests, but the exploratory testing can’t happen until a story is completed, pushed into the repository, and deployed to a test environment by the overnight build.

Once that small delay is baked in, it’s easy for other delays to build up, meaning a story might not get tested for days, or even weeks when things fall really behind. DevOps requires much faster feedback than that, but same-day testing feels like it’s a world away.


Another hurdle to overcome is that the test team doesn’t have visibility into the pipeline, which makes it hard to understand where automated tests will fit in. There are so many decisions to be made: where the test code will live; which tests will run, when, and how; and where the output will go. We’re starting from scratch, so the possibilities are endless and overwhelming.

The sheer number of tools to choose from is also exhausting. Testers don’t have access to the tools currently being used by the team to manage the repository, unit and integration tests, and builds, so it’s hard to make decisions about what additional tools we need. It’s even harder to understand how we will go from pushing code to running tests.

Beyond tools and decision-making, we will have to reconsider our way of working with developers. The purpose of including automation in the development pipeline is to gain fast feedback on changes as they are introduced.

Currently, we wait until the product is deployed before we begin testing. We’re going to need to find more ways to introduce testing before that happens, both by writing tests that will be part of the pipeline prior to deployment, and by writing tests that will be run after deployment against a UI. Writing tests before we can actually see the changes in the deployed program feels like a huge change, and it’s a little hard to imagine.
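To make the post-deployment side of that idea a little more concrete, here is a minimal sketch of the kind of smoke check a pipeline could run right after the test environment is built, so a broken deployment surfaces in minutes instead of overnight. This is only a thought experiment, not our actual setup; the URL, the marker text, and the function name are all hypothetical placeholders.

```python
# Hypothetical post-deployment smoke check. The base URL and the
# expected marker string would come from the pipeline's configuration;
# everything named here is illustrative.
from urllib.request import urlopen
from urllib.error import URLError


def smoke_check(base_url: str, marker: str, timeout: float = 5.0) -> bool:
    """Return True if the deployed app responds with HTTP 200
    and the page contains the expected marker text."""
    try:
        with urlopen(base_url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            status = resp.status
    except URLError:
        # Environment isn't up: fail fast rather than waiting overnight.
        return False
    return status == 200 and marker in body
```

A check like this could gate the pipeline: if it fails, the story never reaches “ready for test,” and no one burns exploratory-testing time asking “Did this ever work?”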

Even with all these worries ahead, there is hope for our team. The development team is already including our test manager in discussions about how we can use our development management tool to reflect the new processes we will need. This is an improvement over our last development management tool, which was chosen and configured by the development team exclusively.

We have also included developers as exploratory testers on the past two products. They have started talking more about testing in their development meetings, which testers are invited to and encouraged to participate in. The developers definitely understand how frustrating it is to find broken software when attempting to test a “ready for test” story. When you hear a developer say out loud, “Did this ever work?” you know they’re learning something.

We’ve made great strides over the past year, and I have confidence that we will continue to make great strides in the coming year as we move toward not just an agile method of development, but a true DevOps team.


Carol Brands is a Software Tester at DNV GL Software. Originally from New Orleans, she is now based in Oregon and has lived there for about 13 years. Carol is also a volunteer at the Association for Software Testing.
