This is a guest post by Cameron Laird.
As older testing chores are automated, a test team can concentrate on higher-order goals. For many testers, these days that means collaborating with developers to strive toward continuous testing (CT) targets.
Keep in mind that your biggest contributions might seem routine to you. Many things obvious to you as a testing professional are novel or surprising to team members who come from development domains. When you’re thinking something, say it aloud, in explicit language. You’re likely to discover that your experience in testing has created insights that are valuable to others from different backgrounds, once properly explained.
Here are some specific ways testers can help programmers keep continuous testing goals in mind.
Consider one of the oldest concepts in testing: testability. Development teams often rate two alternative implementations as equally stylish or elegant, even when one turns out to be more testable than the other. To the programmers, either alternative is acceptable. For the long-term health of the project as a whole, though, the choice that makes testing easier and more robust is the clear winner.
Login provides a simple, common example of this distinction. A routine requirement of a user interface is to receive an account and password, compare them to entries in a database, and either accept or reject the pair. A naive analysis of such a requirement quickly runs into coordination and mocking difficulties: how to drive a graphical user interface (GUI), and how to construct a mocked database.
A testability-oriented analysis, though, decomposes the naive implementation into separate components that:
- Retrieve an account and password from the user interface (UI)
- Validate the pair against a datastore
- Act on the validity of the pair
- Are connected to the production elements — the real database, for instance — representative of the system in operation with real customers
Testing the first component in isolation becomes easier to script, and plenty of testing tools can robustly drive such an interaction. The second component is easier to test when it’s more general and not tied to a specific database instance. The third component, especially the error response, can be exercised without involvement of any particular database.
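The decomposition above can be sketched in a few lines. This is only an illustration, not code from the article; the names (`validate_credentials`, `handle_login`) and the dict-backed datastore are invented for the example. The key move is injecting the datastore lookup, so the validation and response logic can be tested without any real database.

```python
# Hypothetical sketch of the testable decomposition described above.

def validate_credentials(account, password, lookup):
    """Validate the pair against any datastore via an injected lookup function."""
    stored = lookup(account)  # lookup returns the stored password, or None
    return stored is not None and stored == password

def handle_login(account, password, lookup):
    """Act on validity; UI retrieval and datastore details stay outside."""
    if validate_credentials(account, password, lookup):
        return "welcome"
    return "access denied"

# Because `lookup` is injected, a plain dict stands in for the datastore,
# and the error response is exercised with no database involved at all.
fake_store = {"alice": "s3cret"}.get
assert handle_login("alice", "s3cret", fake_store) == "welcome"
assert handle_login("alice", "wrong", fake_store) == "access denied"
assert handle_login("bob", "anything", fake_store) == "access denied"
```

In production, `lookup` would wrap the real database; in CT, it is a one-line fake.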
One of programmers’ instincts is to “simplify” the implementation of such a requirement by binding the different components together tightly. A proper testing perspective, in contrast, helps recognize that slightly smaller, more loosely coupled components become easier to test in isolation.
Such a decomposition also makes it more feasible to test different components at different rhythms: error-handling is important to include in CT suites that are frequently verified, while a more expensive integration test that relies on specific database credentials might be adequately exercised perhaps just once daily.
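One common way to express those rhythms, assuming a pytest-based suite, is pytest's marker mechanism. The marker name `integration` here is our own choice, not a built-in; the test bodies are placeholders.

```python
# Hedged sketch: pytest markers let fast checks run on every commit while
# expensive integration tests run on a slower schedule.
import pytest

@pytest.mark.integration  # our marker name; register it in pytest.ini to silence warnings
def test_login_against_real_database():
    # Expensive: needs real credentials and a live database. Run nightly.
    ...

def test_error_message_on_bad_password():
    # Cheap and fast: belongs in the per-commit CT suite.
    assert "denied" in "access denied"
```

Per-commit CT then runs `pytest -m "not integration"`, while a nightly job runs plain `pytest` to cover everything.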
The Big Picture
However enthusiastic and proficient programmers are with CT, they typically think of it in programming terms. They might understand, for instance, that it’s important to test not only “happy paths,” but also the exception-handling of an application or service. Well-trained development teams generally can be expected to implement CT that exercises error responses.
A misspelling in an error message, though, has the potential to fester in CT without correction for quite a while. Programmers in such a circumstance typically judge that the application is correctly responding to an error with content that is the responsibility of someone else: The program’s right, and the message is “out of scope.”
Testers need a more holistic, customer-centered perspective: regardless of who originally constructed the message, it needs to be readable when end-users see it. The nice thing about such a situation is that CT can be part of the solution!
CT can be configured not just to perform the unit tests familiar to programmers, but also to spell-check or even grammar-check all information delivered to customers. Thinking about values outside the narrow confines of how to code and test them is a great role for testers. Investment in CT should only increase the returns to a testing perspective, not eliminate them.
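A spell-check step of this kind can be surprisingly small. The sketch below is purely illustrative: a real pipeline would use an established spell-checker (aspell, codespell, and similar tools), and the vocabulary and messages here are invented for the example.

```python
# Hypothetical CT check: flag words in user-facing messages that are
# missing from an approved vocabulary.
import re

KNOWN_WORDS = {"access", "denied", "password", "is",
               "incorrect", "please", "try", "again"}

def misspellings(message):
    """Return the words in a user-facing message not found in the vocabulary."""
    words = re.findall(r"[a-z']+", message.lower())
    return [w for w in words if w not in KNOWN_WORDS]

assert misspellings("Access denied: password is incorrect") == []
assert misspellings("Please try agian") == ["agian"]  # the typo CT catches
```

Wired into CT, a nonempty result fails the build, so a misspelled error message surfaces immediately instead of festering.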
Wise testers help a product team analyze and implement plenty of other assessments within a CT framework. If the development team has a goal of keeping code quality high, they can implement automatic measurements of style and complexity within CT.
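To show how little machinery an automated complexity gate requires, here is a rough standard-library sketch: count branching constructs per function and fail CT when a function exceeds a team-chosen threshold. Teams would more likely adopt a dedicated linter with a complexity check; the scoring rule below is a simplification invented for illustration.

```python
# Hypothetical complexity gate: 1 + count of branching nodes per function.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def complexity(source):
    """Map each function name to a rough complexity score."""
    scores = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, BRANCH_NODES) for n in ast.walk(node))
            scores[node.name] = 1 + branches
    return scores

sample = """
def simple(x):
    return x + 1

def tangled(x):
    if x:
        for i in range(3):
            while i:
                i -= 1
    return x
"""
scores = complexity(sample)
assert scores["simple"] == 1
assert scores["tangled"] == 4  # if + for + while, plus the baseline of 1
```

A CT step that asserts every score stays under, say, 10 turns “keep code quality high” from an aspiration into a measurable check.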
If the product generates HTML or XML (or JSON, XLS, etc.), validate those outputs automatically within CT. Scan the project for security or licensing vulnerabilities within CT, rather than have them turn up as surprises during compliance season. Schedule tests of variant environments — inside containers, or under new trial releases of an operating system — as automated complements of mainline CT.
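The cheapest first line of defense for generated output is a well-formedness check, which the standard library already supports for XML and JSON. Full HTML validation or schema checks would need dedicated tools; this sketch only shows how little code the basic CT step takes.

```python
# Sketch of output-format validation inside CT, standard library only.
import json
import xml.etree.ElementTree as ET

def xml_is_well_formed(text):
    try:
        ET.fromstring(text)
        return True
    except ET.ParseError:
        return False

def json_is_valid(text):
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

assert xml_is_well_formed("<report><row id='1'/></report>")
assert not xml_is_well_formed("<report><row></report>")  # mismatched tags
assert json_is_valid('{"status": "ok"}')
assert not json_is_valid('{"status": ok}')               # unquoted value
```

Run against the product’s actual generated files, checks like these catch malformed output the moment a commit introduces it.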
Not all of a product’s requirements can be expressed in CT. Even among those that are logically possible, a few are poor fits for CT: a performance requirement that takes an hour to confirm shouldn’t hold up every individual commit of working programmers, for instance. At the same time, the usual recipes for CT fall far short of covering the range of qualities and values a good testing department respects.
Testers, look for ways to help your programming teammates capture those values in CT. In particular, always be on the lookout for chances to educate your team to:
- Implement testability
- Take the customers’ perspective
- Validate formats other than just source code
- Scan for security vulnerabilities and license compliance
- Prepare variations of CT for platforms of interest, such as containers or prospective operating systems
- Adapt product requirements to testable scales of time or size
CT isn’t the end of professional testing; it’s another realm with its own opportunities for and challenges to testing.
Cameron Laird is an award-winning software developer and author. Cameron participates in several industry support and standards organizations, including voting membership in the Python Software Foundation. A long-time resident of the Texas Gulf Coast, Cameron’s favorite applications are for farm automation.