Incremental adoption of test-driven development
While the path to automation is never short, there are ways to make it less steep. Of the many viable options, we'll share two shortcuts that have been put to use in our varied experiences at Devbridge. These shortcuts provide a gradual, incremental pathway that should induce less frustration. Though they defer some benefits in order to make the test-first methodology easier to adopt, they are still more beneficial than remaining with the test-late method.
Consider the simple case in which a team has no automation and takes a test-late approach.
The two shortcuts are:
Write automated tests first.
Write the test case descriptions first.
It's important to view these shortcuts not as definitive testing approaches but rather as intermediate steps on the road to even more effective methodologies, such as test-driven development.
Shortcut #1: Build automated tests first
A major inefficiency of the test-late approach is that many test cases are written twice: once manually, and again when they are automated. To avoid this wasted effort, one shortcut is to write the code and then immediately write the automated tests.
The first shortcut entails the following:
Write the code.
Write the automated test.
Supplement with additional manual testing, as necessary.
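As a sketch of these three steps, the hypothetical `apply_discount` function below (invented for illustration, not from the text) is written first, and its automated test follows immediately, before any manual check:

```python
# Step 1: write the code.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


# Step 2: immediately write the automated test, before any manual verification.
def test_apply_discount():
    assert apply_discount(100.0, 20) == 80.0
    assert apply_discount(19.99, 0) == 19.99
    try:
        apply_discount(50.0, 150)
        assert False, "expected ValueError for an out-of-range percent"
    except ValueError:
        pass


# Step 3: run the test; manual testing only supplements this, as necessary.
test_apply_discount()
```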
One key to this shortcut's success is to strictly avoid manually verifying the behavior that the automated tests will cover. Before the automated tests for an iteration are complete, fix only the issues that surface during compilation or static analysis (since neither of these requires additional work to get feedback). Another important consideration is to keep the focus on user requirements, not coverage metrics. Don't stop writing tests until you gain full confidence in the correct system behavior.
We strongly recommend working incrementally. Make some code changes and test those. Make some more changes. Then repeat the tests and adjust as necessary. An iterative approach is much more productive than writing code for a few days and testing large segments of functionality. A primary benefit is the repetitive, instantaneous feedback by which it is possible to make many successive improvements quickly.
What happens when you hold off?
If you postpone refactoring, more effort will be necessary to test, refactor, and retest. If testing feedback is delayed too much, it becomes tempting to ignore any feedback on the design.
Moreover, it's important to consider test automation at all levels of testing; focusing only on unit tests is relatively unproductive. Before taking this shortcut, it might be best to manually verify a few automated test cases to ensure that the automation itself is correct. After the tests have been automated, some manual verification confirms that the tests you have just written work. You'll also come to understand better which tests provide value and which don't.
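As a hypothetical illustration of testing at more than one level, the sketch below pairs a unit test of a collaborator with a service-level test of the behavior users actually depend on (the `UserStore` and `LoginService` classes are invented for this example):

```python
class UserStore:
    """In-memory stand-in for a user repository."""
    def __init__(self):
        self._users = {"ada": "s3cret"}

    def password_for(self, username):
        return self._users.get(username)


class LoginService:
    """Authenticates users against the store."""
    def __init__(self, store: UserStore):
        self._store = store

    def authenticate(self, username: str, password: str) -> bool:
        return self._store.password_for(username) == password


# Unit level: one collaborator in isolation.
def test_store_returns_password():
    assert UserStore().password_for("ada") == "s3cret"


# Service level: the behavior a user actually experiences.
def test_login_with_correct_credentials():
    assert LoginService(UserStore()).authenticate("ada", "s3cret") is True


def test_login_with_wrong_password():
    assert LoginService(UserStore()).authenticate("ada", "nope") is False


test_store_returns_password()
test_login_with_correct_credentials()
test_login_with_wrong_password()
```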
Taking these steps will enable a gradual improvement of test automation skills to the point at which manual verification is no longer necessary. Though at first the automated test suites might not give a high level of confidence, they will improve as the team gains more experience. Ultimately, it's vital to decrease the amount of manual testing gradually.
All things considered, this shortcut can help combat the frustration of writing automated tests. A team can become effective more quickly than with conventional approaches to software testing. For most teams, it is likely to produce high-coverage test suites and enable frequent refactoring.
Shortcut #2: Write the test cases before coding
A big hurdle in adopting a test-first approach is the difficulty of writing tests for code that has not yet been written. Another shortcut is to begin by writing test labels. This method puts immediate focus on the tests while postponing the complexity of writing them in full.
The second shortcut entails the following:
Describe test cases as test labels.
Write code that satisfies the test cases.
Write tests that correspond to each test label.
First, explicitly yet succinctly list out the test cases for all of the functionality. Write these out as simple, empty test methods that contain only test case descriptions and failing assertions. When attempting to define tests before writing any code, think of test cases that would demonstrate that the software satisfies the user requirements. Include enough detail in each description to define the scope of the implementation and the acceptance criteria of the test case.
Each description should cover only a single business case without any ambiguity. A general description such as "should handle login" isn't descriptive enough. Even a simple feature like login often involves complex authorization requirements and hidden edge cases. A description that is too simple will not clarify which edge cases the implementation should cover.
Avoid ambiguous descriptions, since they risk leaving some of the functionality unimplemented and untested. Overly broad descriptions can also waste effort by encouraging unnecessary over-building of the code and extra test cases. When writing each description, focus on defining a single case. Some features may not be user-facing, so it's necessary to identify the right level at which to provide a sensible description.
Examples of test case descriptions for a login feature:
Should authenticate the user with correct credentials.
Should not authenticate a locked-out user with correct credentials.
Should trust a valid existing user session without prompting for authentication.
Next, write code that satisfies each of the test cases. At this point, avoid attempting to verify any changes manually, since that would reintroduce the disadvantages of the test-late approach. The task here is to ensure that the code compiles and to begin implementing the test cases. While writing the test cases, various functional and structural issues will arise in the code. Fix each of these immediately and incrementally build a suite of passing test cases.
After implementing all the test cases to the point of passing, the developer attains high confidence in the code's quality. If there is some distrust remaining, it's entirely appropriate to manually verify some of the test cases. Eventually, it will be possible to identify and automate all of the essential test cases.
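As a hypothetical sketch of this step, a small `authenticate` function is grown until the first two login descriptions listed earlier pass (the data model is invented for illustration):

```python
def authenticate(users: dict, username: str, password: str) -> bool:
    """users maps username -> {"password": ..., "locked": bool}."""
    user = users.get(username)
    if user is None or user["locked"]:
        return False
    return user["password"] == password


users = {
    "ada": {"password": "s3cret", "locked": False},
    "bob": {"password": "hunter2", "locked": True},
}


# The test case descriptions, now filled in with real assertions.
def test_authenticates_user_with_correct_credentials():
    assert authenticate(users, "ada", "s3cret") is True


def test_rejects_locked_out_user_with_correct_credentials():
    assert authenticate(users, "bob", "hunter2") is False


test_authenticates_user_with_correct_credentials()
test_rejects_locked_out_user_with_correct_credentials()
```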
Advance incrementally and iterate
A logical next step is to elaborate the test cases incrementally and iteratively, alongside small code changes. Working in short iterations means completing each test case in small bits instead of working linearly to build the entire test suite all at once.
With the first shortcut, code is written earlier, but the focus remains on interleaving testing with incremental coding. Applying an iterative approach to a single test case is similar to test-driven development: the scope of each change is considerably smaller, and it's sensible to adjust the code immediately after writing it. The feedback is nearly as quick as with test-driven development.
A proven best practice is to start with as many or as few tests in a single iteration as the team is comfortable with. Gradually shorten each iteration until it is possible to work in iterations of a single test case. The first few attempts may require additional time because of the effort needed to become efficient with a continuous stream of small refactorings.
With the second shortcut, the test suite design is driven purely by thinking about requirements and edge cases. Since the test case definitions occur before writing any code, the test cases should cover the expectation of what the code should do, rather than covering the implementation. It's impossible to predict test coverage until the code is complete. The second shortcut helps avoid the distorting effects of any coverage requirements and mitigates the risk that too few tests will be written or that important cases will be omitted. The code is likely to have some effect on the tests written afterward, but it will still be easier to identify module boundaries and refactor code when necessary.
Over time, perhaps the most significant benefit is that the tests become an integral part of the development effort. No longer will testing be the last step in a process delaying delivery. In addition, a keen focus on requirements ensures that tests specify system behavior, which is preferable to a set of isolated details that bring very little system understanding and low levels of confidence.
A test-first automated test suite specifies all high-priority cases, instills high confidence in quality, and keeps maintenance effort to a minimum.
One question that might nag a team under pressure is this: "How much of an investment can be given to structural changes?" The answer may be surprising: at first, a test-first method might not have any significant impact on the design. However, after a team gains experience working in short iterations, it will find that structural improvements develop indirectly as a result of immediate feedback and subsequent refactoring.