Exercise is hard – well, actually it’s not so much hard as it is tedious and easy to avoid. The exercise itself is straightforward, but doing it regularly is the hard part. As it turns out, software testing is similar to exercise in a lot of ways – it’s tedious, easy to skip, prone to binge behavior, and best done in consistent smaller amounts. And, of course, it’s very valuable and wildly effective at producing high-quality software.
As we settle into 2017 and find out whether or not we’re going to be successful at keeping our New Year’s resolutions, let’s be more proactive with our software testing habits, shall we?
Exercise works best when performed on a regular basis, even if the amount is slight. Just 10 minutes of exercise a day is better than 2 hours once a week. And what if you tried to do it all at once? Imagine if your goal was to exercise 30 minutes twice a week – a total of about 50 hours for the whole year. Would you wait until the last week of December and go to the gym 10 hours a day for 5 days straight? It sounds ridiculous, but oddly, this is the approach software developers and testers are taking in organizations all over the country and around the world.
Test suites that aren’t maintained become noisy and useless, and a noisy suite invites even more neglect. Static analysis misdeployed as a testing tool delivers a poor signal-to-noise ratio and will eventually be mostly ignored. The same is true for other core practices that should be part of your regular code exercise habits. The habits that improve software safety, quality, and security matter no matter what kind of code you’re developing – enterprise IT, embedded, IoT, medical, automotive, or aerospace. These best practices, deployed habitually, have proven themselves time and again across successful development teams in every industry.
Having a second or third set of eyes look over your code can help find serious problems before they reach the user. To make the practice part of your regimen, don’t overdo it: rather than trying to review an entire codebase, review just the changes related to one function or fix, and rely on static analysis to do the tedious work of enforcing style, best practices, and compliance.
Get a good preventative ruleset whose rules you can link directly to problems you’ve had in the past and problems you want to avoid in the future. Don’t just run it in QA; make sure it’s on developers’ desktops, where they get early notification and can fix problems. Any compliance requirements, such as functional safety, should be part of your static analysis configuration.
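As a sketch of what a small, preventative ruleset might look like, here is a minimal flake8 configuration – the tool and rule codes are just an illustration; your analyzer, rules, and limits will differ:

```ini
# setup.cfg (illustrative): enable only rules you can tie to real defects,
# so developers see a short, high-value list instead of a wall of warnings.
[flake8]
# F401: unused import, F841: unused local variable,
# E711/E712: suspicious comparisons to None/True
select = F401,F841,E711,E712
max-line-length = 100
```

Starting from a short, defensible list and growing it over time is what keeps the tool on developers’ desktops instead of being tuned out.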
Write unit tests as you go – one for each file or one for each change. Writing them as you code is faster because you already know what the code is supposed to do. Make them robust so they won’t break if moved to another machine or run on another day.
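To make “robust” concrete, here is a minimal sketch – the `save_report` function and its test are hypothetical – of a test that injects its environment instead of depending on the machine’s file system or today’s date:

```python
import datetime
import tempfile
from pathlib import Path

def save_report(text: str, directory: Path, today: datetime.date) -> Path:
    """Write a dated report file and return its path (hypothetical example)."""
    path = directory / f"report-{today.isoformat()}.txt"
    path.write_text(text)
    return path

def test_save_report():
    # A fragile test would write to a hard-coded path and call date.today(),
    # so it would fail on another machine or on another day. Injecting a
    # temporary directory and a fixed date keeps the result identical anywhere.
    with tempfile.TemporaryDirectory() as tmp:
        out = save_report("ok", Path(tmp), datetime.date(2017, 1, 1))
        assert out.name == "report-2017-01-01.txt"
        assert out.read_text() == "ok"

test_save_report()
```

The pattern generalizes: anything environmental (clock, paths, locale, network) gets passed in or faked, so the test’s outcome depends only on the code under test.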
The less frequently you run tests, the noisier they become – it’s the second law of thermodynamics in action. Make a real effort to slowly trim the noise each time you release. No one has the time to do it all at once, so just try to be a bit better this release than last time.
In addition to creating new unit tests, make sure the amount of code you’re testing is increasing by measuring your coverage. Work to get the number up incrementally – maybe 5% better this release than the last one. One way to push the number up is to take advantage of advanced testing technologies like service virtualization, which let you test complicated systems.
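One lightweight way to enforce that incremental improvement is a coverage “ratchet” that compares each release’s number against the previous one. This helper is a hypothetical sketch; the percentages themselves would come from whatever coverage tool you use:

```python
def check_coverage_ratchet(current_pct: float, baseline_pct: float,
                           step: float = 5.0) -> bool:
    """Return True if this release's coverage is at least `step`
    percentage points above the previous release's baseline."""
    return current_pct >= baseline_pct + step

# Example: last release measured 62% coverage, this release 68%.
assert check_coverage_ratchet(68.0, 62.0)        # gained at least 5 points
assert not check_coverage_ratchet(64.0, 62.0)    # only 2 points gained
```

Wiring a check like this into the build makes the “a bit better each release” goal automatic rather than aspirational.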
You can’t manage what you don’t measure. How do you know if things are getting better or worse? Gather data from your development activities, such as check-ins, bugs found, static analysis violations, coverage, etc. In the first pass, just gather it. After a couple of releases, you will be able to see whether the numbers are going up or down. For more info on good metrics practices, see my presentation on metrics that matter.
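As a sketch of the “just gather it first” idea – the metric names and numbers here are only examples – a couple of releases’ worth of simple counts is enough to see direction:

```python
from dataclasses import dataclass

@dataclass
class ReleaseMetrics:
    # Simple per-release counts: gather first, interpret later.
    checkins: int
    bugs_found: int
    sa_violations: int   # static analysis violations
    coverage_pct: float

def trend(prev: ReleaseMetrics, curr: ReleaseMetrics) -> dict:
    """Deltas between two releases; positive means the number went up."""
    return {
        "bugs_found": curr.bugs_found - prev.bugs_found,
        "sa_violations": curr.sa_violations - prev.sa_violations,
        "coverage_pct": round(curr.coverage_pct - prev.coverage_pct, 1),
    }

r1 = ReleaseMetrics(checkins=410, bugs_found=37, sa_violations=120, coverage_pct=58.0)
r2 = ReleaseMetrics(checkins=395, bugs_found=29, sa_violations=96, coverage_pct=63.5)
assert trend(r1, r2) == {"bugs_found": -8, "sa_violations": -24, "coverage_pct": 5.5}
```

Even this crude delta answers the question the paragraph asks: bugs and violations down, coverage up – things are getting better.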
Take time after a release to review how it went. After 90 days, you have a good idea of initial quality and can make some assessment. What went wrong? How can you avoid such things in the future? Use static analysis to put prevention in place for specific problems.
When improving your software testing practices, remember how hard it is to stick to New Year’s resolutions. We all need to improve, but don’t bite off more than you can chew. Figure out where to start and fix one thing at a time. Less is more – create small but consistent habits for maximum impact.
Arthur has been involved in software security and test automation at Parasoft for over 25 years, helping research new methods and techniques (including 5 patents) while helping clients improve their software practices.