Since the 1980s there has been an “either-or” mentality in software testing: either you test the requirements or you test the design and code. It is a common myth that testing the requirements is all that is needed. If requirements-based testing were sufficient, there would be no failed projects and no headline-making software failures. Clearly this popular testing approach is not working. The facts [1] show that organizations that deliver reliable software on time do both requirements and design/code testing. Here is why.
Testing only requirements may, at best, cover 40% of the code. Requirements-based testing won’t cover:
· Endurance or peak loading (Caused the Iowa Democratic caucus app failure and the Patriot missile system’s failure during a SCUD attack)
· Timing (Caused the Therac-25 accidents and the 2003 Northeast blackout)
· Data definition (Caused the Ariane 5 and F-22 International Date Line failures; see the sketch after this list)
· State transitions (Multiple events due to dead states, prohibited state transitions, etc.)
· Logic (Caused the AT&T Mid-Atlantic outage in 1991)
· Fault injection (Incorrect fault handling caused the Apollo 11 lunar-landing alarms and the Qantas Flight 72, Solar and Heliospheric Observatory, and NASA Spirit rover failures)
· Requirements that are missing crucially important details (Another cause of the F-22 International Date Line failure)
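To make the gap concrete, here is a minimal sketch of a data-definition test, written in the spirit of the Ariane 5 failure, where a 64-bit floating-point value was converted to a 16-bit signed integer without a range check. The function name and values below are hypothetical, not taken from any real flight code; the point is that a requirements-only suite feeds nominal values and never exercises the declared range of the target type.

    import pytest

    def horizontal_bias_to_int16(value: float) -> int:
        # Hypothetical conversion routine, in the spirit of the Ariane 5
        # bug: pack a 64-bit float reading into a signed 16-bit field.
        result = int(value)
        # The guard the Ariane 5 code lacked: reject values that cannot
        # be represented in a signed 16-bit integer.
        if not -32768 <= result <= 32767:
            raise OverflowError(f"{result} does not fit in int16")
        return result

    def test_nominal_value_converts():
        # This is all a requirements-based test would exercise.
        assert horizontal_bias_to_int16(1234.9) == 1234

    def test_out_of_range_value_is_rejected():
        # Data-definition test: deliberately feed a value outside the
        # declared range of the target type.
        with pytest.raises(OverflowError):
            horizontal_bias_to_int16(70000.0)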
Why, then, is this approach so popular? Several reasons:
· Engineers have difficulty with “necessary but not sufficient”: testing every requirement is necessary, but it is not sufficient to cover the code.
· People who don’t understand software engineering started this myth in the 1980s because they assumed that testing all requirements is equivalent to testing all of the code.
· Software engineers hate to test design and code, and hence do their best to propagate this myth.
So how does one change this popular but ineffective approach? Requirements management tools such as DOORS present obstacles to testing anything but requirements. Some alternatives include:
· Pull more details into the software requirements specification (SRS). The more detailed the SRS, the more code coverage you get when testing.
· Include pictures and tables as informative material (#4 on the top ten list)
· Develop testable requirements for what can go wrong (#8 on the top ten list; see the fault-injection sketch after this list)
· Include testing of the design (#6 on the top ten list)
· Test the mission and not just one requirement at a time (#8 on the top ten list)
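As one example of a testable “what can go wrong” requirement, here is a minimal fault-injection sketch. The controller, sensor, and fallback behavior are all hypothetical; the pattern is simply to script the failure path that nominal, requirements-driven inputs never reach and assert that the fault is handled rather than propagated.

    from unittest import mock

    class SensorFault(Exception):
        """Raised by a (hypothetical) sensor driver on a hardware error."""

    class FlightController:
        # Toy controller: holds the last good reading so a single sensor
        # fault degrades gracefully instead of crashing the control loop.
        def __init__(self, sensor):
            self.sensor = sensor
            self.last_good = 0.0

        def sample(self) -> float:
            try:
                self.last_good = self.sensor.read()
            except SensorFault:
                pass  # hypothetical requirement: hold the last good value
            return self.last_good

    def test_sensor_fault_is_absorbed_not_propagated():
        # Fault injection: script one good reading, one fault, one recovery.
        sensor = mock.Mock()
        sensor.read.side_effect = [9.8, SensorFault(), 9.7]
        ctrl = FlightController(sensor)
        assert ctrl.sample() == 9.8   # nominal reading
        assert ctrl.sample() == 9.8   # fault absorbed; last good value held
        assert ctrl.sample() == 9.7   # recovery once the fault clears

The same fault-injection pattern scales up: each failure mode identified in a hazard analysis becomes a scripted fault and an asserted response, giving you a regression suite for the paths that matter most when things go wrong.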