Security testing tends to take a different approach from that of traditional testing. In this column, Herbert Thompson invites us into his world and offers up a peek into an average day for an experienced security tester.
9:16 AM Our diligent security tester begins his day by reading bug reports. First, he examines functional bugs reported against the product the previous day. Unlike many of his co-workers, the security tester focuses on the low-severity bug reports. He has learned that the most severe security bugs often have subtle symptoms, like unspecified temporary files, extra network packets, and bad default values. Functional testing often misses these insecure side effects of normal application behavior. But even if they are noticed, they are usually reported as low-severity functional or performance bugs.
11:12 AM The security tester reproduces two of the interesting functional bugs logged by testers the previous day. The company's product includes a spell check feature, and the product will likely be deployed in a server-based, multi-user environment. The security tester comes across a bug report titled "Extra file created during the spell check operation, then deleted." This bug was classified as a possible performance issue and designated low severity, a certain do-not-fix and do-not-investigate death sentence for any bug.
The wise security tester knows, however, that the developers of the spell check feature likely knew very little about how the product as a whole works, and even less about the product's security concerns. He suspects that the temporary file was created to enhance performance and speculates that it contained some of the edited document's contents during its short lifespan. After reproducing the bug, he finds that this is indeed the case. "Is this a security problem?" our security-testing hero wonders. After all, the file was deleted after the spell check was done. Then he remembers a vulnerability he had seen on BugTraq several months earlier, something to do with file permissions.
The tester checks the permissions on the document being spell checked and sees that they are appropriate: only the current system user or an administrator can read or modify the file. Next, he checks the permissions on the temporary file. Jackpot! The file is world-readable, meaning that any curious user on the system can read its contents. All an attacker would need to do is run a process that constantly watches another user's directory for such a file, then grab it the moment spell check runs. Searching through discarded functional bugs had turned up another security bug.
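The check the tester performed can be sketched in a few lines of Python on a Unix-like system. This is an illustration only, not code from the product in the story: the helper name `is_world_readable` is invented here, and the demonstration simulates the bug by loosening permissions on a scratch file.

```python
import os
import stat
import tempfile

def is_world_readable(path):
    """Return True if the 'other' read bit is set, so any user can read the file."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IROTH)

# Demonstration: a temp file whose permissions mimic the leaky spell-check
# file, then the same file locked down to owner-only access.
fd, scratch = tempfile.mkstemp()
os.close(fd)

os.chmod(scratch, 0o644)            # world-readable: the bug the tester found
print(is_world_readable(scratch))   # True

os.chmod(scratch, 0o600)            # owner-only: what a temp file should use
print(is_world_readable(scratch))   # False

os.unlink(scratch)
```

A defensive note: `tempfile.mkstemp` already creates files with owner-only permissions, which is one reason security testers flag any temporary file created by hand-rolled code instead.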
12:45 PM The tester microwaves and consumes two pepperoni pizza Hot Pockets™.
2:07 PM The tester opens an Internet browser and meticulously scours public vulnerability databases for the previous day's security vulnerabilities reported against all software products, not just the ones made by his company. The shrewd security tester spends an hour doing this every day and has found many more security bugs in his company's products as a result.
3:30 PM The tester gathers his team for the weekly threat brainstorming session. The tester has learned that security testing is only effective if those who are testing know the security risks to their application. One team member reports on interesting vulnerabilities found in their competitors' products during the past week. Another leads the team in threat modeling exercises, thinking through the risks to their application using models like STRIDE (Writing Secure Code, 2nd Edition) and those mentioned in How to Break Software Security. Based on those risks, the group brainstorms "hack-cases" (security-specific test cases) that must be conducted during the next week.
5:12 PM A new team member walks into the wise security tester's office. He's concerned and baffled by the group's non-rigid and non-specification-driven testing style. The wise security tester confidently replies, "Specification-based testing is important, but for