The Seven Habits of Highly Insecure Software

Summary:

Severe functional bugs usually have overt symptoms: an application crash, corrupted data, or a garbled screen. Security bugs, though, tend to have subtler symptoms and habits. This article discusses the most common and difficult-to-notice symptoms of insecure software to help you track down these bugs during testing.

In his book The Seven Habits of Highly Effective People, Stephen R. Covey identifies behaviors of exceptional people who have proven themselves highly effective and productive. Covey observed these people, studied their habits, and taught their techniques and skills to others. In this article, I have taken a similar approach but with a much more unsavory subject of study: insecure software. My goal was to identify symptoms of insecure software, telltale signs that an application is behaving in a way that could create security vulnerabilities. By understanding how insecure software behaves, we can better detect and remove these types of bugs in our own software.

With this in mind, we scoured bug databases for the most malevolent and destructive security bugs ever to infest a released binary. We sifted through thousands of bug reports, incident reports, and advisories, and common characteristics started to emerge: habits like writing sensitive data to temporary files, leaving unprotected ports open, and foolishly trusting third-party components, to name a few. The result was our (James Whittaker's and my) book How to Break Software Security (Addison-Wesley, 2003), which outlines prescriptive testing techniques to expose security bugs in software. This article summarizes some of the insecure habits we found, identifies the symptoms of the most common security vulnerabilities, and explains how they can be detected and avoided.

Habit #1: Poorly Constrained Input
By far, the number one cause of security vulnerabilities in software is the failure to properly constrain input. The most infamous vulnerability resulting from this habit is the buffer overflow. Buffer overflows happen when developers use languages (like C and C++) that let them allocate a fixed amount of memory to hold user-supplied data. This usually doesn't present a problem when input is properly constrained or when input strings are of the length developers expected. When data makes it past these checks, though, it can overwrite memory reserved for other data and, in some cases, cause commands embedded in the input to be executed. Other unconstrained input can cause problems too, such as escape characters, reserved words, commands, and SQL (Structured Query Language) statements.

Habit #2: Temporary Files
Usually we think of the file system as a place to store persistent data: information that will still be there when the power is shut off. Applications, though, also write out temporary files, which store data only for a short period and are then deleted. Temporary files can create major security holes when sensitive data is exposed through them. Common (inappropriate) uses of temp files include storing user credentials (passwords) and unencrypted but sensitive information (such as CD keys).

Habit #3: Securing Only the Most Common Access Route
How many ways could you open a text document in Windows? You could double-click the file in Windows Explorer; open your favorite text editor and type the file name in the Open dialog; or type the file name into an Internet Explorer window. The truth is, if you put your mind to it, you could think of at least a dozen ways to open that file. Now imagine implementing a security control on that document. You would have to think of every possible access route to it, and chances are you would miss a few.

Developers fall into this dilemma, too. When requirements change, or when a new version of an application is being developed, security controls are often "added on" to an application. Also, when a security bug is reported, developers may patch the application to fix the particular input sequence reported and still leave other, underused access routes exposed.

About the author


Herbert H. Thompson, Ph.D., is director of security technology at Security Innovation LLC. He earned his Ph.D. in mathematics from the Florida Institute of Technology. Herbert is co-author (with James Whittaker) of How to Break Software Security: Effective Techniques for Security Testing (Addison-Wesley, 2003), and is the author of numerous papers on software security and testing.
