I’m a software developer. Not the best, but I give a shit.
Early in my career, my ratio of development to testing time was typically 1:3. When combined with planning, auditing, and meetings, it was surprising how little time was spent “doing my actual job”. As a result, my projects always took longer than other teams’ and it was common for management to ask, “why can’t you meet your self-imposed deadlines?” 😕
If you give a shit, this type of feedback obviously hurts. To add insult to injury, the software I was developing was used to support the U.S.’s intelligence gathering mission. Way to let down your country, Kyle!
Thankfully, my projects worked like a champ when it came to operational use (where it mattered most). I’m sure you can imagine the stakes when your software crashes on an unsuspecting actor’s computer. 🕵️🔥🖥️
While my coworkers were busy debugging crashes and tracking down race conditions, I spent my time learning how industry leaders (Microsoft and later Google) were creating their significantly more complicated software in similar amounts of time. From this, I learned my approach to Agile Software Development was functional, but way too stinkin’ rigid. Oh, the irony: a computer hacker, someone whose job is abusing unintended functionality, was lacking creativity. More specifically, I was spending way too much time perfecting unit tests and mocks, and not leveraging enough automated testing.
To best illustrate this, let’s look at Intel Security/McAfee’s build phase of their Agile development lifecycle:
Notice how the build and functional testing cogs are nearly proportional in size with smaller time segments spent on code reviews and static analysis tools? These steps were nearly identical to my approach. Unfortunately, my functional testing process was an order of magnitude larger and didn’t include dynamic or fuzz testing.
When it comes to introducing new testing methodologies into a development process, I can’t stress how important it is to measure. Since my previous sprints used a somewhat consistent weight of points, I had years’ worth of burndown charts to determine how much time I spent on functional testing. After a bit of analysis, it was clear that a ridiculous amount of time was spent maintaining complicated mocks and addressing quality assurance issues. With this data, I got management approval to spend two months creating a Python-based dynamic testing framework which would repay its development debt within a year.
In its simplest form, the new testing framework would run our compiled code and provide a templated way for developers to exercise their intended functionality (e.g. given an input and a system under certain conditions, assert the output is as expected). Since this framework was written in Python, development was significantly faster than writing in C. As a result, my team completed the project, whipped up a helper library, and integrated the tests into our Continuous Integration solution within the allotted time. We also uncovered six new bugs in two high value projects which repaid the development cost in quality assurance gains alone. 😎
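To make the idea concrete, here’s a minimal sketch of what that “run the compiled code, assert on the output” template might look like. This is not our actual framework; the `run_target` and `assert_behavior` names are hypothetical, and `/bin/echo` just stands in for a compiled system under test:

```python
import subprocess

def run_target(binary, args, stdin_data=b"", timeout=10):
    """Run a compiled binary and capture its exit code and output."""
    proc = subprocess.run(
        [binary, *args],
        input=stdin_data,
        capture_output=True,
        timeout=timeout,
    )
    return proc.returncode, proc.stdout, proc.stderr

def assert_behavior(binary, args, stdin_data, expected_rc, expected_out):
    """Given an input and conditions, assert the output is as expected."""
    rc, out, _ = run_target(binary, args, stdin_data)
    assert rc == expected_rc, f"exit code {rc}, expected {expected_rc}"
    assert expected_out in out, f"output {out!r} missing {expected_out!r}"

# Example: /bin/echo should reproduce its argument on stdout.
assert_behavior("/bin/echo", ["hello"], b"", 0, b"hello")
```

Because each test is just “binary + input + expected output”, developers can stamp out new cases without touching mocks at all, which is where the time savings came from.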
Although dynamic testing was a huge win for my organization, bugs still slipped through the cracks. It became clear that we needed something automated to complement our “expected functionality” testing.
As a part-time exploit developer, I was familiar with the common mistakes which lead to faulty/vulnerable software. When in the mood, I’d typically browse through open source projects and manually review code for insecure strcpy’s or integer promotion bugs. However, when I didn’t have source code, I’d complement my reverse engineering efforts by creating a script to rapidly send malformed data to the compiled application. When the program inevitably crashed, I’d review the stack trace and core dumps to uncover bugs with minimal effort. This process is often called “fuzz testing” or “fuzzing”.
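The “script that rapidly sends malformed data” described above can be surprisingly small. Below is a toy mutation fuzzer under assumed conditions: it flips random bytes in a valid seed input, pipes each mutant to the target’s stdin, and flags runs that die on a signal. Target choice and iteration counts are illustrative only:

```python
import random
import subprocess

def mutate(data, n_flips=8):
    """Randomly corrupt bytes in a valid sample to produce malformed input."""
    buf = bytearray(data)
    for _ in range(min(n_flips, len(buf))):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def fuzz(binary, seed, iterations=100):
    """Feed mutated inputs to the target and record any crashing cases."""
    crashes = []
    for i in range(iterations):
        sample = mutate(seed)
        proc = subprocess.run([binary], input=sample,
                              capture_output=True, timeout=5)
        # On POSIX, a negative return code means the process died on a
        # signal (e.g. -11 for SIGSEGV) -- a crash worth triaging.
        if proc.returncode < 0:
            crashes.append((i, sample, proc.returncode))
    return crashes

# Example: /bin/cat should survive arbitrary stdin without crashing.
print(fuzz("/bin/cat", b"hello world", iterations=20))
```

A real setup would also save the crashing sample and core dump for the stack-trace review mentioned above, but the loop itself really is this simple.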
With this idea in mind, I proposed we integrate fuzzing into our build process. To my surprise, there was a fair amount of pushback since fuzzing was largely regarded as “more art than science”. Thankfully, cooler heads prevailed and we implemented a combination of mutation-based and generation-based fuzzers. Once again, we uncovered a slew of unexpected bugs that were largely discovered without any interaction (our fuzzer constantly ran while we focused on development). Management rejoiced. 😀
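Where a mutation fuzzer corrupts known-good samples, a generation fuzzer builds inputs from scratch using a specification of the format. Here’s a toy sketch, assuming a hypothetical `key=value` config format, that deliberately generates the boundary cases (huge strings, extreme integers, empty fields) a parser is most likely to mishandle:

```python
import random

def gen_key():
    """Generate a plausible-looking config key."""
    return "".join(random.choice("abcdef")
                   for _ in range(random.randint(1, 8)))

def gen_value():
    """Pick a value from a menu of parser-hostile cases."""
    choices = [
        lambda: str(random.randint(-2**31, 2**31)),  # extreme integers
        lambda: "A" * random.randint(0, 4096),       # oversized strings
        lambda: "",                                  # empty field
    ]
    return random.choice(choices)()

def gen_config_line():
    """Assemble a structurally valid line with adversarial contents."""
    return f"{gen_key()}={gen_value()}"

for _ in range(5):
    print(gen_config_line())
```

Because the output is structurally valid, it gets past the target’s input validation and exercises the deeper parsing logic that random byte-flipping rarely reaches, which is why combining both styles paid off.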
Nowadays, fuzzing is a widely accepted practice that effectively discovers software defects when implemented correctly. Just take a look at Microsoft’s official software development lifecycle or their Project Springfield.
Over the course of a few years, I managed to reduce my development to testing ratio to 1:1 by adding dynamic and fuzz testing to my Agile SDL practices. The quality of my software also improved significantly. Unfortunately, not everyone gives shits equally. This was painfully obvious when Chris Bisnett and I were recently asked to comment on the 200+ vulnerabilities discovered in various Trend Micro products. Since that discussion, two questions continue to pop up in my conversations:
- How does a security company allow these vulns to happen?
- How can companies avoid flaws like these in their products?
Although there are dozens of different excuses, I know first-hand that most companies do not embrace automated testing like they should. Since concepts like Test-Driven Development may seem impossible to implement under tight product deadlines, I’d highly encourage any development shop with similar quality assurance issues to consider the route we took. A great place to start is with the following books:
If fuzzing is up your alley, consider attending Fuzzing For Vulnerabilities! Over two days, Chris Bisnett and I cover all the nitty-gritty details of the most efficient fuzzing methodologies. Each student will also walk away with a personal Binary Ninja license. Our next class is at Black Hat USA 2018 this August in Las Vegas. However, private sessions are also available upon request (just drop me an email at kyle[at]huntresslabs.com).
I hope this blog helps push you towards better software with less effort. For me, these practices helped boost my confidence just as much as my code quality. Please don’t hesitate to hit me up on Twitter if you have any questions!