New tool for your toolbox

Actually, the title of this post is a bit misleading.  It should read “a new toolbox for your toolbox collection” :)

If you’ve ever done a web app pen test, you know that it gets messy really quickly.  Add in source code auditing, screenshots, movie captures, reporting, etc. etc. and you end up with tons and tons of tools running, large folders of data, and a headache when it comes time to put all this data into a presentable format.

Dinis Cruz is hoping to relieve some of this headache with his new OWASP O2 platform.  This single interface ties together source code auditing, some penetration testing tools, integration with 3rd party scanners (in the future), Windows productivity tools, a movie editor, and a whole lot more.

I installed it and have been playing with it.  As with any toolbox, there will always be things you would like to see, but this beta release (1.2) has a ton of features and hooks for many more.

So, go and try it!  You can get the code from http://www.o2platform.com/wiki/O2_Release/v1.1_Beta

Dmitry

dmitry.chan@gmail.com

  • Andre Gironda

    I remain very skeptical. Not about O2 and certainly not about Dinis.

    I am skeptical about our initial approaches to application security. We seem to think we know how best to engage with assessments given the resources and skills we have. We are biased.

    In order to remove that initial bias and establish a baseline of success, I believe that we need more instrumentation and fewer “approaches”. Both code review and penetration testing are wrong — and certainly combining them is also wrong (although it took me a while to come to this conclusion, since it appears to be a natural progression).

    I’d like to throw away CAT.NET, Fortify SCA, Ounce OSA/O2, Webinspect, AppScan, NTOSpider, Netsparker, Burp Suite Professional, et al. These tools are useless, especially on their own.

    What we need is to develop using dev-tester tools, especially if they already exist. We need to pair this test harness with instrumentation such as Fortify PTA. Over time, data mining this output will be incredibly useful. This can even be done in-IDE with Eclipse (Tomcat, Glassfish, Jetty) or Visual Studio (2010 has a new, better IIS Express alternative to Cassini), providing metrics very early on — say, at early iteration demos before a magazine demo or equivalent.

    I’m all for integrating other methods against the above baseline. For example, I think bug hunts, CC1-4 style code auditing (as described in “The Art of Software Security Assessment”), and design reviews are especially important. I think we can pair a lot of tests with other styles of testing, such as usability testing, accessibility testing, et al. Certainly tools such as O2 (I really like the integration of OWASP CodeCrawler into O2 for those CC1-4 audits) are going to help — but it should not be the center of attention. Instrumentation should be the primary way to engage assessments and assessment data.

  • dmitryc

    Yes, we are biased. In order to remove the bias, I think we need to throw everything out and begin with “Why?” Then, take each step asking that same question. We make money doing what we do. I work full time for a security software outfit. I’m as biased as they come. I’m going to think about this some more while I’m on vacation….it’ll be a good chance to read “The Art of Software Security Assessment” :) Below are my knee-jerk responses to the rest of your post. It’s not thought out, so I reserve the right to come back in a few weeks with a better formulated posting :)

    Instrumentation is good. Existing dev-tester tools are good. I define “good” or “success” as “a means to find bugs and improve the security posture of your app/organization”.

    Are the other tools useless? At first blush, I’d say no. If we define success by finding bugs and making the application more secure, then these tools do work to some degree. Further, each organization is going to have varying levels of resources and skills to throw at the “application security” problem. Their individual solution will be relative to their resources at hand. Said another way, some organizations will only be able to afford to run the “point, click, and report” tools. If their security posture increases and the level of expense is in line with the value of the resource being secured, then I would call that a success given the parameters.

    From a pragmatic standpoint, we’re just not going to have the time/resources/access to run a dev test harness for X amount of time. There is also a spectrum of access that each of us will get on each consulting gig. If we focus on “perfect world” scenarios, we are ill equipped to deal with “imperfect world” scenarios, which are 99% of the time…The public has been sold on “see your network as an attacker does”…never mind that it’s much better to see your network as an insider does…We can deal with what should be, or we can deal with what is…it’s imperfect – as are we.

    I’m going to order “The Art of Software Security Assessment”. I might blog more after reading the book :)

  • Jason

    Obviously security should start with developers and the initial coding of applications, but it shouldn’t end there.

    When CompSci degrees for coders include a ground floor integration of security, when security is so ingrained that it is no longer considered an add-on or separate step, then we can talk about throwing out the tools.