[OWASP-TESTING] Comment on testing guide and contrib for part 2
mark at curphey.com
Wed Aug 4 07:29:24 EDT 2004
I hear what you are saying, but running a web app scanner on a web app is like testing a car's safety by front-impact testing only. What happens on side impacts? Rear impacts? What about roll safety? Is it flame-proof? Do the seats give off cyanide when set alight? Do the seat belts work? I am not saying they don't have a place; you articulate some reasons why they may be appropriate, but they are very, very limited and very, very inaccurate.
We have to be careful to set the pace and say what SHOULD be done rather than what IS done today. OWASP has the ability to lead the market into doing the right thing. That's part of the beauty of it and part of our duty, in my opinion.
As I can't type as fast as my stomach wants me to move to the breakfast room, how about a teleconference next week to debate this whole issue?
BTW, I have an ASP.NET C# parser I wrote that is shaping up. The major code review players (Fortify, Ounce and Coverity) are all working on Java and ASP.NET modules for their platforms.
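For anyone curious what the regex-based scanning Orac describes below looks like in practice, here is a minimal sketch in Python. It is not from either of our parsers; the pattern set and the scan_asp helper are purely illustrative, and a real tool would need many more patterns and a proper parse to cut down false positives.

```python
import re

# Illustrative patterns only: flag classic-ASP lines that combine
# unvalidated request input with SQL strings, output, or dynamic execution.
PATTERNS = {
    "possible SQL injection": re.compile(
        r'("\s*&\s*Request|Request\.(QueryString|Form).*&\s*")',
        re.IGNORECASE),
    "possible XSS": re.compile(
        r'Response\.Write\s*\(?\s*Request', re.IGNORECASE),
    "dynamic code execution": re.compile(
        r'\b(Eval|Execute|ExecuteGlobal)\s*\(', re.IGNORECASE),
}

def scan_asp(source):
    """Return (line_number, finding, line_text) tuples for suspicious lines."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for finding, pattern in PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, finding, line.strip()))
    return findings

sample = '''sql = "SELECT * FROM users WHERE id=" & Request.QueryString("id")
Response.Write Request.Form("comment")'''

for lineno, finding, text in scan_asp(sample):
    print(lineno, finding, text)
```

As the thread notes, patterns like these tend to be supplier-specific because coding styles differ, which is exactly why a real parser beats regexes over time.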
---- Orac <orac at uncon.org> wrote:
> I know the automated scanners are not good practice from a complete
> security solution and wouldn't rely on them as anything more than
> preliminary reconnaissance in a code review.
> The benefit is that for the investment of 30 minutes I can get a
> (admittedly very partial) view on the quality of some code that would
> never get reviewed otherwise, perhaps because the project is too small
> financially to pay for the work or because the Business Impact of a
> compromise of the system is too low. Unfortunately we need very much to
> prioritise the time taken over security as there are many systems and
> budgets are limited.
> From previous experience I have been able to grade suppliers based on
> previous code reviews / pen tests in such a way that I know when I have
> to fight to get involved because I don't think the supplier 'gets it'.
> Being able to automatically scan across all code would also give me
> another good measure to identify suppliers who I should worry about. It
> doesn't act as a test of security, passing an automated scan would
> still require further work on high Business Impact systems but it does
> allow me to do more with less on the smaller ones that slip through
> the gaps :)
> I am in the very early stages of looking at a parser for ASP (classic
> at the moment as many of our suppliers have yet to move to .Net) which
> can hopefully be used to search for previously identified bugs or flaws
> so that it can get better over time. I have to date done this with
> regexes but due to differences in coding styles these tend to be
> supplier specific. If it ever gets useful I will share it but it's a
> long way from that at the moment.
> In any case a 3rd party (or at least End User) review test process
> would be fantastic. We have implemented security into various levels
> of our vaguely ITIL compliant custom IT project process so I can't
> really punt that out but am willing to pitch in with advice on a
> standard approach.
> On 4 Aug 2004, at 11:49, Mark Curphey wrote:
> > I think there are a couple of things that might be worth considering.
> > 1. Consider testing to be split (down the road) for 3rd party reviews
> > and 1st party reviews.
> > The original intent of this project was from a 1st party perspective
> > i.e. I work for bank X and want to build a testing program for my own
> > company.
> > I guess we have mainly people who do third party reviews on the list.
> > 2. Gary McGraw says it best;
> > "If you fail a penetration test you know you have a problem. If you
> > pass a penetration test you have no idea that you don't have a bad
> > problem."
> > I am working on building some tools that benchmark the automated
> > scanners. You know they typically find less than 10% of vulns in a
> > standard site right ?
> > ---- orac <orac at uncon.org> wrote:
> >> <snip>
> >> Given the time and cost of code reviews we just don't get to do them
> >> unless serious problems are thrown up by any testing that is
> >> performed.
> >> It is a frustrating position sometimes that could be helped by having
> >> automated scanning tools for glue-code languages like (in my
> >> experience
> >> here) ASP, JSP, T-SQL and PL-SQL. Being able to at least run an
> >> automated scanner over code that is unlikely to be manually reviewed
> >> can
> >> at least identify vendors who have code quality issues that can then
> >> be
> >> used as justification for more detailed work.
> >> Anyway just throwing in the perspective from the end users.
> >> Regards
> >> Orac
> >> <snip>
> > _______________________________________________
> > owasp-testing mailing list
> > owasp-testing at lists.sourceforge.net
> > https://lists.sourceforge.net/lists/listinfo/owasp-testing