[Owasp-testing] [Owasp-codereview] Code Review project and Code-Scanning-Tool(s)

Dinis Cruz dinis at ddplus.net
Mon Jan 22 09:48:55 EST 2007


I agree that any source-code analysis tool will have limitations (and in
fact KNOWing what those limitations are is as important as the limitations
themselves).

BUT since we are talking here about source-code-review guidelines, where
there is a 'guideline' about something to look at, I think that most of
those 'look for this' actions can be automated. In most cases the
code-automation tool will just be a grep on steroids that only indicates
'areas to look at and review', but in other cases (especially in VM-based
languages like .NET and Java) it should be able to provide better results.
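
Just to make that concrete, here is a rough sketch of the kind of 'grep on
steroids' I mean (only a sketch: the class name, the patterns and the
messages are mine for illustration, not taken from any existing tool). It
decides nothing by itself; it only points the reviewer at areas to look at:

    import java.io.*;
    import java.util.regex.*;

    public class GrepOnSteroids {

        // Illustrative checks only; a real tool would load these from the
        // Code Review project's 'look for this' guidelines.
        private static final String[] LABELS = {
            "SQL sink - review for injection",
            "Crypto usage - review algorithm and key handling",
            "Process execution - review the inputs",
        };
        private static final Pattern[] PATTERNS = {
            Pattern.compile("executeQuery|executeUpdate|createStatement"),
            Pattern.compile("Cipher\\.getInstance|MessageDigest"),
            Pattern.compile("Runtime\\.getRuntime\\(\\)\\.exec"),
        };

        public static void main(String[] args) throws IOException {
            scanDir(new File(args[0]));
        }

        // Walk the source tree, scanning every .java file.
        private static void scanDir(File dir) throws IOException {
            File[] entries = dir.listFiles();
            if (entries == null) return;
            for (int i = 0; i < entries.length; i++) {
                if (entries[i].isDirectory()) scanDir(entries[i]);
                else if (entries[i].getName().endsWith(".java")) scanFile(entries[i]);
            }
        }

        // Flag every line that matches a pattern - no verdict, just a pointer.
        private static void scanFile(File file) throws IOException {
            BufferedReader in = new BufferedReader(new FileReader(file));
            try {
                String line;
                int lineNo = 0;
                while ((line = in.readLine()) != null) {
                    lineNo++;
                    for (int i = 0; i < PATTERNS.length; i++) {
                        if (PATTERNS[i].matcher(line).find()) {
                            System.out.println(file.getPath() + ":" + lineNo
                                               + "  " + LABELS[i]);
                        }
                    }
                }
            } finally {
                in.close();
            }
        }
    }

Run against a source tree it just prints file:line plus the guideline that
triggered, which is exactly the 'areas to look at and review' output I mean.
The better results in VM-based languages would come from replacing the
regexes with real symbol and type information.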

I agree with the compiler concept, and in .NET and Java I would also use the
'sandbox' paradigm, where code that can be executed inside a 'safe' sandbox
can automatically be considered safe from a whole series of
vulnerabilities.
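
As a minimal sketch of that sandbox idea on the Java side, using the
standard java.security Policy/SecurityManager machinery (the class name and
the deny-everything policy are mine for illustration only): any code that
still does its job under a policy like this clearly cannot, for example,
touch the file system:

    import java.io.*;
    import java.security.*;

    public class SandboxSketch {

        public static void main(String[] args) {
            // A policy that grants application code no permissions at all.
            // Core JDK classes keep their built-in permissions, so the JVM
            // itself keeps working.
            Policy.setPolicy(new Policy() {
                public PermissionCollection getPermissions(CodeSource cs) {
                    return new Permissions();   // an empty permission set
                }
                public boolean implies(ProtectionDomain d, Permission p) {
                    return false;               // deny everything
                }
                public void refresh() { }
            });
            System.setSecurityManager(new SecurityManager());

            // Any 'dangerous' operation the sandboxed code attempts now fails:
            try {
                new FileInputStream("/etc/passwd").close();
                System.out.println("file read allowed - not sandboxed");
            } catch (SecurityException e) {
                System.out.println("blocked by the sandbox: " + e);
            } catch (IOException e) {
                System.out.println("not blocked, but the file could not be read: " + e);
            }
        }
    }

The interesting part for the Code Review project is the inverse: if an
application is known to run (and keep working) under a tight policy like
this, a whole series of checks in the guide could be marked as 'not
applicable' automatically.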

I think all I am saying is that, with the current focus on the Code Review
project, it would be a great opportunity to co-develop such a
code-auditing tool (which 99% of current developers desperately need). And
as for the technical difficulties of implementing such a tool, if I can, I
will be very active in resolving them.

Dinis Cruz
Chief OWASP Evangelist
http://www.owasp.org

On 1/22/07, Jeff Williams <jeff.williams at aspectsecurity.com> wrote:
>
> > I know that there are exceptions (and let's keep the business logic
> > vulnerabilities out of this one) but most issues should be detectable.
>
>  I agree we should have a better framework for analyzing code for simple
> issues.  LAPSE is interesting, but is really a one-trick pony.  LAPSE does
> source-to-sink dataflow analysis, so it's pretty good for analyzing things
> like SQL injection and XSS.  But it has no ability to analyze encryption,
> logging, access control, authentication, error handling, concurrency, etc…
> And it only works on Java.
>
>
>
> I think "most issues should be detectable" is too aggressive (I've done
> quite a lot of work in this space).  That's what the commercial static
> analysis tool vendors are trying to do.  I suggest we focus on tools that
> assist the manual code reviewer, and DO NOT try to find problems
> automatically.
>
>
>
> For example, a tool that finds and flags all the encryption code is easy
> and valuable.  Maybe it helps me navigate the code with "security goggles"
> on.  A tool that attempts to analyze the encryption code and determine if it
> is sound is ridiculously hard and will have lots of false alarms.
>
>
>
> The tool must have some compiler-like features – at least symbol
> resolution, because grep is too inaccurate.
>
>
>
> --Jeff
>


