[Owasp-testing] [Owasp-codereview] Code Review project andCode-Scanning-Tool(s)

Eoin eoin.keary at owasp.org
Mon Jan 22 16:21:57 EST 2007

On 22/01/07, Dinis Cruz <dinis at ddplus.net> wrote:
> I agree that any source-code analysis tool will have limitations (and in
> fact KNOWING what those limitations are is as important as the limitations
> themselves).
> BUT since we are talking here about source-code-review guidelines, where
> there is a 'guideline' about something to look at, I think that most of
> those 'look for this' actions can be automated. In most cases the
> code-automation tool will just be a grep on steroids, where it only indicates
> 'areas to look at and review', but in others (especially in VM-based
> languages like .Net and Java) it should be able to provide better results.

*Agreed, I believe we should start in a basic manner, even if this means
writing a rule set based on OWASP Guide best practice. This would be akin to
the CIS guidelines and associated tools.*
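Such a rule set could start out as little more than pattern/guidance pairs. A minimal sketch (the patterns and guideline names below are illustrative examples, not an actual OWASP rule set):

```python
# Illustrative sketch: encoding review guidelines as greppable rules.
# Patterns and guideline references are hypothetical, not an official
# OWASP rule set.
import re

RULES = [
    (re.compile(r"executeQuery\s*\(.*\+"),  # SQL built by concatenation
     "Possible SQL injection (OWASP Guide: Data Validation)"),
    (re.compile(r"Runtime\.getRuntime\(\)\.exec"),
     "OS command execution: review arguments for tainted input"),
    (re.compile(r"printStackTrace\s*\("),
     "Error handling: stack trace may leak internals to users"),
]

def check_line(line):
    """Return the guidance strings for every rule this line matches."""
    return [msg for pattern, msg in RULES if pattern.search(line)]
```

Each rule points the reviewer back at the relevant guideline rather than claiming to prove a vulnerability.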

I agree with the compiler concept, and in .Net and Java I would also use the
> 'sandbox' paradigm, where code that is able to be executed in 'safe'
> sandboxes can automatically be considered 'safe' from a series of
> vulnerabilities.

*Sandboxing is great, believe me, but I don't think sandboxing can excuse
bad design or implementation.*

I think all I am saying is that with the current focus on the Code Review
> project, it would be a great opportunity to co-develop such a
> code-auditing tool (which 99% of current developers desperately need). As
> for the technical difficulties of implementing such a tool, if I can, I will
> be very active in resolving them.

*Totally agree, and it is very important too. As was mentioned before, a
group such as OWASP has many pen-test tools but not so many secure
application development tools.*
*My chosen avenue would be a "grep on steroids" (static code analysis)
initially, but we should consider, as part of the roadmap, including dynamic
code analysis (much more difficult). We should also start with Java, but design
the tool to be extensible so we could include C#, C++, etc. in the future.*
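In "grep on steroids" form, the tool would do no real analysis at all: it would just scan Java source and point the human reviewer at areas to look at. A minimal sketch of that idea (the suspect patterns here are illustrative only):

```python
# Minimal "grep on steroids" sketch: scan Java source and report
# (file, line, note) triples for a human reviewer to follow up on.
# The patterns below are illustrative examples, not a vetted rule set.
import re

SUSPECT_PATTERNS = {
    r"\.executeQuery\s*\(": "database query: check for string concatenation",
    r"new\s+Random\s*\(":   "java.util.Random: not suitable for security decisions",
    r"getParameter\s*\(":   "HTTP input: trace where this value flows",
}

def scan(source, filename="<input>"):
    """Yield (filename, line_no, note) for each suspect line found."""
    for line_no, line in enumerate(source.splitlines(), start=1):
        for pattern, note in SUSPECT_PATTERNS.items():
            if re.search(pattern, line):
                yield (filename, line_no, note)
```

Because the patterns live in a plain table, extending the tool to C# or C++ later would mostly mean swapping in a different pattern set.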

Dinis Cruz
> Chief OWASP Evangelist
> http://www.owasp.org
> On 1/22/07, Jeff Williams <jeff.williams at aspectsecurity.com> wrote:
> >
> >  > I know that there are exceptions (and let's keep the business logic
> > vulnerabilities out
> >
> > > of this one) but most issues should be detectable.
> >
> > I agree we should have a better framework for analyzing code for simple
> > issues.  LAPSE is interesting, but is really a one-trick pony.  LAPSE does
> > source-to-sink dataflow analysis, so it's pretty good for analyzing things
> > like SQL injection and XSS.  But it has no ability to analyze encryption,
> > logging, access control, authentication, error handling, concurrency, etc…
> > And it only works on Java.
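(LAPSE's internals aren't shown here, but the source-to-sink idea it implements can be illustrated in miniature: data from a "source" is marked tainted, taint follows assignments, and any tainted value reaching a "sink" is reported. This toy sketch is an illustration of the concept, not LAPSE's actual analysis.)

```python
# Toy source-to-sink dataflow sketch (the idea behind LAPSE, not LAPSE
# itself): taint flows from sources through assignments to sinks.

SOURCES = {"request.getParameter"}   # where untrusted data enters
SINKS = {"stmt.executeQuery"}        # where tainted data is dangerous

def trace(statements):
    """statements: ('assign', var, rhs) or ('call', fn, arg) tuples.
    Returns the sink calls reached by tainted data."""
    tainted = set()
    findings = []
    for stmt in statements:
        if stmt[0] == "assign":
            _, var, rhs = stmt
            if rhs in SOURCES or rhs in tainted:
                tainted.add(var)      # taint propagates via assignment
        elif stmt[0] == "call":
            _, fn, arg = stmt
            if fn in SINKS and arg in tainted:
                findings.append((fn, arg))
    return findings
```

A real analysis also has to handle sanitizers, aliasing, and inter-procedural flow, which is where most of the difficulty lies.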
> >
> >
> >
> > I think "most issues should be detectable" is too aggressive (I've done
> > quite a lot of work in this space).  That's what the commercial static
> > analysis tool vendors are trying to do.  I suggest we focus on tools that
> > assist the manual code reviewer, and DO NOT try to find problems
> > automatically.
> >
> >
> >
> > For example, a tool that finds and flags all the encryption code is easy
> > and valuable.  Maybe it helps me navigate the code with "security goggles"
> > on.  A tool that attempts to analyze the encryption code and determine if it
> > is sound is ridiculously hard and will have lots of false alarms.
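(The "find and flag" half of that split really is easy. A sketch of the navigation aid, which deliberately makes no attempt to judge whether the crypto is sound; the marker list is an illustrative guess at common Java crypto identifiers:)

```python
# Sketch of the "security goggles" idea: flag crypto-related lines so a
# reviewer can jump to them.  It does NOT judge soundness.
import re

CRYPTO_MARKERS = re.compile(
    r"javax\.crypto|java\.security|Cipher\.getInstance|MessageDigest|SecretKey"
)

def flag_crypto_lines(source):
    """Return (line_no, line) pairs worth a human reviewer's attention."""
    return [(n, line.strip())
            for n, line in enumerate(source.splitlines(), start=1)
            if CRYPTO_MARKERS.search(line)]
```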
> >
> >
> >
> > The tool must have some compiler-like features – at least symbol
> > resolution, because grep is too inaccurate.
> >
> >
> >
> > --Jeff
> >
> --
> _______________________________________________
> Owasp-testing mailing list
> Owasp-testing at lists.owasp.org
> http://lists.owasp.org/mailman/listinfo/owasp-testing

Eoin Keary OWASP - Ireland