[Owasp-testing] [Owasp-codereview] Code Review project and Code-Scanning-Tool(s)

Eoin eoin.keary at owasp.org
Wed Jan 24 06:39:31 EST 2007


Yep, having it detached from an IDE (like Eclipse/WSAD...) is better for me.
Can you send the interested parties an overview of Orizon so that:
1. We know what is involved.
2. We can see if it's suitable.
3. We can get a roadmap together and a plan of action.


On 24/01/07, Paolo Perego <thesp0nge at gmail.com> wrote:
>
> Orizon was born to be a framework for people who want to build a code
> review tool. It is indeed a tool itself, but its main goal is to provide a
> set of APIs.
> Orizon's approach is different from LAPSE's.
> Orizon doesn't rely on Eclipse for parsing Java sources. It translates
> source files (Java, C#, and others in the future) into XML and applies XML
> checks over the translated source.
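>
> A minimal sketch of the XML-translation idea (the schema, element names,
> and check below are invented for illustration and are not Orizon's actual
> format):
>
> import java.io.ByteArrayInputStream;
> import javax.xml.parsers.DocumentBuilderFactory;
> import javax.xml.xpath.XPath;
> import javax.xml.xpath.XPathConstants;
> import javax.xml.xpath.XPathFactory;
> import org.w3c.dom.Document;
> import org.w3c.dom.NodeList;
>
> public class XmlCheckSketch {
>     public static void main(String[] args) throws Exception {
>         // Pretend the front end has already translated a Java source file
>         // into an XML tree of method invocations (schema is made up here).
>         String translated =
>             "<unit file=\"UserDao.java\">"
>             + "<invocation name=\"executeQuery\" line=\"42\"/>"
>             + "<invocation name=\"println\" line=\"43\"/>"
>             + "</unit>";
>
>         Document doc = DocumentBuilderFactory.newInstance()
>             .newDocumentBuilder()
>             .parse(new ByteArrayInputStream(translated.getBytes("UTF-8")));
>
>         // An XML "check" is then just a query over the translated source,
>         // e.g. flag every call to executeQuery for manual review.
>         XPath xpath = XPathFactory.newInstance().newXPath();
>         NodeList hits = (NodeList) xpath.evaluate(
>             "//invocation[@name='executeQuery']", doc, XPathConstants.NODESET);
>
>         for (int i = 0; i < hits.getLength(); i++) {
>             System.out.println("Review candidate at line " + hits.item(i)
>                 .getAttributes().getNamedItem("line").getNodeValue());
>         }
>     }
> }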
>
> In fact, in the part of the Code Review Guide I'll contribute to, I'll use
> the experience I'm gathering while writing Orizon :)
>
> Again, since I'm mainly writing a framework rather than just a standalone
> tool, and since, as you know, I'm involved in the Code Review Project too,
> I'm happy to write Orizon to cover the aspects that will come up in our
> Code Review Guide :)
> It is at a very early development stage, so I can make it grow as I want
> (and, IMHO very importantly, in a way that is independent of any IDE,
> whether open source or closed).
>
> thesp0nge
>
> On 1/24/07, Eoin <eoin.keary at owasp.org> wrote:
> >
> > The static code analyzer approach would be a good place to start.
> > This would simply point to areas in the code where potential vulns may
> > occur.
> >
> > I used a static code analyzer based on regexes and Checkstyle for years,
> > and it speeds up the code review process greatly, which is what the tool
> > is meant to do.
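> >
> > A minimal sketch of such a regex-based flagging pass (the patterns below
> > are illustrative assumptions, not the actual tool's rules):
> >
> > import java.nio.file.Files;
> > import java.nio.file.Path;
> > import java.nio.file.Paths;
> > import java.util.List;
> > import java.util.regex.Pattern;
> >
> > public class RegexFlagger {
> >     // Lines matching these are flagged for a manual look, nothing more.
> >     private static final Pattern[] HOTSPOTS = {
> >         Pattern.compile("executeQuery|createStatement"),      // SQL handling
> >         Pattern.compile("getParameter\\("),                   // user input
> >         Pattern.compile("Runtime\\.getRuntime\\(\\)\\.exec")  // command exec
> >     };
> >
> >     public static void main(String[] args) throws Exception {
> >         Path source = Paths.get(args[0]);
> >         List<String> lines = Files.readAllLines(source);
> >         for (int i = 0; i < lines.size(); i++) {
> >             for (Pattern p : HOTSPOTS) {
> >                 if (p.matcher(lines.get(i)).find()) {
> >                     System.out.printf("%s:%d  %s%n",
> >                         source, i + 1, lines.get(i).trim());
> >                 }
> >             }
> >         }
> >     }
> > }
> >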
> > (My opinions come from a number of years of actually running a code
> > review team, hence the idea of starting the Code Review Guide.)
> > So, should this be a separate project, or part of the LAPSE project (and
> > why are we not using that?)?
> >
> > If it is to be part of the Code Review Project, I really would like to
> > focus on the document first, so that it can be a base for the code review
> > tool's checks (which makes logical sense). There's no point in having a
> > tool that checks for X when the guide does not cover X.
> >
> > Also, in the world of OOD, can we not reuse LAPSE or other tools that are
> > out there already? ;)
> >
> > -ek
> >
> >
> > On 23/01/07, Jim Manico <jim at manico.net> wrote:
> >
> > > My understanding is that both of those products are
> > > static-code-analyzers and not manual code review helpers. We are
> > > talking
> > > about apples and oranges here.
> > >
> > > - Jim
> > >
> > > Dinis Cruz wrote:
> > > > Ok, so we all agree that we need a tool to aid the manual code
> > > > review process.
> > > >
> > > > Who is going to help?
> > > >
> > > > We already have two OWASP projects that are trying to tackle this
> > > > problem (the OWASP LAPSE Project
> > > > <http://www.owasp.org/index.php/Category:OWASP_LAPSE_Project> and the
> > > > OWASP Orizon Project
> > > > <http://www.owasp.org/index.php/Category:OWASP_Orizon_Project>), so
> > > > whoever wants to collaborate can either join these projects or start
> > > > a new project.
> > > >
> > > > Dinis Cruz
> > > > Chief OWASP Evangelist
> > > > http://www.owasp.org
> > > >
> > > > On 1/23/07, Jim Manico <jim at manico.net> wrote:
> > > >>
> > > >> > For example, a tool that finds and flags all the encryption code
> > > >> > is easy and valuable. Maybe it helps me navigate the code with
> > > >> > "security goggles" on.
> > > >>
> > > >> I think this would be a great step forward in code security. I could
> > > >> imagine that a tool of this nature would let me flag blocks of code
> > > >> as fitting a certain security category (input validation, encryption,
> > > >> auth, etc.) and color-code them in some way.
> > > >>
> > > >> Also, as an auditor, I could set a master list of concepts that I
> > > >> need to search for in my manual audit, add audit-specific notations
> > > >> to the code, and perhaps get a checklist of concepts I need to "check
> > > >> off" for every JSP and servlet. Perhaps a flag to mark certain code
> > > >> as questionable or needing further review... Perhaps a comment layer
> > > >> so a team of auditors could collaborate on the code review/audit
> > > >> process.
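> > > >>
> > > >> A minimal sketch of that kind of flag-and-checklist data model (the
> > > >> category names and API below are illustrative assumptions, not an
> > > >> existing tool):
> > > >>
> > > >> import java.util.EnumMap;
> > > >> import java.util.EnumSet;
> > > >> import java.util.Map;
> > > >>
> > > >> public class AuditChecklist {
> > > >>     // Security categories an auditor can tag code blocks with.
> > > >>     enum Category { INPUT_VALIDATION, ENCRYPTION, AUTH, ERROR_HANDLING }
> > > >>
> > > >>     // Per-file record of what has been reviewed, plus notes for the team.
> > > >>     static class FileAudit {
> > > >>         final EnumSet<Category> reviewed = EnumSet.noneOf(Category.class);
> > > >>         final Map<Category, String> notes = new EnumMap<>(Category.class);
> > > >>
> > > >>         void checkOff(Category c, String note) {
> > > >>             reviewed.add(c);
> > > >>             notes.put(c, note);
> > > >>         }
> > > >>
> > > >>         boolean complete() {
> > > >>             return reviewed.containsAll(EnumSet.allOf(Category.class));
> > > >>         }
> > > >>     }
> > > >>
> > > >>     public static void main(String[] args) {
> > > >>         FileAudit login = new FileAudit();
> > > >>         login.checkOff(Category.AUTH, "session handling reviewed");
> > > >>         login.checkOff(Category.INPUT_VALIDATION, "username validated");
> > > >>         System.out.println("login.jsp fully reviewed? " + login.complete());
> > > >>     }
> > > >> }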
> > > >>
> > > >> Most of the commercial products out there that claim to be an Audit
> > > >> Workbench are really only static analysis tools, nothing I see out
> > > there
> > > >> really assists me with the manual code review process.
> > > >>
> > > >>  - Jim
> > > >>
> > > >>
> > > >> Jeff Williams wrote:
> > > >>
> > > >>  I know that there are exceptions (and let's keep the business logic
> > > >>  vulnerabilities out of this one) but most issues should be
> > > >>  detectable.
> > > >>
> > > >> I agree we should have a better framework for analyzing code for
> > > >> simple issues.  LAPSE is interesting, but is really a one-trick pony.
> > > >> LAPSE does source-to-sink dataflow analysis, so it's pretty good for
> > > >> analyzing things like SQL injection and XSS.  But it has no ability
> > > >> to analyze encryption, logging, access control, authentication, error
> > > >> handling, concurrency, etc... And it only works on Java.
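> > > >>
> > > >> To make "source-to-sink" concrete, here is a minimal example of the
> > > >> kind of flow such analysis flags (an illustrative servlet, not LAPSE
> > > >> output):
> > > >>
> > > >> import java.io.IOException;
> > > >> import java.sql.Connection;
> > > >> import java.sql.SQLException;
> > > >> import java.sql.Statement;
> > > >> import javax.servlet.http.HttpServlet;
> > > >> import javax.servlet.http.HttpServletRequest;
> > > >> import javax.servlet.http.HttpServletResponse;
> > > >>
> > > >> public class UserLookupServlet extends HttpServlet {
> > > >>     private Connection conn; // assume this is initialised elsewhere
> > > >>
> > > >>     @Override
> > > >>     protected void doGet(HttpServletRequest req, HttpServletResponse resp)
> > > >>             throws IOException {
> > > >>         // SOURCE: untrusted data enters from the HTTP request.
> > > >>         String name = req.getParameter("name");
> > > >>         try {
> > > >>             Statement stmt = conn.createStatement();
> > > >>             // SINK: the tainted value reaches a SQL query unmodified,
> > > >>             // so a dataflow analyzer reports SQL injection here.
> > > >>             stmt.executeQuery(
> > > >>                 "SELECT * FROM users WHERE name = '" + name + "'");
> > > >>         } catch (SQLException e) {
> > > >>             resp.sendError(500);
> > > >>         }
> > > >>     }
> > > >> }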
> > > >>
> > > >>
> > > >>
> > > >> I think "most issues should be detectable" is too aggressive (I've
> > > >> done quite a lot of work in this space).  That's what the commercial
> > > >> static analysis tool vendors are trying to do.  I suggest we focus on
> > > >> tools that assist the manual code reviewer, and DO NOT try to find
> > > >> problems automatically.
> > > >>
> > > >>
> > > >>
> > > >> For example, a tool that finds and flags all the encryption code is
> > > >> easy and valuable.  Maybe it helps me navigate the code with
> > > >> "security goggles" on.  A tool that attempts to analyze the
> > > >> encryption code and determine if it is sound is ridiculously hard
> > > >> and will have lots of false alarms.
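> > > >>
> > > >> A minimal sketch of that flag-but-don't-judge approach: report where
> > > >> the standard crypto APIs are imported and leave the judgement to the
> > > >> reviewer (the package list below is an assumption):
> > > >>
> > > >> import java.nio.file.Files;
> > > >> import java.nio.file.Path;
> > > >> import java.nio.file.Paths;
> > > >> import java.util.List;
> > > >>
> > > >> public class CryptoFlagger {
> > > >>     // Imports that mark a file as "encryption code" worth a look.
> > > >>     private static final String[] CRYPTO_PACKAGES = {
> > > >>         "javax.crypto.", "java.security.", "org.bouncycastle."
> > > >>     };
> > > >>
> > > >>     public static void main(String[] args) throws Exception {
> > > >>         Path file = Paths.get(args[0]);
> > > >>         List<String> lines = Files.readAllLines(file);
> > > >>         for (int i = 0; i < lines.size(); i++) {
> > > >>             String line = lines.get(i).trim();
> > > >>             if (!line.startsWith("import ")) continue;
> > > >>             for (String pkg : CRYPTO_PACKAGES) {
> > > >>                 // Flag the import; make no claim about whether the
> > > >>                 // crypto is sound -- that stays a manual call.
> > > >>                 if (line.contains(pkg)) {
> > > >>                     System.out.printf("%s:%d  %s%n", file, i + 1, line);
> > > >>                 }
> > > >>             }
> > > >>         }
> > > >>     }
> > > >> }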
> > > >>
> > > >>
> > > >>
> > > >> The tool must have some compiler-like features - at least symbol
> > > >> resolution, because grep is too inaccurate.
> > > >>
> > > >>
> > > >>
> > > >> --Jeff
> > > >>
> > > >>
> > > >>
> > > >>
> > > >> ------------------------------
> > > >>
> > > >> _______________________________________________
> > > >> Owasp-codereview mailing list
> > > >> Owasp-codereview at lists.owasp.org
> > > >> http://lists.owasp.org/mailman/listinfo/owasp-codereview
> > > >>
> > > >> --
> > > >> Best Regards,
> > > >> Jim Manico
> > > >> GIAC GSEC Professional, Sun Certified Java Programmer
> > > >> jim at manico.net
> > > >> 808.652.3805
> > > >>
> > > >
> > > >
> > > > --
> > > >
> > >
> > > --
> > > Best Regards,
> > > Jim Manico
> > > GIAC GSEC Professional, Sun Certified Java Programmer
> > > jim at manico.net
> > > 808.652.3805
> > >
> > > _______________________________________________
> > > Owasp-testing mailing list
> > > Owasp-testing at lists.owasp.org
> > > http://lists.owasp.org/mailman/listinfo/owasp-testing
> > >
> >
> >
> >
> > --
> > Eoin Keary OWASP - Ireland
> > http://www.owasp.org/local/ireland.html
> > http://www.owasp.org/index.php/OWASP_Testing_Project
> > http://www.owasp.org/index.php/OWASP_Code_Review_Project
>
>
>
>
> --
> Different does not necessarily mean worse
>



-- 
Eoin Keary OWASP - Ireland
http://www.owasp.org/local/ireland.html
http://www.owasp.org/index.php/OWASP_Testing_Project
http://www.owasp.org/index.php/OWASP_Code_Review_Project

