[Owasp-testing] [Owasp-codereview] Code Review project and Code-Scanning-Tool(s)

Paolo Perego thesp0nge at gmail.com
Wed Jan 24 07:16:22 EST 2007


Sure.
Orizon was born in late September, when I was collecting information and tools
for an Italian talk I gave for the OWASP Italy chapter @ eAcademy (inside the
SMAU expo, http://www.smau.it/) in Milan. The talk was about application
security and code review.
I realized the importance of dynamic code review combined with a static
pattern-matching approach. I also realized that a dynamic approach needs some
automation, even if not every part of the code review process can be
automated (and all of us are happy about that).
I found a list of tools, both open source and closed, such as PMD, LAPSE,
Fortify, Flawfinder and RATS. Looking at these tools I realized that they
don't share information among themselves, and sometimes they are also tied to
the environment used to write the software.

The fact that security checks are not shared between tools made me think
there is a need for a sort of library, or a framework, in which code
reviewers' experience in writing tests can be captured in some way and made
available to anyone who wants to write a code review tool.

So I started planning Orizon with the idea of creating an infrastructure, a
sort of code review engine usable by anyone who wants to create a code
scanning tool. Orizon's greatest value will be the security check library,
written by experts (us, the folks @ owasp :)) and available to anyone's tool.
Orizon will of course also be a tool that uses itself as a framework (like a demo).
Orizon will not be tied to any IDE, so for now I have started with a command
line tool.

For static analysis Orizon won't be tied to a single programming language.
That's because the source code to be checked is translated into XML (for now
only Java works, but I'm planning to extend the translation list to C#, C and
others). Security checks (written in XML too) are then applied over that XML
in a pattern-matching fashion.
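
Just to give a flavour of the idea, here is a minimal sketch in Java. This is
NOT Orizon's real schema or check format; the XML element names and the XPath
rule below are made up, only to show how a pattern-matching check could run
over a translated source file:

// Minimal sketch only: a hypothetical XML translation of a source file and a
// hypothetical security check expressed as an XPath pattern.
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class PatternCheckSketch {
    public static void main(String[] args) throws Exception {
        // Imaginary XML translation of a Java method that builds a SQL query
        // by string concatenation.
        String translatedSource =
            "<source lang='java'>" +
            "  <method name='login'>" +
            "    <call name='Statement.executeQuery'>" +
            "      <arg kind='concatenation'/>" +
            "    </call>" +
            "  </method>" +
            "</source>";

        // Imaginary check: flag executeQuery calls whose argument is built by
        // concatenation (a possible SQL injection).
        String checkPattern =
            "//call[@name='Statement.executeQuery'][arg/@kind='concatenation']";

        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(translatedSource.getBytes("UTF-8")));
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList hits = (NodeList) xpath.evaluate(checkPattern, doc, XPathConstants.NODESET);

        for (int i = 0; i < hits.getLength(); i++) {
            System.out.println("Possible SQL injection: "
                + hits.item(i).getAttributes().getNamedItem("name").getNodeValue());
        }
    }
}

The nice thing is that the checks themselves stay in XML, so the same library
of checks can be reused for every language we manage to translate.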
In the architectural document (
https://www.owasp.org/index.php/Image:Orizon_internal_draft.doc) I also planned
a sort of style check, but this part needs further engineering.

For dynamic analysis Orizon will (or rather, it aims to; a rough sketch follows
the list):
* break a class or source file down into its methods or functions
* create simple test drivers that invoke the method or function to be checked,
supplying an input that in a first project stage will be provided by the
reviewer (I'm reading about fuzzing and I'd like to implement something like
that, but in the future)
* print out unhandled exceptions, strange behaviour, ...
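
To make this concrete, here is a minimal sketch (not Orizon code; the class and
method names are made up, and a no-arg constructor is assumed) of what such a
test driver could look like in Java, using reflection:

import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class DynamicProbeSketch {
    // className, methodName and input are supplied by the reviewer, e.g.
    // "com.example.LoginService" and "authenticate" (hypothetical names).
    public static void probe(String className, String methodName, Object[] input)
            throws Exception {
        Class<?> target = Class.forName(className);
        Object instance = target.newInstance(); // assumes a no-arg constructor

        for (Method m : target.getDeclaredMethods()) {
            if (!m.getName().equals(methodName)) {
                continue;
            }
            try {
                Object result = m.invoke(instance, input);
                System.out.println(m + " returned: " + result);
            } catch (InvocationTargetException e) {
                // The method under test threw an unhandled exception.
                System.out.println(m + " threw: " + e.getCause());
            } catch (IllegalArgumentException e) {
                // The reviewer-supplied input doesn't match the signature.
                System.out.println(m + " rejected input: " + e);
            }
        }
    }
}

In a later stage the reviewer-supplied input could be replaced or complemented
by generated (fuzzed) input.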

Of course validating the results is a human task and is left to the reviewer.

For now I'm focused on creating the static part of the framework. I plan to
have something usable in late spring (May 2007).

I'm also planning to add dynamic review in the summer and to have something
fully available next fall (maybe for the next AoC).

This is the old Orizon web site @ sourceforge:
http://orizon.sourceforge.net/
SourceForge also hosts the SVN source repository.

This is the Owasp Orizon main page:
http://www.owasp.org/index.php/Category:OWASP_Orizon_Project
with the mailing list reference.
As of today 10 people are subscribed to the mailing list, but it generates very
little traffic; I think that's because of the lack of fresh commits...

This is the Owasp Orizon Blog: blogs.owasp.org/orizon

Please feel free to forward this overview to anyone you think could be
interested in Orizon, and please forgive my very bad English :)

Tell me if this overview answers your questions or if there is something more
you want to know :)

Ciao ciao
thesp0nge

On 1/24/07, Eoin <eoin.keary at owasp.org> wrote:
>
> Yep, having it detached from an IDE (like eclipse/WSAD...) is better for
> me.
> Can you send the interested parties an overview of Orizon so
> 1. We know what is involved.
> 2. See if it's suitable
> 3. Get a roadmap together and a plan of action.
>
>
> On 24/01/07, Paolo Perego <thesp0nge at gmail.com> wrote:
> >
> > Orizon is meant to be a framework for people who want to build a code
> > review tool. It is itself a tool too, but its main goal is to provide a set
> > of APIs.
> > Orizon's approach is different from LAPSE's.
> > Orizon doesn't rely on Eclipse for parsing Java sources. It translates
> > source files (Java, C# and in the future others) into XML and applies XML
> > checks over the translated source.
> >
> > In fact, in the part of the Code Review Guide I'll contribute to writing,
> > I'll use the experience I'm gathering while writing Orizon :)
> >
> > Again, since I'm mainly writing a framework, not just a standalone tool,
> > and since, as you know, I'm involved in the Code Review Project too, I'm
> > happy to write Orizon to cover the aspects that will come up in our Code
> > Review Guide :)
> > This is at a very early development stage... so I can make it grow as I
> > want (and, IMHO very importantly, in a way independent of any IDE, whether
> > open source or closed)
> >
> > thesp0nge
> >
> > On 1/24/07, Eoin <eoin.keary at owasp.org> wrote:
> > >
> > > The static code analyzer approach would be a good place to start.
> > > This would simply point to areas in code where potential vulns may
> > > occur.
> > >
> > > I used a static code analyzer based on regex and Checkstyle for years,
> > > and it speeds up the code review process greatly, which is what the tool
> > > is meant to do.
> > > (My opinions come from a number of years of actually running a code
> > > review team, and hence the idea of starting the code review guide.)
> > > So should this be a separate project, or part of the LAPSE project (why
> > > are we not using this?)?
> > >
> > > If it is to be part of the Code Review Project I think that....
> > > I really would like to focus on the document in order for it to be a
> > > base for the code review tool checks. (makes logical sense). No point in
> > > having a tool that checks for X but the guide does not cover X.
> > >
> > > Also in the world of OOD can we not reuse LAPSE or other tools out
> > > there already ;)
> > >
> > > -ek
> > >
> > >
> > >  On 23/01/07, Jim Manico < jim at manico.net> wrote:
> > >
> > > > My understanding is that both of those products are
> > > > static-code-analyzers and not manual code review helpers. We are
> > > > talking
> > > > about apples and oranges here.
> > > >
> > > > - Jim
> > > >
> > > > Dinis Cruz wrote:
> > > > > Ok, so we all agree that we need a tool to aid the manual code review
> > > > > process.
> > > > >
> > > > > Who is going to help?
> > > > >
> > > > > We already have two OWASP projects that are trying to tackle this
> > > > > problem (OWASP LAPSE Project
> > > > > <http://www.owasp.org/index.php/Category:OWASP_LAPSE_Project> and
> > > > > OWASP Orizon Project
> > > > > <http://www.owasp.org/index.php/Category:OWASP_Orizon_Project>),
> > > > > so whoever wants to collaborate can either join these projects or
> > > > > start a new project
> > > > >
> > > > > Dinis Cruz
> > > > > Chief OWASP Evangelist
> > > > > http://www.owasp.org
> > > > >
> > > > > On 1/23/07, Jim Manico < jim at manico.net > wrote:
> > > > >>
> > > > >> > For example, a tool that finds and flags all the encryption code is
> > > > >> > easy and valuable. Maybe it helps me navigate the code with "security
> > > > >> > goggles" on.
> > > > >>
> > > > >> I think this would be a great step forward in code security. I could
> > > > >> imagine that a tool of this nature would let me flag blocks of code to
> > > > >> fit in a certain security category (input validation, encryption, auth,
> > > > >> etc.) and color-code it in some way.
> > > > >>
> > > > >> Also, I could as an auditor set a master list of concepts that I need to
> > > > >> search for in my manual audit, add audit-specific notations to code, and
> > > > >> perhaps get a checklist of concepts I need to "check off" for every jsp
> > > > >> and servlet. Perhaps a flag to mark certain code as questionable, or
> > > > >> needing further review... Perhaps a comment layer so a team of auditors
> > > > >> could collaborate on the code review/audit process.
> > > > >>
> > > > >> Most of the commercial products out there that claim to be an Audit
> > > > >> Workbench are really only static analysis tools; nothing I see out there
> > > > >> really assists me with the manual code review process.
> > > > >>
> > > > >>  - Jim
> > > > >>
> > > > >>
> > > > >> Jeff Williams wrote:
> > > > >>
> > > > >> I know that there are exceptions (and let's keep the business logic
> > > > >> vulnerabilities out of this one) but most issues should be detectable.
> > > > >>
> > > > >> I agree we should have a better framework for analyzing code for simple
> > > > >> issues.  LAPSE is interesting, but is really a one-trick pony.  LAPSE
> > > > >> does source-to-sink dataflow analysis, so it's pretty good for analyzing
> > > > >> things like SQL injection and XSS.  But it has no ability to analyze
> > > > >> encryption, logging, access control, authentication, error handling,
> > > > >> concurrency, etc... And it only works on Java.
> > > > >>
> > > > >>
> > > > >>
> > > > >> I think "most issues should be detectable" is too aggressive (I've done
> > > > >> quite a lot of work in this space).  That's what the commercial static
> > > > >> analysis tool vendors are trying to do.  I suggest we focus on tools
> > > > >> that assist the manual code reviewer, and DO NOT try to find problems
> > > > >> automatically.
> > > > >>
> > > > >>
> > > > >>
> > > > >> For example, a tool that finds and flags all the encryption code is easy
> > > > >> and valuable.  Maybe it helps me navigate the code with "security
> > > > >> goggles" on.  A tool that attempts to analyze the encryption code and
> > > > >> determine if it is sound is ridiculously hard and will have lots of
> > > > >> false alarms.
> > > > >>
> > > > >>
> > > > >>
> > > > >> The tool must have some compiler-like features - at least symbol
> > > > >> resolution, because grep is too inaccurate.
> > > > >>
> > > > >>
> > > > >>
> > > > >> --Jeff
> > > > >>
> > > > >>
> > > > >>
> > > > >>
> > > > >> ------------------------------
> > > > >>
> > > > >> _______________________________________________
> > > > >> Owasp-codereview mailing list
> > > > >> Owasp-codereview at lists.owasp.org
> > > > >> http://lists.owasp.org/mailman/listinfo/owasp-codereview
> > > > >>
> > > > >>
> > > > >>
> > > > >> --
> > > > >> Best Regards,
> > > > >> Jim Manico
> > > > >> GIAC GSEC Professional, Sun Certified Java Programmer
> > > > >> jim at manico.net
> > > > >> 808.652.3805
> > > > >>
> > > > >>
> > > > >
> > > > >
> > > > > --
> > > > >
> > > >
> > > > --
> > > > Best Regards,
> > > > Jim Manico
> > > > GIAC GSEC Professional, Sun Certified Java Programmer
> > > > jim at manico.net
> > > > 808.652.3805
> > > >
> > > > _______________________________________________
> > > > Owasp-testing mailing list
> > > > Owasp-testing at lists.owasp.org
> > > > http://lists.owasp.org/mailman/listinfo/owasp-testing
> > > >
> > >
> > >
> > >
> > > --
> > > Eoin Keary OWASP - Ireland
> > > http://www.owasp.org/local/ireland.html
> > > http://www.owasp.org/index.php/OWASP_Testing_Project
> > > http://www.owasp.org/index.php/OWASP_Code_Review_Project
> >
> >
> >
> >
> > --
> > Diverso non necessariamente significa peggiore (Different doesn't necessarily mean worse)
> >
>
>
>
> --
> Eoin Keary OWASP - Ireland
> http://www.owasp.org/local/ireland.html
> http://www.owasp.org/index.php/OWASP_Testing_Project
> http://www.owasp.org/index.php/OWASP_Code_Review_Project
>



-- 
Diverso non necessariamente significa peggiore (Different doesn't necessarily mean worse)