[Owasp-topten] top 10 & testing guide

Dave Wichers dave.wichers at aspectsecurity.com
Wed Jan 6 00:33:33 EST 2010



OWASP is just starting a synchronization effort between the Top 10,
ASVS, and all the Guides. We are trying to use the ASVS requirements as
the baseline and then develop the Dev Guide, Testing Guide, and Code
Review Guide against that outline. However, we don't want to wreck what
you guys have been doing with the Testing Guide #'s.


Mike Boberski is working with Andrew van der Stock to launch an update
effort to the Dev Guide. Can you work with Mike so he understands how
you are using the OWASP finding #'s to see if we can move forward in a
way that is not massively disruptive? Mike may not even be aware of the
testing guide numbering scheme.


And we can also make sure that the Dev Guide covers everything you think
needs to be covered (which hopefully is already covered in ASVS), and if
not, maybe ASVS needs to be updated too.




From: owasp-topten-bounces at lists.owasp.org
[mailto:owasp-topten-bounces at lists.owasp.org] On Behalf Of Brad Causey
Sent: Tuesday, January 05, 2010 8:59 PM
To: owasp-testing at lists.owasp.org; owasp-topten at lists.owasp.org
Subject: [Owasp-topten] top 10 & testing guide


First of all, sorry for the x-posting, but it seemed appropriate.

For those of you who don't know, I work in the financial sector and
developed our organization's WAS testing procedures, documentation, and
probably 80% of our entire WAS program from OWASP materials. Great stuff.

As a matter of fact, each of our analysts has a Lulu-printed copy of the
Testing Guide on their desk. When we write up reports, we use the
OWASP-XX-XX codes as our classification mapping. For example:

Finding 1 - rXSS - OWASP-DV-001 - hxxp://www.vulnsite.com?msg=<blah
blah, you get it> - screenshot1.png
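As a sketch of how a report pipeline might lean on that convention, here is a small Python snippet that pulls the OWASP-XX-XX Testing Guide code out of a one-line finding. The helper name and the finding format are just illustrations of the workflow described above, not part of any OWASP tool.

```python
import re

# Testing Guide classifications look like OWASP-DV-001:
# "OWASP-", two capital letters, a dash, three digits.
FINDING_CODE = re.compile(r"OWASP-[A-Z]{2}-\d{3}")

def classification(finding_line):
    """Return the Testing Guide code in a finding line, or None if absent."""
    match = FINDING_CODE.search(finding_line)
    return match.group(0) if match else None

finding = ("Finding 1 - rXSS - OWASP-DV-001 - "
           "hxxp://www.vulnsite.com?msg=... - screenshot1.png")
print(classification(finding))  # OWASP-DV-001
```

A finding with no matching Testing Guide entry (the A8 situation below) would come back as None, which is exactly the hole in the classification scheme.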

When we write our long-form reports, we use the text from the Testing
Guide. It has really proven great for us, and we've been doing this since
v2 came out. In addition, we have previously used the Top Ten literature
as supplementary material for proving higher-risk, higher-priority items.
That has worked great until now...

A8 in the RC version of the Top Ten throws a nice shiny wrench into it
all. The reason being, there isn't a corresponding OWASP-XX-XX
classification that matches up to A8. Now, I've been writing A8 up for
some time, but it has never had a nice, neat home anywhere in the Testing
Guide.

Now that I've gotten past all that, I'd like to discuss how, possibly in
the future, the two projects could be kept more in sync. I'm not sure
there is a good way to do that today, but it sure makes sense in my mind
that all the OWASP material should have some overlap and should avoid
gaps such as the A8 vs. OWASP-XX-XX situation.
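The sync problem can be made concrete as a cross-reference table. The sketch below is a partly hypothetical mapping from Top Ten RC items to Testing Guide v3 codes: only OWASP-DV-001 (reflected XSS) is taken from the report example above, and the other codes are assumptions for the sake of illustration.

```python
# Illustrative mapping: Top Ten (2010 RC) item -> Testing Guide v3 codes.
# An empty list marks an item with no Testing Guide counterpart.
TOP_TEN_TO_TESTING_GUIDE = {
    "A1 (Injection)": ["OWASP-DV-005"],  # assumed code, for illustration
    "A2 (XSS)": ["OWASP-DV-001", "OWASP-DV-002"],
    "A8": [],  # the gap: no OWASP-XX-XX classification matches up to A8
}

# Items that a sync effort between the two projects would need to cover.
uncovered = [item for item, codes in TOP_TEN_TO_TESTING_GUIDE.items()
             if not codes]
print(uncovered)  # ['A8']
```

A check like this, run whenever either document is revised, would surface any new A8-style gap before it reaches analysts' reports.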

Also, I see some gaps here that aren't covered in any OWASP
documentation, and should be. I'd like to get everyone's thoughts, and
probably flames, on this stuff.

-Brad Causey

Never underestimate the time, expense, and effort an opponent will
expend to break a code. (Robert Morris)
