[Owasp-leaders] FW: REQUEST FOR DECISION/CALL FOR CONTRIBUTIONS TO UPDATE THE ASSESSMENT CRITERIA
mtesauro at gmail.com
Sat Mar 7 14:17:03 EST 2009
Eoin, Mike, et al,
Funny thing is that we are on the same page - we all want to focus on
quality. The trouble is that when you start to plan how OWASP is going
to get there, you run into the issue we've raised.
OWASP has a large number of projects of varying age, authorship and
quality. How do you go from this grab bag of projects to something that
lets the non-OWASP person decide what works for them? Remember, we're
all insiders on this list - random Joe who surfs to www.owasp.org's
project page has no clue about the status and history of our projects.
The answer is to categorize the projects so they are findable by the
target audience and their quality level is readily apparent. Until
OWASP has metrics on all projects, you're still left with a project grab
bag. How to solve this?
(1) Let someone review all OWASP projects, try to contact project leads,
and go do an assessment of all of them. This could be a SoC project but
if you look at the scope, that's a HUGE project.
(2) Let the Global Projects Committee (GPC) sort them out. This is also
feasible if you don't care about time. With the current work on the
next SoC plus the other agenda items for the GPC, this will be a long,
slow process.
(3) Let the project leads self-assess. This disperses the workload and
has the added benefit of identifying dormant/dead OWASP projects
based on non-response. For project leads who just happen to be busy,
a response after the deadline can correct any temporary misclassification.
I'd much prefer (3) above: it removes the arbitrary external rating
that is possible under (1) and (2), and it disperses the workload.
It also identifies dormant projects, which _could_ then be considered
for revival during the next SoC. Additionally, this would allow
potential external project sponsors to easily assess what they are
sponsoring.
The end result of all this _is_ increased quality and completeness for
OWASP as a whole.
The disagreement may be on the path we take, but everyone is working
towards the same goal.
-- Matt Tesauro
OWASP Live CD Project Lead
http://mtesauro.com/livecd/ - Documentation Wiki
> for what it's worth, I agree with Mike's sentiments:
> "I'd rather see people putting time/energy into tightening up their
> project pages, tools, and project presentations/datasheets. An example
> is PHP and .NET ESAPI: there's no published mapping of Java ESAPI to
> PHP/ESAPI, one that would also identify which interfaces are being
> targeted for which releases. I'm going to try to work with Andrew to fix
> that problem for PHP, since I may have a need for a PHP ESAPI for a
> customer engagement, but it's still a good example."
> We need to focus on quality and completeness. That will also enable
> more widespread adoption.
> *From:* Boberski, Michael [USA] [mailto:boberski_michael at bah.com
> <mailto:boberski_michael at bah.com>]
> *Sent:* Thursday, March 5, 2009 13:29
> *To:* Dave Wichers; paulo.coimbra at owasp.org
> <mailto:paulo.coimbra at owasp.org>; OWASP Foundation Board List;
> global_tools_and_project_committee at lists.owasp.org
> <mailto:global_tools_and_project_committee at lists.owasp.org>
> *Subject:* RE: [Global_tools_and_project_committee] [Owasp-board]
> FW: REQUEST FOR DECISION/CALL FOR CONTRIBUTIONS TO UPDATE THE
> ASSESSMENT CRITERIA
> Team, OWASP is getting overly bureaucratic, it seems to me.
> I'd rather see people putting time/energy into tightening up their
> project pages, tools, and project presentations/datasheets. An
> example is PHP and .NET ESAPI: there's no published mapping of Java
> ESAPI to PHP/ESAPI, one that would also identify which interfaces
> are being targeted for which releases. I'm going to try to work with
> Andrew to fix that problem for PHP, since I may have a need for a PHP
> ESAPI for a customer engagement, but it's still a good example.
> The more complete and professional a page/doc/tool looks, the easier
> it is to identify its status and content, and the easier it is to
> figure out its usefulness and to promote its adoption. That a
> doc/tool has correct content or works is taken as a given; that is
> completely secondary to initially figuring out whether a doc/tool is
> a potential solution to one's problem of the day.
> I would also caution against downgrading projects, which is what one
> of the comments seems to imply could happen. If you must address
> some perceived contention over project assessment criteria, you
> should simply put dates against ratings and identify the criteria
> version that a project was assessed against, then leave that rating
> alone as the criteria continue to evolve over time. That is what
> more established, formal testing programs such as Common Criteria
> and FIPS 140 do. I hope I am misreading the comments on this point,
> however.
> Mike B.
> OWASP-Leaders mailing list
> OWASP-Leaders at lists.owasp.org <mailto:OWASP-Leaders at lists.owasp.org>
> Eoin Keary CISSP CISA
> OWASP Code Review Guide Lead Author
> OWASP Ireland Chapter Lead
> OWASP Global Committee Member (Industry)
> Quis custodiet ipsos custodes