[Owasp-leaders] OWASP Top 10 - Proposal for a Temporary Injunction
michael.coates at owasp.org
Tue Feb 26 19:08:38 UTC 2013
Thanks for that info. I think that helps provide some background, but
Jerry's first two questions are still very relevant and the current info we
publish doesn't seem to address them:
- What is the methodology used to decide on the "Top 10 risks"?
- Who exactly is involved in the selection and ordering of these risks?
One concern I have is that we (OWASP) don't know this information either.
The place where the conversation should have happened - the top ten mailing
list - does not have any information.
To reiterate Jerry's questions: where was the Top 10 discussed? How was it
done? And who was involved in the decision making (not just providing data)?
Michael Coates | OWASP | @_mwc
On Tue, Feb 26, 2013 at 10:29 AM, Ryan Barnett <ryan.barnett at owasp.org> wrote:
> The page here (https://www.owasp.org/index.php/Top_10_2013-Introduction)
> lists the following -
> *"The OWASP Top 10 is based on risk data from 8 firms that specialize in
> application security, including 4 consulting companies and 4 tool vendors
> (2 static and 2 dynamic). This data spans over 500,000 vulnerabilities
> across hundreds of organizations and thousands of applications. The Top 10
> items are selected and prioritized according to this prevalence data, in
> combination with consensus estimates of exploitability, detectability, and
> impact estimates."*
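The selection process quoted above is, in effect, a weighted ranking:
prevalence counts combined with consensus scores for exploitability,
detectability, and impact. A minimal sketch in Python of how such a ranking
might be computed - every category name, count, and score below is
hypothetical, not the project's actual data:

    # Illustrative sketch only -- all names, counts, and scores are made up.
    # Each entry: (prevalence_count, exploitability, detectability, impact),
    # with the three consensus factors scored 1 (low) to 3 (high).
    risks = {
        "Injection": (120000, 3, 3, 3),
        "Cross-Site Scripting": (150000, 3, 3, 2),
        "Broken Authentication": (80000, 3, 2, 3),
    }

    def risk_score(prevalence, exploitability, detectability, impact):
        # Likelihood: prevalence weighted by how easily the flaw is found
        # and exploited; overall risk = likelihood * impact.
        likelihood = prevalence * (exploitability + detectability) / 2.0
        return likelihood * impact

    for name, factors in sorted(risks.items(),
                                key=lambda kv: risk_score(*kv[1]),
                                reverse=True):
        print("%-25s %12.0f" % (name, risk_score(*factors)))

The open questions in this thread are exactly the inputs such a computation
hides: who supplies the consensus scores, and how the factors are weighted.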
> And the Acknowledgments section on that same page lists the 8 data sources:
> - Aspect Security
> - HP (Results for both Fortify and WebInspect)
> - Minded Security
> - Softtek
> - TrustWave
> - Veracode – Statistics
> - WhiteHat Security Inc. – Statistics
> My concerns are not with the data sources provided, but that they are
> mainly factoring in only vulnerability prevalence. We need to factor in
> more data sources related to attack frequency/likelihood. Some example
> sources:
> - WASC Web Hacking Incident Database
> - Akamai State of the Internet Reports
> - FireHost's Web Application Attack Reports
> - Imperva's Web Application Attack Reports
> These are just a few examples of real attack data that we should consider,
> both for which items are included and for how they are prioritized.
> From: Michael Coates <michael.coates at owasp.org>
> Date: Tuesday, February 26, 2013 1:05 PM
> To: owasp-topten-project <owasp-topten-project at owasp.org>
> Cc: OWASP Leaders <owasp-leaders at lists.owasp.org>
> Subject: Re: [Owasp-leaders] OWASP Top 10 - Proposal for a Temporary
> Injunction
> Given the importance of the OWASP Top 10, its use throughout the world,
> and the fact that for many organizations it is an initial point of contact
> with the OWASP mission, I think these are valid questions.
> Can someone from the Top 10 project provide insight on these questions? Do
> we have this type of information already published? If not, we should
> capture and publish it. Other organizations will likely want to better
> understand both our methodology and the comprehensive view of the
> landscape that led to our recommendations.
> Michael Coates | OWASP | @_mwc
> On Tue, Feb 26, 2013 at 1:47 AM, Jerry Hoff <jerry at owasp.org> wrote:
>> Hello leaders!
>> As we all know, the OWASP Top 10 - 2013 release candidate list was made
>> available on Feb 15, 2013. Since then, there has been a lot of controversy
>> on a number of points. The most pressing - in my opinion - is the
>> methodology used to come up with the top 10.
>> The OWASP top 10 is BY FAR the most visible and recognizable project in
>> OWASP. It is used as a de facto standard by which countless organizations
>> around the globe measure their application security. The issues that are
>> covered by the Top 10 will receive the vast majority of attention from
>> security teams around the world - if there are redundant or needless
>> entries, huge amounts of money will be wasted, and if important risks are
>> not listed on the top 10, they will be largely ignored.
>> That said - I would like to raise the following issues:
>> - What is the methodology used to decide on the "Top 10 risks"?
>> - Who exactly is involved in the selection and ordering of these risks?
>> - What assurance do organizations have that this list is a true
>> reflection of webapp-related risk?
>> - Since this is OWASP's most visible flagship project, what assurance do
>> WE in the OWASP community have that this list is an accurate accounting of
>> the top 10 risks intrinsic in web apps?
>> Folks - OWASP has grown extremely rapidly in a short amount of time, and
>> as the entire industry matures, doing things "like we've always done them"
>> is no longer sufficient, in my opinion.
>> This most popular project - one that essentially speaks for and
>> represents all of OWASP - should not be assembled arbitrarily and
>> released publicly without answers to the questions above.
>> Therefore, I respectfully ask Jeff and Dave not to release the OWASP Top
>> 10 - 2013 edition until a reasonable accompanying methodology and metrics
>> are released outlining how these risks were decided upon.
>> Is this fair? I absolutely respect the work Jeff and Dave have put into
>> this project over the years - the top 10 is a large part of what made OWASP
>> the organization it is today. I also respect their rights as the project
>> owners to do as they see fit. However, since this project is so critical
>> not only to OWASP, but to the world, I respectfully ask that:
>> - that a methodology be published
>> - that an open conversation be held (perhaps convening a Top 10 Summit)
>> to ensure the top 10 risks are in line with a reasonable methodology
>> Thank you,
>> Jerry Hoff
>> jerry at owasp.org