[Owasp-leaders] OWASP Top 10 - Proposal for a Temporary Injunction

Dave Wichers dave.wichers at owasp.org
Tue Feb 26 22:13:40 UTC 2013


Jerry,

 

Michael and I agree with you. We are working on it. Stay tuned. I've been
extremely busy at work and in my personal life for the past 10 days, so I've
been pretty quiet on the lists. Don't take my silence as disagreement, but
rather as a lack of time.

 

Jeff and I have been working to make the Top 10 more formalized and more
open with each release, and we are happy to continue making improvements in
this area. Up through 2007, the project simply published the list and said,
"here it is." In 2010, we opened it up to a formal public comment period for
the first time. In 2013, we can do even more. Thanks for your suggestions.

 

-Dave

 

From: owasp-leaders-bounces at lists.owasp.org
[mailto:owasp-leaders-bounces at lists.owasp.org] On Behalf Of Jerry Hoff
Sent: Tuesday, February 26, 2013 4:56 PM
To: Michael Coates
Cc: owasp-topten-project; OWASP Leaders
Subject: Re: [Owasp-leaders] OWASP Top 10 - Proposal for a Temporary
Injunction

 

Hello all,

 

Considering that most appsec professionals live and die by the Top 10, getting
community support behind the methodology, data, trends, and statistics that
go into the Top 10 is vital.

 

Exactly as Michael says below, the mailing list is devoid of information.
For something as critical to OWASP as the Top 10, this seems to completely
contradict the open discussion we expect in the community.

 

I'm looking forward to the wiki, which will hopefully shed some light on the
process, but I still firmly believe that there needs to be a community-wide
discussion on:

 

            1. The methodology (once released, a discussion around "is this
the proper methodology?")

            2. Data and data sources

            3. Trends

            4. A more diverse panel to finalize a true OWASP Top 10 - 2013
that reflects, as accurately as possible, the true ranking of current web
appsec risks

 

 

Jeff, Dave - you two, more than anyone, have contributed to the overall
awareness of application security.  It is in everyone's interest -
including the OWASP community and the developer community - to ensure the
Top 10 is as accurate as possible.  Isn't opening the Top 10 to a more
diverse set of security professionals in the interest of us all?

 

Folks - this is important.  This is not just any OWASP project; this is the
Top 10.  The decisions made here, about this issue, will impact the security
of people worldwide.  We should take the time and put in the effort to make
sure, as a community, that we get this right!

 

Leaders, this is very important.  Please chime in and make your voice heard
on this issue.

 

Jerry

 

 

 

--
Jerry Hoff

@jerryhoff
jerry at owasp.org



 

On Feb 26, 2013, at 11:08 AM, Michael Coates <michael.coates at owasp.org>
wrote:





Ryan,

Thanks for that info. I think it helps provide some background, but
Jerry's first two questions are still very relevant, and the information we
currently publish doesn't seem to address them:

- What is the methodology used to decide on the "Top 10 risks"?

- Who exactly is involved in the selection and ordering of these risks?

 

One concern I have is that we (OWASP) don't know this information either.
The place where the conversation should have happened - the top ten mailing
list - does not have any information.
http://lists.owasp.org/pipermail/owasp-topten/

To reiterate Jerry's questions: where was the Top 10 discussed? How was it
done? And who was involved in the decision making (not just in providing
data)?



Thanks,
Michael

 





--
Michael Coates | OWASP | @_mwc
michael-coates.blogspot.com <http://michael-coates.blogspot.com/> 

 

On Tue, Feb 26, 2013 at 10:29 AM, Ryan Barnett <ryan.barnett at owasp.org>
wrote:

The page here (https://www.owasp.org/index.php/Top_10_2013-Introduction)
lists the following -

 

"The OWASP Top 10 is based on risk data from 8 firms that specialize in
application security, including 4 consulting companies and 4 tool vendors (2
static and 2 dynamic). This data spans over 500,000 vulnerabilities across
hundreds of organizations and thousands of applications. The Top 10 items
are selected and prioritized according to this prevalence data, in
combination with consensus estimates of exploitability, detectability, and
impact estimates."
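
The prioritization described in that quote - prevalence data combined with
consensus estimates of exploitability, detectability, and impact - can be
sketched roughly as a weighted scoring exercise. To be clear, the formula,
ratings, and candidate names below are all hypothetical illustrations, not
the project's published methodology:

```python
# Illustrative sketch only: each candidate risk gets consensus ratings
# (1 = low, 3 = high) for exploitability, prevalence, detectability, and
# impact. The combination rule here is invented for illustration.

def risk_score(exploitability, prevalence, detectability, impact):
    """Likelihood is the mean of the three likelihood factors;
    the overall score is likelihood x impact (a common risk-rating shape)."""
    likelihood = (exploitability + prevalence + detectability) / 3
    return likelihood * impact

# Hypothetical consensus ratings for a few candidate risks.
candidates = {
    "Injection":             (3, 2, 3, 3),
    "Broken Authentication": (2, 2, 2, 3),
    "XSS":                   (3, 3, 3, 2),
}

ranked = sorted(candidates.items(),
                key=lambda kv: risk_score(*kv[1]),
                reverse=True)

for name, factors in ranked:
    print(f"{name}: {risk_score(*factors):.2f}")
```

The point of the sketch is that a published methodology would pin down
exactly these choices: which factors are rated, by whom, on what scale, and
how they are combined into a ranking.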

 

And the Acknowledgments section on that same page lists the 8 data sources:

 

*	Aspect Security
*	HP (results for both Fortify and WebInspect)
*	Minded Security
*	Softtek
*	TrustWave
*	Veracode – Statistics
*	WhiteHat Security Inc. – Statistics

 

My concern is not with the data sources provided, but that they mainly
factor in only vulnerability prevalence.  We need to factor in more data
sources related to attack frequency/likelihood.  Some example resources:

*	WASC Web Hacking Incident Database
*	Akamai State of the Internet Reports
*	Firehost's Web Application Attack Reports
*	Imperva's Web Application Attack Reports

These are just a few examples of real attack data that we should consider,
both for which items are included and for how they are prioritized.
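
One way to fold attack-frequency data into the ranking, purely as a sketch -
every figure and the 60/40 weighting below are made up for illustration, not
drawn from any of the datasets named above:

```python
# Hypothetical sketch: blend vulnerability prevalence (from scan/consulting
# datasets) with observed attack frequency (from incident datasets such as
# the WHID). All counts and the 0.6/0.4 weights are invented.

def normalize(counts):
    """Scale raw counts so each dataset sums to 1, making them comparable."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

prevalence = {"SQLi": 12000, "XSS": 45000, "CSRF": 9000}   # vulns found
attacks    = {"SQLi": 700,   "XSS": 250,   "CSRF": 50}     # incidents seen

p, a = normalize(prevalence), normalize(attacks)

# Weighted blend of the two normalized signals.
blended = {k: 0.6 * p[k] + 0.4 * a[k] for k in prevalence}

for name, score in sorted(blended.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```

Even a toy blend like this shows why the weighting matters: an issue that is
common in scans but rarely attacked can rank differently once incident data
is factored in.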

 

-Ryan

 

 

From: Michael Coates <michael.coates at owasp.org>
Date: Tuesday, February 26, 2013 1:05 PM
To: owasp-topten-project <owasp-topten-project at owasp.org>
Cc: OWASP Leaders <owasp-leaders at lists.owasp.org>
Subject: Re: [Owasp-leaders] OWASP Top 10 - Proposal for a Temporary
Injunction

 

Given the importance of the OWASP Top 10, its use throughout the world, and
the fact that, for many organizations, it is one of our initial points for
spreading the OWASP mission, I think these are valid questions.

Can someone from the Top 10 project provide insight on these questions? Do
we have this type of information already published? If not, we should
capture and publish it. Other organizations will likely want to better
understand both our methodology and our comprehensive view of the landscape
that led to our recommendations.


Thanks,





--
Michael Coates | OWASP | @_mwc
michael-coates.blogspot.com <http://michael-coates.blogspot.com/> 

 

On Tue, Feb 26, 2013 at 1:47 AM, Jerry Hoff <jerry at owasp.org> wrote:



Hello leaders!

 

As we all know, the OWASP Top 10 - 2013 release candidate list was made
available on Feb 15, 2013.  Since then, there has been a lot of controversy
on a number of points.  The most pressing - in my opinion - is the
methodology used to come up with the top 10.

 

The OWASP Top 10 is BY FAR the most visible and recognizable project in
OWASP.  It is used as a de facto standard by which countless organizations
around the globe measure their application security.  The issues covered by
the Top 10 receive the vast majority of attention from security teams around
the world - if there are redundant or needless entries, huge amounts of
money will be wasted, and if important risks are missing from the Top 10,
they will be largely ignored.

 

That said - I would like to raise the following issues:

 

- What is the methodology used to decide on the "Top 10 risks"?

- Who exactly is involved in the selection and ordering of these risks?

- What assurance do organizations have that this list is a true reflection
of webapp-related risk?

- Since this is OWASP's most visible flagship project, what assurance do WE
in the OWASP community have that this list is an accurate accounting of the
top 10 risks intrinsic in web apps?

 

 

Folks - OWASP has grown extremely rapidly in a short amount of time, and as
the entire industry matures, doing things "like we've always done them" is,
in my opinion, no longer sufficient.

 

Our most popular project - one that essentially speaks for and represents
all of OWASP - should not be assembled arbitrarily and released publicly
without answers to the questions above.

 

Therefore, I respectfully ask Jeff and Dave not to release the OWASP Top 10
- 2013 edition until a reasonable accompanying methodology and metrics are
published, outlining how these risks were decided upon.

 

Is this fair?  I absolutely respect the work Jeff and Dave have put into
this project over the years - the top 10 is a large part of what made OWASP
the organization it is today.  I also respect their rights as the project
owners to do as they see fit.  However, since this project is so critical
not only to OWASP, but to the world, I respectfully ask that:

 

- that a methodology be published

- that an open conversation be held (perhaps by convening a Top 10 Summit)
to ensure the top 10 risks are in line with a reasonable methodology

 

Thank you,

Jerry

 

 

 

 

 

 

 

--
Jerry Hoff

@jerryhoff
jerry at owasp.org




_______________________________________________
OWASP-Leaders mailing list
OWASP-Leaders at lists.owasp.org
https://lists.owasp.org/mailman/listinfo/owasp-leaders

 


 


 
