[Owasp-leaders] OWASP Top 10 - Proposal for a Temporary Injunction

Jerry Hoff jerry at owasp.org
Wed Feb 27 03:31:21 UTC 2013


Hi Andrew,

To be absolutely clear - there are no accusations here, and absolutely no one is being accused of malice.

It's more of an embarrassment of riches - the Top 10 is so overwhelmingly successful and popular that it really is "OWASP" to a large number of people outside the organization.  If you stop random developers on the street and say "OWASP," they will immediately reply "Top 10."

Jeff and Dave have carefully groomed and grown this project into a worldwide phenomenon, catalyzing planet-level change.

Obviously, with great power comes great responsibility.  Organizational appsec programs are frequently based exclusively around the Top 10 - I see it daily.  With this kind of following, I believe openness about the methodology and data behind the Top 10 is key.

This is very similar to the growth cycle of many technical endeavors - imagine I start an open source framework or library.  Initially I just put something out that works.  Others start using it.  I keep adding features, and even more people start using it.  At a certain threshold, with so many people using my framework, I become responsible not only for updating it but also for ensuring it is secure.  In jQuery, RoR, and the like, security (it could be argued) took a back seat to functionality until a certain threshold of users was crossed.

The T10 is the same way - when the first version came out, the methodology and data behind it hardly mattered.  As the project became more and more successful - it is now the basis of innumerable security programs around the globe - it acquired (willingly or unwillingly) the responsibility to formalize its methodology.

Is this fair?  It goes without saying that Jeff and Dave have done amazing, world-class work on this project.  I believe, however, that the data and methodology should be published, and that the larger community should be part of constructing the T10 moving forward.

I had the pleasure of meeting up with Jeff today at RSA, and we discussed this for about an hour.  Jeff is also amenable to holding a "T10 Summit" (most likely virtual).  Jeff and Dave allowing the community to voice its opinions on the contents of the Top 10 would be a great step, in line with the philosophy of OWASP.

Leaders, please voice your opinions on this matter.

Thank you,
Jerry



--
Jerry Hoff
@jerryhoff
jerry at owasp.org



On Feb 26, 2013, at 6:41 PM, vanderaj vanderaj <vanderaj at owasp.org> wrote:

> Jerry, Michael, I agree with you that it should not have popped out as it did, but I wouldn't attribute malice where someone just got 'er done. I did the same thing recently with the ASVS 2.0. I would definitely love for the Top 10 to be evidence-based and to have a consensus-based methodology. 
> 
> (For historical purposes only - and my memory is pretty shocking for the most part.) For the Top 10 2007 edition, I formulated the methodology of using public statistics with Raoul Endres at a pho restaurant in Melbourne. We decided on an evidence-based Top 10, rather than what I assume turned out to be pretty darned good guesses in the 2004 edition. To that end, I asked for and obtained Steve Christey's early data - essentially the CVE stats of 2006 - in electronic format. 
> 
> The main difference between the last draft I was sole author on and the final versions with everyone's input was the removal of some evidence-based controls, such as certain access controls, in favor of two crypto issues. I put in CSRF deliberately, and I deliberately made sure the document explained the methodology and why CSRF was added as a choice ("Why we have added some important issues").
> 
> The document went through several private and then several public iterations before settling on final drafts. I have about 10 drafts in my OWASP folder if anyone wants to see them (so far, no one has ever asked me for them). I have the raw data if anyone wants it (and no one has asked for it, AFAIK), assuming Steve is okay with me releasing it (you can now get that data in XML format directly from NIST). I probably should have asked Steve nicely if we could release it in some form other than a synthesis, rather than keep it hidden. 
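> 
> For anyone who wants to reproduce that kind of synthesis, a minimal sketch of tallying CWE frequencies from an NVD XML feed might look like the following (the feed URL and the vuln:cwe element layout here are my assumptions about the 2.0 schema, so check them against whatever NIST currently publishes):
> 
>     # Rough sketch only -- not the actual 2007 synthesis.
>     import gzip
>     import urllib.request
>     import xml.etree.ElementTree as ET
>     from collections import Counter
> 
>     # Hypothetical feed location; NIST has reorganized its feeds over time.
>     FEED = "https://static.nvd.nist.gov/feeds/xml/cve/nvdcve-2.0-2006.xml.gz"
> 
>     raw = gzip.decompress(urllib.request.urlopen(FEED).read())
>     root = ET.fromstring(raw)
> 
>     # Per-entry vulnerability elements live in the "vuln" namespace.
>     NS = {"vuln": "http://scap.nist.gov/schema/vulnerability/0.4"}
> 
>     # Each <entry> may carry a <vuln:cwe id="CWE-..."/> classification.
>     counts = Counter(
>         cwe.get("id")
>         for entry in root
>         for cwe in entry.findall("vuln:cwe", NS)
>     )
> 
>     for cwe_id, n in counts.most_common(10):
>         print(cwe_id, n)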
> 
> 
> thanks,
> Andrew
> 
> 
> On Wed, Feb 27, 2013 at 9:13 AM, Dave Wichers <dave.wichers at owasp.org> wrote:
> Jerry.
> 
>  
> 
> Michael and I agree with you. We are working on it. Stay tuned. I've just been extremely busy at work and in my personal life for the past 10 days, so I've been pretty quiet on the lists. Don't take my silence as disagreement, but rather as a lack of time.
> 
>  
> 
> Jeff and I have been working on making the Top 10 more formalized and more open with each release, and we are happy to continue to make improvements in this area. Up through 2007 the project simply published the list and said "here it is." In 2010 we opened it up to a formal open comment period for the first time. In 2013, we can do even more. Thanks for your suggestions.
> 
>  
> 
> -Dave
> 
>  
> 
> From: owasp-leaders-bounces at lists.owasp.org [mailto:owasp-leaders-bounces at lists.owasp.org] On Behalf Of Jerry Hoff
> Sent: Tuesday, February 26, 2013 4:56 PM
> To: Michael Coates
> Cc: owasp-topten-project; OWASP Leaders
> 
> 
> Subject: Re: [Owasp-leaders] OWASP Top 10 - Proposal for a Temporary Injunction
> 
>  
> 
> Hello all,
> 
>  
> 
> Considering that most appsec professionals live and die by the Top 10, getting community support behind the methodology, data, trends, and statistics that should go into the Top 10 is vital.
> 
>  
> 
> Exactly as Michael says below, the mailing list is devoid of information - for something as critical to OWASP as the Top 10, this seems to completely contradict the community's spirit of open discussion. 
> 
>  
> 
> I'm looking forward to the wiki, which will hopefully shed some light on the process, but I still firmly believe that there needs to be a community-wide discussion on:
> 
>  
> 
>             1. The methodology (once released, a discussion around "is this the proper methodology?")
> 
>             2. Data and data sources
> 
>             3. Trends
> 
>             4. A more diverse panel to finalize a true OWASP Top 10 - 2013 that reflects as accurately as possible the true ranking of current web appsec risks 
> 
>  
> 
> Jeff, Dave - you guys more than anyone have contributed to the overall awareness of application security.  It is in everyone's interest - including the OWASP community and the developer community - to ensure the Top 10 is as accurate as possible.  Isn't opening the Top 10 to a more diverse set of security professionals in the interests of us all?  
> 
>  
> 
> Folks - this is important.  This is not just any OWASP project, this is the top 10.  The decisions made here, about this issue, will impact the security of people worldwide.  We should take the time and put in the effort to make sure, as a community, that we get this right! 
> 
>  
> 
> Leaders, this is very important.  Please chime in and make your voice heard on this issue.
> 
>  
> 
> Jerry
> 
>  
> 
> --
> Jerry Hoff
> 
> @jerryhoff
> jerry at owasp.org
> 
> 
>  
> 
> On Feb 26, 2013, at 11:08 AM, Michael Coates <michael.coates at owasp.org> wrote:
> 
> Ryan,
> 
> Thanks for that info. I think that helps provide some background, but Jerry's first two questions are still very relevant, and the current info we publish doesn't seem to address them:
> 
> - What is the methodology used to decide on the "Top 10 risks"?
> 
> - Who exactly is involved in the selection and ordering of these risks?
> 
>  
> 
> One concern I have is that we (OWASP) don't know this information either. The place where the conversation should have happened - the top ten mailing list - does not have any information. http://lists.owasp.org/pipermail/owasp-topten/
> 
> To reiterate Jerry's questions: where was the top 10 discussed? How was it done? And who was involved in the decision making (not just in providing data)?
> 
> 
> Thanks,
> Michael
> 
>  
> --
> Michael Coates | OWASP | @_mwc
> michael-coates.blogspot.com
> 
>  
> 
> On Tue, Feb 26, 2013 at 10:29 AM, Ryan Barnett <ryan.barnett at owasp.org> wrote:
> 
> The page here (https://www.owasp.org/index.php/Top_10_2013-Introduction) lists the following -
> 
>  
> 
> "The OWASP Top 10 is based on risk data from 8 firms that specialize in application security, including 4 consulting companies and 4 tool vendors (2 static and 2 dynamic). This data spans over 500,000 vulnerabilities across hundreds of organizations and thousands of applications. The Top 10 items are selected and prioritized according to this prevalence data, in combination with consensus estimates of exploitability, detectability, and impact estimates."
> 
>  
> 
> And the Acknowledgements section on that same page lists the 8 data sources:
> 
>  
> 
> - Aspect Security
> - HP (Results for both Fortify and WebInspect)
> - Minded Security
> - Softtek
> - TrustWave
> - Veracode – Statistics
> - WhiteHat Security Inc. – Statistics
>  
> 
> My concerns are not with the data sources provided, but that they mainly factor in only vulnerability prevalence.  We need to factor in more data sources related to attack frequency/likelihood.  Some example resources:
> 
> - WASC Web Hacking Incident Database
> - Akamai State of the Internet Reports
> - FireHost's Web Application Attack Reports
> - Imperva's Web Application Attack Reports
> 
> These are just a few examples of real attack data that we should consider, both for which items are included and for their priority ordering.
> 
>  
> 
> -Ryan
> 
>  
> 
>  
> 
> From: Michael Coates <michael.coates at owasp.org>
> Date: Tuesday, February 26, 2013 1:05 PM
> To: owasp-topten-project <owasp-topten-project at owasp.org>
> Cc: OWASP Leaders <owasp-leaders at lists.owasp.org>
> Subject: Re: [Owasp-leaders] OWASP Top 10 - Proposal for a Temporary Injunction
> 
>  
> 
> Given the importance of the OWASP Top 10, its use throughout the world, and the fact that it is (for many organizations) one of our initial points for spreading the OWASP mission, I think these are valid questions.
> 
> Can someone from the Top 10 project provide insight on these questions? Do we have this type of information already published? If not, we should capture and publish it. Other organizations will likely want to better understand both our methodology and our comprehensive understanding of the landscape that led to our recommendations.
> 
> 
> Thanks,
> 
> --
> Michael Coates | OWASP | @_mwc
> michael-coates.blogspot.com
> 
>  
> 
> On Tue, Feb 26, 2013 at 1:47 AM, Jerry Hoff <jerry at owasp.org> wrote:
> 
> 
> Hello leaders!
> 
>  
> 
> As we all know, the OWASP Top 10 - 2013 release candidate list was made available on Feb 15, 2013.  Since then, there has been a lot of controversy on a number of points.  The most pressing - in my opinion - is the methodology used to come up with the top 10.
> 
>  
> 
> The OWASP top 10 is BY FAR the most visible and recognizable project in OWASP.  It is used as a de facto standard by which countless organizations around the globe measure their application security.  The issues that are covered by the Top 10 will receive the vast majority of attention from security teams around the world - if there are redundant or needless entries, huge amounts of money will be wasted, and if important risks are not listed on the top 10, they will be largely ignored.
> 
>  
> 
> That said - I would like to raise the following issues:
> 
>  
> 
> - What is the methodology used to decide on the "Top 10 risks"?
> 
> - Who exactly is involved in the selection and ordering of these risks?
> 
> - What assurance do organizations have that this list is a true reflection of webapp-related risk?
> 
> - Since this is OWASP's most visible flagship project, what assurance do WE in the OWASP community have that this list is an accurate accounting of the top 10 risks intrinsic in web apps? 
> 
>  
> 
>  
> 
> Folks - OWASP has grown extremely rapidly in a short amount of time, and as the entire industry matures, doing things "like we've always done them" is, in my opinion, no longer sufficient.
> 
>  
> 
> The contents of this, our most popular project - one that essentially speaks for and represents all of OWASP - should not be selected arbitrarily and released publicly without answers to the questions above.  
> 
>  
> 
> Therefore, I respectfully ask Jeff and Dave not to release the OWASP Top 10 - 2013 edition until a reasonable accompanying methodology and metrics are published outlining how these risks were decided upon.
> 
>  
> 
> Is this fair?  I absolutely respect the work Jeff and Dave have put into this project over the years - the top 10 is a large part of what made OWASP the organization it is today.  I also respect their rights as the project owners to do as they see fit.  However, since this project is so critical not only to OWASP, but to the world, I respectfully ask that:
> 
>  
> 
> - a methodology be published 
> 
> - an open conversation be held (perhaps convening a Top 10 Summit) to ensure the top 10 risks are in line with a reasonable methodology
> 
>  
> 
> Thank you,
> 
> Jerry
> 
> --
> Jerry Hoff
> 
> @jerryhoff
> jerry at owasp.org
> 
> 
> 
> _______________________________________________
> OWASP-Leaders mailing list
> OWASP-Leaders at lists.owasp.org
> https://lists.owasp.org/mailman/listinfo/owasp-leaders
> 
> 
