[Owasp-leaders] OWASP Top Ten: Project Activity?

Eoin Keary eoin.keary at owasp.org
Tue Jun 30 11:02:57 UTC 2015


Thanks Dave.
I actually thought it was Jeff :)

Looking forward to the new open model and the resulting Top 10. Yes, I neglected to mention IAST; good call.
I'm unsure whether RASP should be included as well. It's more about prevention than detection, after all. Thoughts?
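
On the data-format point in Dave's mail below, here is a minimal sketch of what a prevalence-data submission record and a simple pooled calculation could look like. All field names, provider names, and figures are purely hypothetical illustrations, not the project's actual format:

```python
from collections import Counter

# Hypothetical submission records, one per data provider.
# Field names and figures are illustrative only.
submissions = [
    {"provider": "VendorA", "method": "DAST",
     "apps_tested": 200, "findings": {"XSS": 120, "SQLi": 45, "CSRF": 30}},
    {"provider": "VendorB", "method": "SAST",
     "apps_tested": 150, "findings": {"XSS": 90, "SQLi": 60, "Injection": 20}},
]

def prevalence(subs):
    """Fraction of tested apps affected per vulnerability category,
    pooled across all providers to dampen single-source skew."""
    total_apps = sum(s["apps_tested"] for s in subs)
    counts = Counter()
    for s in subs:
        counts.update(s["findings"])
    return {vuln: n / total_apps for vuln, n in counts.items()}

stats = prevalence(submissions)  # e.g. XSS: (120 + 90) / 350 = 0.6
```

Pooling many providers this way is exactly why a single agreed submission format matters: the aggregation stays trivial, and anyone can re-run it on the published data.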




Eoin Keary
OWASP Volunteer
@eoinkeary



> On 30 Jun 2015, at 03:38, Dave Wichers <dave.wichers at owasp.org> wrote:
> 
> Hey everyone,
> 
> I've been remiss in telling the OWASP community about the Top 10 project plans for a 2016 release. This thread has reminded me to do so (so thanks for that).
> 
> Historically, we've produced a new OWASP Top 10 every 3 years because this seems to balance the tempo of change in the AppSec market, all the work everyone does to map their tool/process/other thing to each version of the OWASP Top 10, and the effort required to produce it. We've been producing a new one every three years since 2004 (i.e., 2007/2010/2013), and so a new version for 2016 is due. (Definitely not happening in 2015).
> 
> However, we've been thinking about what might change in a 2016 release of the Top 10 and we don't actually think it would change much, if at all, which is kind of sad actually. I suspect some Top 10 items might move up or down based on the vulnerability prevalence statistics that we would need to gather and process, but I have my doubts that any new vulnerability types would break into the Top 10.
> 
> As such, given that we don't expect the list to actually change in any substantial way, the project has decided to defer the next update to a 2017 release. We will start gathering data for this release in (hopefully early) 2016 so a draft is ready by the end of the year, and then we'll have an open comment period and then an updated and final release, just like we do for every other release of the OWASP Top 10.
> 
> As I promised 2+ years ago, we are going to be much more open about the production of the 2017 OWASP Top 10 draft. We are going to issue an open call for data in which anyone can participate. As part of this open call, we plan to define a required format for data submissions, and indicate that this data will be made public so everyone can analyze it, not just the Top 10 project.
> 
> This data call is about vulnerability prevalence statistics (i.e., what is found by consultants/testers, and what is found by tools/services). We'd also love to collect data about which vulnerabilities are actually being exploited and what damage is being done. Right now, the Top 10 lists a set of factors that define the risk level of each item, but only one of them is based on statistics; the others are based on our professional opinion. If anyone has ideas for other statistics we could collect to help guide our analysis of what truly are the Top 10 Risks to Web Applications, please let the project know.
> 
> Thanks, Dave
> 
> Dave Wichers
> OWASP Top 10 Project Lead
> 
> p.s. And here are the answers to Eoin's questions:
> 
> * Who is the project lead for the top 10?  Answer: Me (Dave Wichers) - Eoin - you know that :-)
> * Can we ask other folks to supply similar data also? Answer: Of course. We are going to make a public data call.
> * Should we have a call to the leaders list?  Answer: Yes - when we are ready for it.
> 
> * Should we include both DAST and SAST metrics? I think we should. Answer: Yes, of course, and we have done so in the past. DAST means dynamic analysis (tools like ZAP/Burp/WebInspect); SAST means static analysis. We should also include runtime analysis (Gartner calls this IAST) now that this new category of tools has emerged.
> 
> * Metrics should be validated and verified so as to remove all false positives and not skew the stats. Answer: Not sure about this. We take results provided by vendors and consultants and presume they have already vetted them. We get results from LOTS of providers (more every release), which helps avoid skew both from bad data from any one source and from differences between what tools in general find, what different types of tools find, and what human analysts find.
> 
> 
> 
>> On Sat, Jun 27, 2015 at 4:56 AM, Eoin Keary <eoin.keary at owasp.org> wrote:
>> Hi Timo,
>> 
>> Metrics for the Top 10 from us will be cleaned and sorted :)
>> In a spreadsheet, XML, or whatever format you need. The same data is used for our own vulnerability stats report.
>> 
>> Who is the project lead for the top 10?
>> Can we ask other folks to supply similar data also? 
>> Should we have a call to the leaders list?
>> Should we include both DAST and SAST metrics? I think we should.
>> Metrics should be validated and verified so as to remove all false positives and not skew the stats.
>> 
>> 
>> 
>> Eoin Keary
>> OWASP Volunteer
>> @eoinkeary
>> 
>> 
>> 
>>> On 27 Jun 2015, at 09:40, Timo Goosen <timo.goosen at owasp.org> wrote:
>>> 
>>> Thanks, that would be great. Will the data need to be processed?
>>> I'm thinking we can turn this into one of the sessions at AppSec USA Project Summit.
>>> I'd be happy to lead it if I am at the summit.
>>> 
>>> 
>>> Regards.
>>> Timo
>>> 
>>>> On Fri, Jun 26, 2015 at 11:14 AM, Eoin Keary <eoin.keary at owasp.org> wrote:
>>>> We have thousands of sanitised vulnerability records from our SaaS service, covering multiple industry verticals and tech stacks globally.
>>>> 
>>>> Both app-layer CVEs (known vulns) and coding issues (SQLi, XSS, etc.). We have this data to donate to the statistical model when required.
>>>> 
>>>> Eoin.
>>>> 
>>>> Eoin Keary
>>>> OWASP Volunteer
>>>> @eoinkeary
>>>> 
>>>> 
>>>> 
>>>>> On 26 Jun 2015, at 12:01, Timo Goosen <timo.goosen at owasp.org> wrote:
>>>>> 
>>>>> https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project
>>>>> 
>>>>> 
>>>>> This is one of the best-known OWASP projects. The OWASP Top Ten currently has a list for 2013, but nothing more recent. This is a flagship project, and I feel it needs to bring out some new content, both because it is so widely known and because the world of infosec moves really fast; two years is a lifetime in our field.
>>>>> 
>>>>> I don't have much say in this project, but I'd like to see a Top Ten for 2015, with research to back up the statistics. If the people on the project don't have time to produce this, then I suggest we create a budget and request funding for someone to put time into it.
>>>>> 
>>>>> 
>>>>> Would like your thoughts on the matter.
>>>>> 
>>>>> Regards.
>>>>> Timo
>>>>> -- 
>> 
>> _______________________________________________
>> OWASP-Leaders mailing list
>> OWASP-Leaders at lists.owasp.org
>> https://lists.owasp.org/mailman/listinfo/owasp-leaders
> 

