[Owasp-leaders] OWASP Top 10 Summit Outcomes

Tin Zaw tin.zaw at owasp.org
Mon Jun 12 18:21:32 UTC 2017


Hi Andrew,

Thank you so much for taking over the project and running with it.

I am happy to assist as much as I can with A7. I am CC'ing *Colin* here in
case he wants to join remotely as well. It will be very early morning for
me on Tuesday in California, but I will try to wake up early and join. Colin
and I are co-leaders of the OWASP Automated Threats project, and our
perspective is that automation (both automated defense and defense against
automated threats) should be part of the Top 10.

I also agree with applying data science to the data analysis. I am not a data
scientist, but I have worked with data scientists in producing the Verizon Data
Breach Report (which is based on statistical analysis of collected sample
data). I am happy to share my experience -- especially with normalizing
disparate sets of data -- when the time comes for the next Top 10 (for 2020).
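
To make the normalization point concrete, here is a minimal sketch of what I
mean (my own illustration, not the actual Top 10 methodology): contributors
report findings against very different sample sizes, so raw counts are
converted to incidence rates before they are combined. The contributor names,
categories, figures, and the simple averaging approach below are all
assumptions for illustration only.

# Minimal sketch: normalize disparate vulnerability data sets to incidence rates.
# All names and figures below are hypothetical, for illustration only.
from collections import defaultdict

submissions = [
    {"contributor": "VendorA", "apps_tested": 5000,
     "findings": {"Injection": 1100, "XSS": 2300, "Known Vulnerable Components": 1200}},
    {"contributor": "ConsultancyB", "apps_tested": 40,
     "findings": {"Injection": 22, "XSS": 31, "XXE": 5}},
    {"contributor": "VendorC", "apps_tested": 900,
     "findings": {"Injection": 180, "XSS": 400, "Known Vulnerable Components": 350}},
]

# Convert raw counts to "share of tested apps affected" (capped at 100%),
# then average across contributors so one very large data set cannot dominate.
# Missing categories count as zero incidence for that contributor.
rates = defaultdict(list)
for sub in submissions:
    for category, count in sub["findings"].items():
        rates[category].append(min(count / sub["apps_tested"], 1.0))

normalized = {cat: sum(vals) / len(submissions) for cat, vals in rates.items()}

for cat, rate in sorted(normalized.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cat:30s} {rate:.1%}")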

Thanks.


On Mon, Jun 12, 2017 at 7:48 AM, Eoin Keary <eoin.keary at owasp.org> wrote:

> Great stuff Andrew 👍
>
> @eoinkeary
> OWASP since 2004!!
>
> On 12 Jun 2017, at 15:41, Andrew van der Stock <vanderaj at owasp.org> wrote:
>
> Hi there,
>
> Today was a pretty good day. We got through a lot of technical discussion
> on the OWASP Top 10 2017, and documented some outcomes.
>
> *Session 1: OWASP Top 10 Governance and rationale*
>
> In this first session, we talked about the history of the OWASP Top 10,
> and how it evolved over time. In 2004, there was no data backing the
> standard. In 2007, we used only CVE data for the analysis, and we used our
> judgement to fit in CSRF as an issue. In 2010 and 2013, the forward-looking
> issue was out-of-date components, which, in one analysis mapping the OWASP
> Top 10 to breach data, represents a full 24% of all data breaches.
> Therefore, there should always be room for forward-looking inclusions.
>
>
>    - The audience is everyone in appsec, not just developers.
>    - The basis for the OWASP Top 10 is "risks". I have suggested we adopt
>    the ISO 31000:2015 standard definition of risk, so that we have the same
>    understanding as risk managers in most parts of the world.
>    - We will document the rationale for the OWASP Top 10 on the wiki, and
>    possibly at least a part of it in the document this year. This will be the
>    same rationale / process going forward for 2020 and 2023.
>    - I will take a motion to the Board asking for a change to the Project
>    Leader Handbook, giving Flagship projects a six-month grace period to
>    obtain at least two leaders from two different firms, to avoid vendor
>    lock-in, whether actual or perceived. There is no restriction at the
>    moment on a single leader from a single company controlling a Flagship
>    project, so I want to de-couple the other issues from the independence
>    issue.
>    - There will be a transparent and documented decision to ensure that
>    up to two of the OWASP Top 10 issues will be forward-looking, and that the
>    community should drive the consensus on what they will be. I will open up
>    a discussion on OWASP Leaders and elsewhere with a short vote on what the
>    two for 2017 should be, including the existing two issues, XXE and object
>    serialization, and I'll figure one out from the also-rans of the data
>    collection process.
>
>
> *Session 2: OWASP Top 10 Data Collection Process*
>
> We talked about the way data was collected and the process by which it was
> analysed. For 2017, there was an open call for data, but it wasn't widely
> publicised nor pushed once open, and this might have resulted in fewer
> responses than in a perfect world. There was a lot of discussion around the
> process: whether we use data scientists, whether we can use the existing
> data, and whether we re-open the data collection. It was an incredibly
> valuable discussion, and I think it struck a good pragmatic balance. We want
> to drive a release this year, but RC2 will not come out this week, so we
> will not be running editing / creating sessions this week, but will instead
> work on getting a bit more data in.
>
> The outcomes from this session are:
>
>    - A data collection process and timeline will be published to the wiki
>    to make sure everyone knows how data is collected and analysed, and when
>    the next data call will be held. Some of this will appear in the text,
>    probably as an appendix, to make sure that our process is transparent and open.
>    - I will work on a process with Foundation staff to ensure that we can
>    maximise publicity for the next data call round in 2019. There was
>    discussion of keeping it open all the time, but honestly, unless we can get
>    a data scientist to volunteer, I doubt we could make use of that
>    contribution. For smaller consultancies, obtaining this data is already
>    difficult, and we don't want folks to be overly burdened by the data call.
>    - A data call extension will be pushed out for interested parties. I
>    will do this tomorrow as it's quite late here already. As long as data is
>    roughly in the same Excel format as the existing call and provided by the
>    end of July, I think we can use it.
>    - Dave will reach out to Brian Glas to obtain feedback for tomorrow's
>    data weighting session to be held in the morning.
>    - For 2020, we will try to find data scientists to help us to improve
>    our data methodology and analysis, so that, for the non-forward-looking data
>    at least, we can ensure that data drives inclusion.
>    - Ordering will never be strictly data order; to provide continuity,
>    there is a decision (which will now be documented) that if A1 ... A3 in
>    2010 are the same in 2017 but in a slightly different order, those will
>    retain the previous order. This helps folks map results year on year and
>    prevents massive upheaval between years.
>    - Feedback obtained from the OWASP Top 10 mailing list will end up on
>    GitHub tomorrow as issues. For feedback sent privately to Dave, I will
>    reach out to those individuals to ask permission to create issues on
>    GitHub. This will help with project transparency. From now on, if you
>    have feedback, please provide it at GitHub: https://github.com/OWASP/Top10/issues
>
> This session kept coming back to the weighting, and we looked at that
> briefly. However, we have a session tomorrow morning for that, so I would
> suggest participants look over the following blog posts before the session
> to see where we can make improvements, either this time around (and
> document it!), or for 2020 and beyond. That's for tomorrow.
>
>
>    - https://nvisium.com/blog/2017/04/18/musings-on-the-owasp-top-10-2017-rc1/
>    - https://nvisium.com/blog/2017/04/24/musings-on-the-owasp-top-10-2017-rc1-pt2/
>    - https://snyk.io/blog/owasp-top-10-breaches/
>
>
> I do want to point out that we probably should include impact, as that's
> part of a traditional risk rating, but we also need to be fair to all data
> participants when weighting supplied data. Once we've made a decision, that
> will be documented for 2017, and we will obtain advice for 2020 and 2023.
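>
> To make the weighting question concrete, here is a minimal sketch (purely
> illustrative, not a decision from the summit) of combining an incidence
> figure with an assumed impact score. Every category, number, and the
> likelihood-times-impact formula below is a hypothetical example, not the
> agreed methodology.
>
> # Hypothetical weighting sketch: incidence (a likelihood proxy) times impact.
> # All categories, incidence figures, and impact scores are made up.
> categories = {
>     "Injection": {"incidence": 0.22, "impact": 3},   # impact: 1 = low, 3 = severe
>     "XSS": {"incidence": 0.45, "impact": 2},
>     "Known Vulnerable Components": {"incidence": 0.24, "impact": 3},
> }
>
> for name, c in sorted(categories.items(),
>                       key=lambda kv: kv[1]["incidence"] * kv[1]["impact"],
>                       reverse=True):
>     print(f"{name:30s} weighted score = {c['incidence'] * c['impact']:.2f}")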
>
> Lastly, we worked on what the sessions will be. Considering the decisions
> taken in consensus, it will not be possible to release an RC2 this week,
> especially if we take more data. So we will return to looking over A7 in
> the Tuesday afternoon session and A10 in the Wednesday morning session.
> These are new forward-looking items, and maybe, with sufficient disclosure
> and possibly a bit of rewording or re-ordering, we might be able to include
> them. Let's work it out. Please participate.
>
> thanks,
> Andrew
>
> _______________________________________________
> OWASP-Leaders mailing list
> OWASP-Leaders at lists.owasp.org
> https://lists.owasp.org/mailman/listinfo/owasp-leaders
>
>


-- 
Tin Zaw
OWASP Volunteer
Google Voice: (213) 973-9295
LinkedIn: http://www.linkedin.com/in/tinzaw
AppSec California: https://2017.appseccalifornia.org/

