[Owasp-topten] OWASP Top 10 2017 Project Update

Eric Mann eric at tozny.com
Wed Oct 11 03:26:20 UTC 2017


Just out of curiosity, is the RC2 draft under way? The preliminary schedule
listed its release as yesterday, but with no updates here I'm at a loss as
to where to turn for more information. Likewise, the GitHub issues list
doesn't reveal anything further.

Can we get a quick update?

On Fri, Aug 4, 2017 at 12:44 PM, Neil Smithline <neil.smithline at owasp.org>
wrote:

> The following is from
> https://owasp.blogspot.com/2017/08/owasp-top-10-2017-project-update.html.
>
> The OWASP Top 10 is the most heavily referenced, most heavily used, and
> most heavily downloaded document at OWASP. Therefore, it rightfully
> receives a greater level of scrutiny and review, as befits a Flagship
> project.
>
>
> The previous Top 10 leaders have passed the baton for this project on to a
> new team and we will strive to address the feedback that has been provided
> over the past few months. We have discussed as a team and at the OWASP
> Summit what steps must be taken and what changes must be made to the OWASP
> Top 10.
>
>
> A summary of changes is listed below; please read further to understand
> the reasoning behind them:
>
>    - The Top 10 will focus on Vulnerability Categories.
>    - Feedback on the mailing list has been moved to the Issues List
>    (https://github.com/OWASP/Top10/issues) on GitHub; please continue to
>    contribute feedback there.
>    - The content of the document will be extracted to provide easier
>    translations.
>    - Scoring for Top 10 entries is intended to be based on the Common
>    Weakness Scoring System (CWSS).
>    - For the 2017 Edition, 8 of 10 vulnerabilities will be selected from
>    data submitted via the call for data and 2 of 10 will be selected from an
>    industry-ranked survey.
>    - A ranked survey (https://goo.gl/forms/ltbKrdYrp4Qdl7Df2) is now
>    available for industry professionals to select two new vulnerability
>    categories for inclusion in the Top 10 2017. The deadline for the survey is
>    30 August, 2017.
>    - The call for data (https://goo.gl/forms/tLgyvK9O74r7wMkt2) is now
>    reopened to allow for additional data to be collected for analysis. The new
>    deadline for the extended data call is 18 September, 2017.
>    - The Top 10 2017 RC2 will be released for review and feedback on 9
>    October 2017.
>    - The final release of the Top 10 2017 is targeted for 18 November,
>    2017.
>
>
>
> [image: OWASP Top10 Timeline-v2.png]
>
>
> The OWASP Top 10 has always been about missing controls, flawed controls,
> or working controls that haven’t been used, which when present are commonly
> called vulnerabilities. We have traditionally linked the OWASP Top 10 to
> the Common Weakness Enumeration (CWE) list maintained by MITRE. We will
> continue to align with CWEs and use the CWSS scoring system to help
> provide an industry-standard measurement.
>
> For the Top 10 2017, we will be focusing on vulnerability categories.
> These categories will be mapped to one or more CWEs where possible. The
> scoring system for the Top 10 will be updated to leverage the CWSS as much
> as feasible. Like the Common Vulnerability Scoring System (CVSS) for
> specific Common Vulnerabilities and Exposures (CVEs), we intend to use
> CWSS for vulnerability categories. Where a category maps to multiple CWEs,
> we will use the high-water mark; if a vulnerability category has no
> matching CWE, we will do what we can to align a CWSS score.
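>
> A minimal sketch (in Python) of the high-water-mark idea, using purely
> hypothetical CWSS scores for the CWEs mapped to one category:
>
>     # Hypothetical CWSS scores for the CWEs mapped to one Top 10 category.
>     # The scores below are made up for illustration only.
>     cwss_scores = {"CWE-79": 81.0, "CWE-89": 93.1, "CWE-564": 76.4}
>
>     # High-water mark: the category inherits the highest score among its
>     # mapped CWEs.
>     category_score = max(cwss_scores.values())
>     print(category_score)  # 93.1
>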
> Although the OWASP Top 10 is partially data-driven, there is also a need
> to be forward looking. At the OWASP Summit we agreed that for the 2017
> Edition, eight of the Top 10 will be data-driven from the public call for
> data and two of the Top 10 will be forward looking and driven from a survey
> of industry professionals. The OWASP Top 10 will clearly identify which
> items are forward looking: we will use the CWSS score of these items (if a
> CWE for the issue exists) or our best judgement on where the issue will be
> ranked in the Top 10.
>
>
> The extended call for data can be accessed here:
> https://goo.gl/forms/tLgyvK9O74r7wMkt2
> The two items that are not data-driven will be supported by a qualitative
> survey. The survey comprises vulnerability categories drawn from those
> identified as “on the cusp,” from mailing list feedback, and from previous
> call for data feedback. Respondents should rank the four most important
> vulnerability categories based on their knowledge and experience. The two
> vulnerability categories with the highest total ranking will be included
> in the Top 10 2017. The information will also help us develop a plan to
> better structure the call for data for the OWASP Top 10 2020.
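>
> A minimal sketch (in Python) of how such a tally could work, assuming an
> illustrative points scheme (the exact weighting is not specified here) and
> made-up responses:
>
>     from collections import Counter
>
>     # Each response ranks four categories, most important first
>     # (category names and responses are made up for illustration).
>     responses = [
>         ["Deserialization", "XXE", "CSRF", "SSRF"],
>         ["XXE", "Deserialization", "SSRF", "Insufficient Logging"],
>         ["Deserialization", "Insufficient Logging", "XXE", "CSRF"],
>     ]
>
>     # Assumed weighting: rank 1 earns 4 points, rank 4 earns 1 point.
>     totals = Counter()
>     for ranking in responses:
>         for position, category in enumerate(ranking):
>             totals[category] += 4 - position
>
>     # The two categories with the highest totals would join the Top 10.
>     print(totals.most_common(2))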
>
>
> The survey can be accessed here: https://goo.gl/forms/ltbKrdYrp4Qdl7Df2
>
>
> Every single issue in the OWASP Top 10 should have a direct cause: either
> one or more missing or ineffective controls, or failure to use an in-place
> control. Every issue should be precise in its language and view (as in not
> intermingling the terms “weakness,” “vulnerability,” “threat,” “risk,” or
> “defect”) so each issue can be theoretically testable. This will help us
> make a stronger and more defensible list of included items.
> We aim to review and resolve ontological concerns, such as including
> issues that are not like the others. This means that in some circumstances,
> there should be a view from the Developer perspective (documented by the
> OWASP Proactive Controls) and a view for the Defending Blue Team
> (documented by the currently non-existent OWASP Defensive Controls).
> Every issue should contain clear and effective advice on remediation,
> deterrence, delay, and detection that can be adopted by any development
> team, no matter how small or how large. As the OWASP Top 10 covers
> important vulnerability categories, we should strive to make our advice
> easy to follow and easy to translate into other languages.
> From a methodology point of view, we are looking at taking lessons learned
> from 2017 and coming up with a better process for the OWASP Top 10 in 2020.
> We would like to coordinate with other teams to provide a staggered release
> of the other OWASP Top 10 efforts with sufficient time between each release
> to allow the industry to upgrade and adopt in a practical way.
> Lastly, we are opening up the text to provide history and traceability. We
> need to ensure that all of the issues documented within any of the various
> Flagship projects, but particularly the OWASP Top 10, can be addressed by
> developers and devops engineers without recourse to paid tools or services.
> There is value in the use of paid services and tools, but as an open (as in
> free, as in liberty) organization, the OWASP Top 10 should have a low
> barrier to entry and highly effective suggested remediations.
> Thank you, and we look forward to working with you on the OWASP Top 10.
>
>
> OWASP Top 10 Project Leaders
> Andrew van der Stock
> Neil Smithline
> Torsten Gigler
>
> Data Analyst
> Brian Glas
>
> --
> Neil Smithline
> OWASP Top-10 Co-Leader
> @neil_smithline <https://twitter.com/neil_smithline>
>
>
> _______________________________________________
> Owasp-topten mailing list
> Owasp-topten at lists.owasp.org
> https://lists.owasp.org/mailman/listinfo/owasp-topten
>
>


-- 
Eric Mann · Tozny, LLC <https://tozny.com/> · 971.238.0971 ·
@ericmann <https://twitter.com/ericmann>