[Owasp-leaders] Follow-up from the Code Review Survey @ AppSec USA

Gary Robinson gary.robinson at owasp.org
Mon Oct 12 08:34:12 UTC 2015


Hi Paolo,

A few people have asked for this, so we are going to put together a blog
post on the results that can be used as a data point.  I hope to have it
done by the end of this week.  I'll send out the link.

Gary

On Mon, Oct 12, 2015 at 9:03 AM, Paolo Perego <thesp0nge at owasp.org> wrote:

> Great Gary, can I use that data and those graphics in a post on my blog?
> Have you got a link to mention as the information source?
>
> Paolo
>
> On Sat, Oct 10, 2015 at 6:11 PM, Gary Robinson <gary.robinson at owasp.org>
> wrote:
>
>> Hi All,
>>
>> As some of you may know, the OWASP Code Review Guide team ran a survey of
>> attendees at AppSec USA.  We wanted to find out how attendees rated the
>> effectiveness of various security tools/reviews at finding issues such as
>> business logic problems or each of the OWASP Top 10.  Our intention was to
>> evaluate whether Secure Code Review (the topic of our guide) is seen as an
>> effective security process in an organization's SDLC.  These results (below)
>> will be included in the next version of the guide.
>>
>> We want to thank everyone who took part and share the results of the
>> survey (it is Security Awareness Month, after all).  In the first part of
>> the survey we asked attendees to rate which of the following security
>> tools/reviews were the most effective at finding:
>>
>> 1) General security vulnerabilities
>> 2) Privacy issues
>> 3) Business logic bugs
>> 4) Compliance issues (such as HIPAA, PCI, etc.)
>> 5) Availability issues
>>
>> The results are as follows:
>>
>> [Chart: GeneralSecurityIssuesChart.png, attached]
>>
>> Next we concentrated on the OWASP Top 10 issues; this time the results
>> were as follows:
>>
>> [Chart: OWASPTop10Chart.png, attached]
>>
>> Please feel free to make use of this survey in whatever way you want.
>> Also feel free to discuss any of the outcomes, for example:
>>
>> a) A high percentage of people preferred manual pen testing as a way of
>> detecting availability/traffic load issues.  Is this specific to any tool,
>> or is it simply because 'load' or 'DoS' testing was not an option?
>> b) For A1, Injection, source code scanning was three times more popular
>> than manual pen testing.  Does that match your experience?
>> c) For A9, Using Components with Known Vulnerabilities, automated
>> vulnerability scans were far more popular than the rest.
>>
>> Just to note, this type of activity was a great outcome of the Project
>> Summit, which took place before the conference.  This survey is just one of
>> the many valuable things to come from that summit.  Thanks to Larry for
>> digitizing this info.
>>
>> Best of luck,
>>
>> Gary Robinson
>> Larry Conklin
>>
>> _______________________________________________
>> OWASP-Leaders mailing list
>> OWASP-Leaders at lists.owasp.org
>> https://lists.owasp.org/mailman/listinfo/owasp-leaders
>>
>>
>
>
> --
> "... static analysis is fun, again!"
>
> OWASP Orizon project leader, http://github.com/thesp0nge/owasp-orizon
> OWASP Esapi Ruby project leader,
> https://github.com/thesp0nge/owasp-esapi-ruby
>
-------------- attachments --------------
GeneralSecurityIssuesChart.png (image/png, 87964 bytes):
<http://lists.owasp.org/pipermail/owasp-leaders/attachments/20151012/09c19b3f/attachment-0002.png>
OWASPTop10Chart.png (image/png, 290865 bytes):
<http://lists.owasp.org/pipermail/owasp-leaders/attachments/20151012/09c19b3f/attachment-0003.png>

