[Owasp-leaders] AppSecUSA 2013 CFP Metrics

Josh Sokol josh.sokol at owasp.org
Tue Mar 12 16:46:49 UTC 2013


AppSecUSA 2012 followed a selection model kinda like this:

   - Presentation: Attack or Defense?
   - Presentation: Relevance to Industry Trends
   - Presentation: Tool Release?
   - Presentation: Vendor Neutrality
   - Speaker: Industry Recognition
   - Speaker: Presentation Skills

With the exception of "Presentation Skills," all of these are on a
three-point scale: you either have it, it's unknown, or you don't have it.
For example, the rating for "Relevance to Industry Trends" would be 1) Old
Topic, 2) Unknown or N/A, and 3) New Topic.  This is meant to keep things
simple while still providing a clear set of criteria for how speakers and
presentations are evaluated.
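
To make that concrete, here is a minimal sketch (hypothetical Python; the
function name and boolean input are assumptions, but the 1/2/3 encoding
follows the example above):

    # Three-point scale: 1 = don't have it, 2 = unknown, 3 = have it.
    def rate_relevance(topic_is_new):
        """Rating for "Relevance to Industry Trends":
        1) Old Topic, 2) Unknown or N/A, 3) New Topic."""
        if topic_is_new is None:  # unknown or N/A
            return 2
        return 3 if topic_is_new else 1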

Ultimately the decision structure came down to something like this:

   - Ridiculous/Reject immediately (30%) - Incomplete, bad, or otherwise
   boring talks that no one will want to sit through
   - Mediocre (10%) - Just about every high-level/management/APT talk ends
   up here because it's always the same and bores people to tears
   - Average (20%) - Offensive/Defensive talks with average presenters
   - Good (30%) - Offensive/Defensive talks with good presenters
   - Excellent/Must have (10%) - 0day talks, talks with tools, and
   Offensive/Defensive talks with excellent presenters
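
If you wanted to automate that bucketing, it might look roughly like this
(hypothetical Python; the score cut-offs are invented for illustration,
and the percentages above describe how 2012 submissions actually fell
out, not fixed quotas):

    # Hypothetical mapping from an aggregate reviewer score to the
    # tiers above; the cut-off ratios are assumptions.
    def tier(total_score, max_score):
        ratio = total_score / max_score
        if ratio < 0.35:
            return "Ridiculous/Reject immediately"
        if ratio < 0.50:
            return "Mediocre"
        if ratio < 0.70:
            return "Average"
        if ratio < 0.90:
            return "Good"
        return "Excellent/Must have"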

I'm not saying that this is the right way to do it or that your proposal is
the wrong way.  I just wanted to provide the data, since this approach
seemed to work well for us and our selection committee.

~josh

On Wed, Mar 6, 2013 at 11:23 AM, Lucas Ferreira <lucas.ferreira at owasp.org> wrote:

> Hello Israel,
>
> I would suggest a different approach: define a format for the submissions
> and let the committee decide among all complete submissions based on
> their contents.
>
> Here is a small outline of the process, which was used successfully at
> AppSec Latam from 2009 to 2011:
>
> 1. Define what a complete submission is. We required a form to be
> completed with info on the proposal, such as title, description, audience,
> and outline, and also info on the presenter, such as past experience,
> published materials, etc. You can also add criteria for presentation time
> here (a sketch of this completeness check follows the list).
> 2. Screen out submissions that are not related to the topic of the
> conference.
> 3. The committee grades the proposals and puts out an ordered list. The
> criteria we used are below. Several of the items in your list are good
> criteria that I would use to boost a proposal's chances of being selected,
> but that I would not use for discarding proposals.
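>
> A minimal sketch of the completeness check in step 1 (hypothetical
> Python; the field names mirror the form fields mentioned above):
>
>     REQUIRED_FIELDS = [
>         "title", "description", "audience", "outline",
>         "presenter_experience", "published_materials",
>     ]
>
>     def is_complete(submission):
>         """Only submissions with every required field filled in go
>         on to the committee (submission is a plain dict here)."""
>         return all(submission.get(field) for field in REQUIRED_FIELDS)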
>
> The criteria used by the committee were:
>
> 1. Contents (weight 4)
>    a. Relevant to the conference (weight 3)
>    b. Innovative and original (weight 2)
>    c. Quality of contents and presentation (weight 1)
> 2. Author (weight 3)
>    a. Involvement in OWASP (weight 1)
>    b. Previous relevant contents in the field (weight 2)
>    c. Qualification of author(s) (weight 3)
> 3. Evaluation assessment (weight 3)
>    a. Reviewer familiarity with the theme and confidence in their
>    assessment (weight 3)
>    b. Reviewer grading (gut feeling) (weight 2)
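>
> As a very rough sketch, these weights might combine into a single grade
> like this (hypothetical Python; the weights come from the list above,
> while the per-item grades and the nested group-times-item weighting are
> assumptions):
>
>     # Weights copied from the criteria above; key names are shorthand.
>     WEIGHTS = {
>         "contents": (4, {"relevant": 3, "innovative": 2, "quality": 1}),
>         "author": (3, {"owasp_involvement": 1, "previous_work": 2,
>                        "qualification": 3}),
>         "evaluation": (3, {"familiarity": 3, "gut_feeling": 2}),
>     }
>
>     def proposal_score(grades):
>         """grades: {group: {item: numeric grade}} -> weighted total."""
>         return sum(
>             group_weight * sum(item_weight * grades[group][item]
>                                for item, item_weight in items.items())
>             for group, (group_weight, items) in WEIGHTS.items()
>         )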
>
> The form for submissions is available on the wiki:
> https://www.owasp.org/images/e/e4/OWASP_AppSec_Latam_2011_CFP.rtf.zip
>
> I can share the spreadsheet we used for tabulating the results if you want.
>
> Regards,
>
> Lucas
>
>
> On Wed, Mar 6, 2013 at 11:39 AM, Israel Bryski <israel.bryski at owasp.org> wrote:
>
>> Hi Leaders,
>>
>> I am managing the call for papers (CFP) and call for trainers (CFT)
>> process for the upcoming AppSecUSA in NYC (November 18, 2013). To separate
>> the real talks from the noise during the submission process, I drafted a
>> set of metrics to screen submissions. Talks receiving a minimum score of
>> <Insert Number> will be sent to the Speaker Selection committee for
>> review, comments, and a vote. Submissions that do not reach the
>> predetermined threshold will be sent back to the author with guidance on
>> how to improve them.
>>
>> Here is the first draft of the metrics, and I welcome your feedback,
>> comments, and criticism:
>>
>>
>> Criteria and points:
>>
>> - Presentation related to an active OWASP Project or activity: 2 points
>> - Submission includes a link to a recording of a previous presenter
>> performance: 2 points
>> - Active volunteer for an OWASP Chapter, Global Committee, or Project:
>> 1 point
>> - Presentation or training material is OWASP branded: 1 point
>> - Presentation includes a live demo: 1 point
>> - Terms and acronyms defined throughout the presentation for the average
>> user: 1 point
>> - For speakers: presentation duration is 40-50 minutes: 1 point
>> - For trainers: session will take 1 or 2 full days: 1 point
>> - Previous experience training/speaking: 1 point
>> - Complete abstract: 1 point
>> - <Insert Your Metric Here>
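>>
>> A minimal sketch of how this screen could be tabulated (hypothetical
>> Python; the point values come from the list above, and the passing
>> threshold stays a parameter because it is still <Insert Number>):
>>
>>     # Point values copied from the criteria above; key names are
>>     # shorthand invented for this sketch.
>>     POINTS = {
>>         "related_to_active_owasp_project": 2,
>>         "recording_of_previous_talk": 2,
>>         "active_owasp_volunteer": 1,
>>         "owasp_branded_material": 1,
>>         "live_demo": 1,
>>         "terms_and_acronyms_defined": 1,
>>         "speaker_duration_40_50_min": 1,
>>         "trainer_1_or_2_full_days": 1,
>>         "previous_speaking_experience": 1,
>>         "complete_abstract": 1,
>>     }
>>
>>     def passes_screen(criteria_met, threshold):
>>         """criteria_met: iterable of POINTS keys the submission
>>         satisfies; threshold was left as <Insert Number> above."""
>>         return sum(POINTS[c] for c in criteria_met) >= threshold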
>>
>> Thanks,
>> Israel
>>
>
>
> --
> Homo sapiens non urinat in ventum. [Latin: "Man does not urinate into
> the wind."]
>