[Owasp-leaders] AppSecUSA 2013 CFP Metrics

Lucas Ferreira lucas.ferreira at owasp.org
Wed Mar 6 17:23:57 UTC 2013


Hello Israel,

I would suggest a different approach: define a format for the submissions
and let the committee decide based on the contents of all complete
submissions.

Here is a short outline of the process, which was used successfully for
AppSec Latam from 2009 to 2011:

1. Define what a complete submission is. We required a form to be completed
with info on the proposal, such as title, description, audience, and
outline, plus info on the presenter, such as past experience, published
materials, etc. You can also add criteria for presentation time here.
2. Screen out submissions that are not related to the topic of the
conference.
3. The committee grades the proposals and produces an ordered list. The
criteria we used are below. Several of the items in your list are good
criteria that I would use to boost a proposal's chances of being selected,
but that I would not use to discard proposals.

The criteria used by the committee were:

1. Contents (weight 4)
   a. relevant to the conference (weight 3)
   b. innovative and original (weight 2)
   c. quality of content and presentation (weight 1)
2. Author (weight 3)
   a. involvement in OWASP (weight 1)
   b. previous relevant work in the field (weight 2)
   c. qualifications of the author(s) (weight 3)
3. Evaluation assessment (weight 3)
   a. reviewer's familiarity with the theme and confidence in the
      assessment (weight 3)
   b. reviewer's grading (gut feeling) (weight 2)

The form for submissions is available on the wiki:
https://www.owasp.org/images/e/e4/OWASP_AppSec_Latam_2011_CFP.rtf.zip

I can share the spreadsheet we used for tabulating the results if you want.
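
In case it helps before then: the tabulation boils down to a weighted
average of weighted averages. Below is a minimal sketch in Python. The
weights are the ones listed above, but the 0-10 grade scale, the names,
and the exact aggregation are assumptions for illustration; the
spreadsheet may combine things slightly differently.

    # Hypothetical sketch of the weighted tabulation. Weights follow the
    # criteria list above; grade scale and aggregation are assumptions.

    CRITERIA = {
        # category: (category weight, {sub-criterion: weight})
        "contents":   (4, {"relevant": 3, "innovative": 2, "quality": 1}),
        "author":     (3, {"owasp_involvement": 1, "previous_work": 2,
                           "qualification": 3}),
        "assessment": (3, {"familiarity": 3, "gut_feeling": 2}),
    }

    def proposal_score(grades):
        """grades: {category: {sub-criterion: grade on a 0-10 scale}}."""
        total = 0.0
        for category, (cat_weight, subs) in CRITERIA.items():
            # Weighted average of the sub-criteria within the category.
            sub_avg = (sum(w * grades[category][name]
                           for name, w in subs.items())
                       / sum(subs.values()))
            total += cat_weight * sub_avg
        # Normalize by the category weights so the result stays on the
        # same 0-10 scale as the individual grades.
        return total / sum(w for w, _ in CRITERIA.values())

    def rank(proposals):
        """proposals: {title: grades}; returns titles, best first."""
        return sorted(proposals,
                      key=lambda t: proposal_score(proposals[t]),
                      reverse=True)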

Regards,

Lucas


On Wed, Mar 6, 2013 at 11:39 AM, Israel Bryski <israel.bryski at owasp.org> wrote:

> Hi Leaders,
>
> I am managing the call for papers (CFP) and call for trainers (CFT)
> process for the upcoming AppSecUSA in NYC (November 18, 2013). To separate
> the real talks from the noise during the submission process, I drafted a
> metric system to screen submissions. Talks receiving a minimum score of
> <Insert Number> will be sent to the Speaker Selection committee for review,
> comments and vote. The submissions that do not reach the
> predetermined threshold will be sent back to the author with guidance on
> how to improve their submission.
>
> Here is the first draft of the metrics and I welcome your feedback,
> comments and criticism:
>
>
> Criteria and points:
>
> - Presentation related to an active OWASP Project or activity (2 points)
> - Submission includes link to recording of previous presenter performance
>   (2 points)
> - Active volunteer for an OWASP Chapter, Global Committee or Project (1 point)
> - Presentation or training material is OWASP branded (1 point)
> - Presentation includes a live demo (1 point)
> - Terms or acronyms defined throughout the presentation for the average
>   user (1 point)
> - For speakers: presentation duration is 40-50 minutes (1 point)
> - For trainers: session will take 1 or 2 full days (1 point)
> - Previous experience training/speaking (1 point)
> - Complete abstract (1 point)
> - <Insert Your Metric Here>
>
> Thanks,
> Israel
>


-- 
Homo sapiens non urinat in ventum. (Homo sapiens does not urinate into the wind.)