[Owasp-leaders] Would the real OWASP please stand up!

Matt Tesauro mtesauro at gmail.com
Thu Sep 17 17:20:02 EDT 2009

Answers are inline below:

On Thu, 2009-09-17 at 16:40 +0100, Yiannis Pavlosoglou wrote:
> So I am sitting there coding away.. A little fuzzer, no more no less,
> 16 versions later, pet project, adding some new .NET payloads, new
> encodings, etc.
I'm very aware of your 'little fuzzer' - it's been in the Live CD since
my initial release.  You keep me on my toes to make sure I've got the
latest.  You've been very good about cranking out new versions/improvements.
It's a good tool - don't sell yourself short.
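(As an aside, the kind of payload-encoding work described above - generating
encoded variants of a fuzz string - can be sketched roughly like this.  This is
a minimal illustrative Python sketch; the `encodings` function and variant
names are hypothetical, not JBroFuzz code:)

```python
import urllib.parse

def encodings(payload):
    """Generate common encoded variants of a fuzz payload.

    Hypothetical helper for illustration: real fuzzers such as JBroFuzz
    ship many more encodings and mutation strategies than shown here.
    """
    once = urllib.parse.quote(payload, safe="")
    return {
        "plain": payload,
        # Single URL (percent) encoding.
        "url": once,
        # Double URL encoding: encode the already-encoded string again.
        "double-url": urllib.parse.quote(once, safe=""),
        # %uXXXX-style encoding, historically honored by some IIS stacks.
        "unicode": "".join(f"%u{ord(c):04x}" for c in payload),
    }

for name, variant in encodings("<script>").items():
    print(f"{name}: {variant}")
```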

> In the process I am wondering what happened to OWASP; how come no
> one finding vulnerabilities in web applications respects this
> organization anymore?
> * You turn up to any other security meeting, you don't even mention
> the acronym without getting looked badly upon
> * People actually tell me that they avoid going to particular chapter
> meetings, because they are sick and tired of presenters implicitly
> trying to sell their own company/service/tool
I'm fortunate as this hasn't been a problem at my local chapter.

> * Project leaders are thinking of pulling their projects from OWASP,
> because they are not into filling pamphlets, presentation slides and
> assessment criteria; simply they've got a new cool hack for, say, .NET
> input validation, embedded in a python script, document it and it just
> works! Did you ever see a pamphlet for apache 1.3.27?
    Please direct your flames my way.  I'm the primary author of the
Assessment Criteria v2.  Unfortunately, until we actually try to use
it, it's hard to get this kind of feedback.  That said, I LOVE this
feedback.  It's exactly what is needed.
     It's new and we're still finding the right balance between what is
required and what is optional.  There are some elements of it which need
to be optional.  Here are my thoughts:
* GPC needs to do a MUCH better job distinguishing what is required for
assessments and what isn't.  You've taken the brunt of the newness of
the criteria v2.  I'll gladly fall on this sword.  This is not Paulo's
fault - he is following what I wrote for v2.
* GPC needs to sort out the apps into the appropriate bin (alpha, beta,
stable).  This is well understood.  However, this isn't as easy as we
initially expected.  (more below)
* There are apps which have wanted pamphlets, slide decks and the like.
If that's not your cup of tea, don't do them.  ASVS has them.  I'm
pretty sure ESAPI has them.  The Live CD doesn't.  The trick is to find
the balance between what is needed and what's the 'extra mile' that some
projects find useful.  Feedback like this helps find that line in the
sand, that balance.

> * Chapter leaders do not want to go to their own folks and ask for
> donations; people that they have been together with from the beginning
> of their security careers
> And then just as I am about to give up on committees and boards and
> members and leaders, I whiz through the testing guide v_22, page 888
> and I see a true gem; I download the latest version of orizon and
> notice that workaround that would have saved me in the last web
> application assessment.
One of the end goals of the GPC is to find those gems and figure out a
way to showcase them.  We're not there yet, but that's our direction.
BTW, totally agree that Orizon is an OWASP gem.  I've made a minor
contribution to that project.  I wish Paolo and the rest of the
contributors the best of luck.

> Is it too much to ask for, cutting through all of this and focusing on
> that magic phrase, web application security?
> You want a marketing department? Go hire one! The time that it takes
> me to add double encoding payloads for sharepoint into JBroFuzz is the
> time wasted on self assessment criteria. Project leader's ego aside,
> which one is better?
I agree that OWASP _needs_ proper marketing.  This is not an area where
technical security joes tend to shine.  Hiring one does take money and
somebody's time to manage that contract.  I think that is the sticking
point.

BTW, the pre-assessment checklist was designed to only take 5 to 15
minutes.  If that part is taking longer, let me know and we can work on
making that easier.
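(For context on why double-encoded payloads like the ones mentioned above are
worth fuzzing: a filter that URL-decodes input once can miss a payload that was
encoded twice.  A minimal illustrative sketch - not SharePoint-specific, and
not code from any OWASP project:)

```python
import urllib.parse

# Double URL-encoded "<script>": "%25" decodes to "%", so a single
# round of decoding still leaves an encoded payload behind.
payload = "%253Cscript%253E"

first_pass = urllib.parse.unquote(payload)      # yields "%3Cscript%3E"
second_pass = urllib.parse.unquote(first_pass)  # yields "<script>"

# A filter that decodes once sees no "<" and lets the input through;
# a downstream component that decodes again reconstructs the tag.
assert "<" not in first_pass
assert second_pass == "<script>"
```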

> And whatever happened to being humble and modest if you are good at
> what you do, especially in information security.. Blow your own
> trumpet, if you've got something to say, not stale news please.
> Yes, continue to evolve and expand OWASP, do make us all proud, but
> setup some ground rules to address and harvest knowledge coming in
> from the ground. More importantly, get rid of all these silly silly
> red tape equivalents. Do not establish anything new (e.g. committees)
> without rules on how somebody will lose their status.
I'm sorry you got mired in the newness of criteria v2 - I agree bits of
it need to be optional, better communicated, automated, etc.  Step back
for a moment and consider where we started in 2009.  We had a grab bag of
projects on the projects page (~120) in various stages of completion,
quality, etc.  GPC is trying to gather enough data (really project
meta-data) to intelligently sort those into meaningful categories.

No, we're definitely NOT SourceForge, with popularity, number of
downloads, and all the other data points you get auto-magically with
SourceForge.  But we'll never get there if we don't start _somewhere_.
The GPC is currently in the Crawl stage of Crawl, Walk, Run.  I'd ask
that you give us a bit of time to catch our stride.

I would like to point out that I'm very happy with the fact that our
initial survey of projects helped us find several which were orphaned or
no longer maintained.  Our ability to note those projects has created the
opportunity for new project leads to step up and take over those
projects.  It's a small victory, but hopefully an example of the goodness
the GPC is trying to bring to OWASP projects.

> And then comes the ultimate excuse, "it was out there for all to
> comment while we were setting up X". But how can I even comment, when
> your definition of X is ill-defined? When you didn't listen to the
> problems that its predecessor Y created. If you look at the
> power/responsibility ratio in other open source communities (say the
> linux kernel) mistakes are guaranteed not to be repeated again. Still
> in OWASP, JBroFuzz, still filling in forms, still not release quality.
> Paulo is promising that this will be the last time. What was another
> true gem that came my way, along the lines of, "we simply don't know
> what version your tool is, you need to tell us". Sincerely, if the
> about box is not enough? Go google it!
Some of this is a chicken and egg problem.  I wrote criteria v2, we got
a bit of feedback and a lot more once we started using it.  As we get
feedback, we refine the process.  I just don't know of a way to force
feedback in a volunteer organization.  Sometimes you just have to make
a judgment call, implement it, see what happens and make adjustments.
That's what we're doing.

About knowing your version number, etc.  Put yourself in Paulo's shoes
for a minute.  I bet if I found you stumbling drunk, you could still
tell me the latest version number of JBroFuzz.  Paulo has no idea.  If
he has to Google 120+ projects, he burns a lot of time not fixing other
issues for OWASP.

> It seems to me a couple of years down the line, it was the tip of the
> iceberg trying to get a simple, silly fuzzer to release quality level;
> in understanding the real OWASP and seeing how many others, globally,
> from founder equivalent level to the non-member level feel partially
> similar. Any chance of a change?
Feedback provides the mechanism for change.  

> Here are a few suggested (perhaps aggressive) paths:
> * Get the board (someone has to take the heat) to go through the tools
> one fine Saturday and decide on the release quality of each one. I'll
> buy the pizzas guys! Repeat after 3 months, assign Paulo to speak
> their voice

Perhaps there is merit in your idea of pizza + one Saturday.  However, I'm
not sure the effort of explaining why project X is beta and project Y is
stable after that pizza Saturday is over is less than asking project
leads to get two peers to review their project.

> * Get chapter leaders to (mandatory) go through the presentation of
> any speaker and make them take out corporate pitches (even hints)
Interesting.  This is standard practice for our chapter.  Not sure how
we enforce this on volunteers though.

> * Like the HSBC ads that I see in terminals around the world, respect
> local customs and traditions in asking chapter leaders to establish a
> unified policy (especially on money matters)
This is the domain of the Chapters committee.

> * Kick the folks that don't do the work, out! Give them a second
> chance, etc. But measure on results.
Before we can measure, we need a yardstick.  As far as projects are
concerned, we're trying to make one with criteria v2.

If you don't like how we've made that metric, please continue to provide
feedback like this.

-- Matt Tesauro
OWASP Live CD Project Lead
http://AppSecLive.org - Community and Download site
> a tiny bit fed up Yiannis
> _______________________________________________
> OWASP-Leaders mailing list
> OWASP-Leaders at lists.owasp.org
> https://lists.owasp.org/mailman/listinfo/owasp-leaders