[Owasp-webcert] OWASP Evaluation and Certification Criteria

Andre Gironda andre at operations.net
Tue Jul 31 11:29:58 EDT 2007

On 7/30/07, Mark Curphey <mark at curphey.com> wrote:
> As promised today's draft. This draft contains most of Part 1 or the "soft
> concepts" around a good evaluation and certification scheme.
> Tomorrow and Weds I plan to put the meat on the technology section. I have
> already put in  a lot of thought and done quite a lot of work on this
> although that is not yet reflected in this particular draft.

Besides some grammatical problems (particularly on page 5) and a few
too many spaces on pages 8 and 10 - let me get into the things I really like
about the document and where it can be improved further.

My first favorite part (besides the introduction, which is fantastic so far)
is the Basic vs. Extended Criteria.  Before I even got to this section, I
was thinking about maturity levels of organizations.

CMM has been around for quite some time, and I like how Gartner created a
similar model for Enterprises, labeling it the "Infrastructure Maturity
Model".  While the concept of your Basic vs. Extended criteria is great - why
not apply the IT operations (ITO) side of the business to the IMM and the
SDLC side of the business to the CMM?  There can be maximum levels of
assurance applied to each, and each may have distinctly separate goals for
PPT (People, Process, Technology).  In summary, I think each CMM+IMM
combination will have a completely different certification/evaluation-criteria
profile depending on the stage the organization is at with its ITO and
development levels of maturity.

Which brings me to my next point.  I'd like to see more separation here as
well.  I've always seen an organization's technology resources as the
following kinds of assets: infrastructural capital
<http://en.wikipedia.org/wiki/Infrastructural_capital> (open-source software
used + purchased software/hardware, and the processes used to
create/run/maintain those assets), social capital
<http://en.wikipedia.org/wiki/Social_capital> (your organization as a team,
your board, who you know, who they know, who your partners are), individual
capital <http://en.wikipedia.org/wiki/Individual_capital> (the talent that
you directly employ, or possibly in the form of contractors), and
instructional capital <http://en.wikipedia.org/wiki/Instructional_capital>
(the training and documentation in place to increase individual capital).
So in a way, I consider technology and process almost the same thing in some
cases - while the other "people" parts create/influence each other
(instructional capital increases the ability of individual capital to
increase social capital).

The maturity models and capital models directly relate to many controls that
specific audits employ today.  You see companies who are Question Marks (in
growth-share matrix terms) looking for venture capital to increase their
resources.  For those companies, a SAS 70 Type 1 audit is the normal first
step.  Once the QM has secured the capital and begins to move in the
direction of a Star, it realizes the need to go public - and once it does,
it is subject to SOX.  As Stars burn out they become Cash Cows, which are
focused on a certain business type that will probably be around forever in
that individual vertical.  These verticals typically have their own
compliance laws and regulations, such as GLBA, HIPAA, et al.

On page 13, I first see a separation of roles/responsibilities for the meat
of the OWASP certification.  You list Technology Architects, Code Reviewers,
and Pen-Testers (and also leaders in the People and Process spaces).  In a
CMM system, I would rather see these defined as threat-modelers (instead of
Architects), software testers (instead of Pen-Testers), and software
inspectors (instead of Code Reviewers).  My reasoning is that architects are
usually focused on building a product, while threat-modelers are focused on
breaking it apart in the design phase.  Pen-testers are focused on breaking
into applications after the fact.  That is not good enough for me - I want
to see continuous testing for security during four phases:

1) unit testing, which should be continuous and inside the IDE
<http://ct-eclipse.tigris.org/>;
2) component testing, which should happen at every build - at least in
mock object <http://mockobjects.com/> form (think: DbUnit);
3) system integration testing, which provides a live environment with no
mock testing, using a protocol driver for each section of the application
(e.g. a JDBC spy and an HTTP fault-injector); and
4) functional testing, which includes full client-side testing using an
application driver for each client-side technology employed.

In other words, not Pen-Testers, but QA testers.  Finally, the code reviewer
should be a software inspector - and inspection should also be a continuous
process, in the IDE whenever possible (e.g. JDT rules, PMD, and FindBugs).
The code inspection should also be done manually, but I'm not sure of the
benefit of manual control-flow or data-flow analysis when we do have quality
tools available, even if they don't cost $2k per developer like Fortify SCA.
Fagan inspection <http://en.wikipedia.org/wiki/Fagan_inspection> and
recursive grep techniques based on smart cheat sheets of common coding
errors should certainly be used.  The ability to reduce code complexity
(e.g. as measured by NCSS) and report on code coverage (lots to be said
here, but I'm already pushing this paragraph too far) is the primary goal
for the code inspector.  The process person should be certified on the
capability to properly build and release applications through continuous
integration models that include security.
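To make phase 2 concrete, here is a minimal sketch of what a mock-object
component test might look like.  All class and method names below are my own
hypothetical illustrations, not anything from the draft - the point is only
that the real JDBC-backed component is swapped for a mock at build time:

```java
// Hypothetical sketch: a component test using a hand-rolled mock object
// instead of a live database, in the DbUnit/mock-object style.
interface UserDao {
    boolean userExists(String username);
}

// Mock standing in for the real JDBC-backed DAO at build time.
class MockUserDao implements UserDao {
    java.util.Set<String> users = new java.util.HashSet<>();
    public boolean userExists(String username) {
        return users.contains(username);
    }
}

public class LoginServiceTest {
    // Component under test: only known, non-null users may log in.
    static boolean canLogIn(UserDao dao, String username) {
        return username != null && dao.userExists(username);
    }

    public static void main(String[] args) {
        MockUserDao dao = new MockUserDao();
        dao.users.add("alice");
        if (!canLogIn(dao, "alice"))  throw new AssertionError("known user rejected");
        if (canLogIn(dao, "mallory")) throw new AssertionError("unknown user accepted");
        if (canLogIn(dao, null))      throw new AssertionError("null user accepted");
        System.out.println("component tests passed");
    }
}
```

Because nothing here touches a real database, this can run at every build, which
is exactly what makes the mock-object form suitable for continuous testing.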

I guess what you can take out of the above paragraph is that I don't feel
penetration testing is a necessary part of every process.  I'd rather see
this sort of "negative testing" (although if we call it negative testing,
shouldn't it really be positive testing - because the threat-modeler put
requirements such as validators and other controls into the development
process?) done mostly at the system integration and functional testing
phases of software testing.  Then the results can be reported later during
inspection (a sort of check and balance).  By doing the testing before the
inspection, you preserve the ability to structure the release of software
around properly tested code, as well as to report on the findings involved.
There's a lot written on continuous testing and inspection that I think even
exceeds (or maybe augments?) the work that OWASP has done so far in the
Guide, and that Microsoft has done in their SDL.
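As a concrete (and entirely hypothetical) illustration of that "positive
testing" idea - verifying that a validator the threat-modeler required
actually exists and rejects hostile input, rather than pen-testing after the
fact - something like this could run at every build.  The whitelist rule and
all names are my own assumptions, not from the draft:

```java
import java.util.regex.Pattern;

// Hypothetical sketch: positive testing of a security requirement
// (a whitelist input validator) placed into the design by the threat-modeler.
public class ValidatorTest {
    // Assumed requirement: usernames are 1-32 alphanumeric characters.
    static final Pattern USERNAME = Pattern.compile("^[A-Za-z0-9]{1,32}$");

    static boolean isValidUsername(String s) {
        return s != null && USERNAME.matcher(s).matches();
    }

    public static void main(String[] args) {
        // Positive case: the control accepts what the requirement allows...
        if (!isValidUsername("alice01"))
            throw new AssertionError("valid name rejected");
        // ...and rejects the hostile inputs the threat model anticipated.
        if (isValidUsername("' OR 1=1 --"))
            throw new AssertionError("injection-style input accepted");
        if (isValidUsername("<script>"))
            throw new AssertionError("markup accepted");
        if (isValidUsername(""))
            throw new AssertionError("empty input accepted");
        System.out.println("validator tests passed");
    }
}
```

Tests like this assert that the required control is present and effective,
which is what lets the later inspection act as a check and balance rather
than a first line of defense.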
It appears as if you're already working in that direction, as I can see on
page 16, but I thought I'd review all of the points before you fill in all
the details.  Hoping to see more of this later, when I can give even more
feedback.
