[Owasp-webcert] OWASP Evaluation and Certification Criteria

Mark Curphey mark at curphey.com
Tue Jul 31 11:56:24 EDT 2007



This is all great feedback. Thanks.


On most of it (and this genuinely is not a cop-out), some of your comments and
thoughts will be taken care of naturally as the rest takes shape. The pen-test
part will be linked to assurance levels (see the earlier Wysopal post), and the
mantra of testing is definitely in there; this is not just another pen-test
crusade. I of course acknowledge that some things can be found effectively with
pen testing (and even with automated tools), while others will clearly require
manual inspection to obtain any degree of assurance. I will try to make sure
there is an appropriate balance.


I need to pitch this whole doc at a point where we can finish revision one, so
I don't propose to detail the exact method for testing every issue, but rather
to show that different techniques can provide different levels of assurance,
and to give some cases illustrating what they are.


I don't disagree with your main point, but I do think we must also balance this
whole project with simplicity. If we expect people to buy into a big-picture
concept around IT and then sell them on the security that goes with it, it may
be an uphill struggle. Let me chew it all over for a day or so.


Maybe one way to balance something people can buy into versus "the full deal"
is to focus on continuous testing within the process, which is what I think you
are saying anyway. But again, let me chew on it.


Definite points noted on the roles. I will make those changes before the next
draft goes out. That will now be tomorrow, as sending it out today might
confuse people.


Again, thanks for the feedback. It is very much appreciated and will all be
taken into account and will influence the final draft.


From: andreg at gmail.com [mailto:andreg at gmail.com] On Behalf Of Andre Gironda
Sent: Tuesday, July 31, 2007 5:30 PM
To: owasp-webcert at lists.owasp.org; Mark Curphey
Subject: Re: [Owasp-webcert] OWASP Evaluation and Certification Criteria


On 7/30/07, Mark Curphey <mark at curphey.com> wrote:

As promised today's draft. This draft contains most of Part 1 or the "soft
concepts" around a good evaluation and certification scheme.

Tomorrow and Wednesday I plan to put the meat on the technology section. I have
already put a lot of thought and quite a lot of work into this, although that
is not yet reflected in this particular draft.

Besides some of the grammatical problems (particularly on page 5) and a few too
many spaces on pages 8 and 10, let me get into the things I really like about
the document and where it can be improved further.

My first favorite part (besides the introduction, which is fantastic so far)
is the Basic vs. Extended Criteria.  Before I even got to this section, I
was thinking about maturity levels of organizations.

CMM has been around for quite some time, and I like how Gartner created a
similar model for Enterprises, labeling it the "Infrastructure Maturity
Model".  While the concept of your Basic vs. Extended is great - why not
apply the ITO side of the business to the IMM and the SDLC side of the
business to the CMM?  There can be maximum levels of assurance applied to
each, and each may have distinctly separate goals for PPT (People, Process,
Technology).  In summary, I think each CMM+IMM combination will have a
completely different certification/evaluation-criteria profile, depending on
the stage an organization is at with its ITO and development levels of
maturity.

Which brings me to my next point.  I'd like to see more separation here as
well.  I've always seen an organization's technology resources as comprising
the following: infrastructural capital
<http://en.wikipedia.org/wiki/Infrastructural_capital> (open-source software
used plus purchased software/hardware, and the processes used to
create/run/maintain those assets), social capital
<http://en.wikipedia.org/wiki/Social_capital> (your organization as a team,
your board, who you know, who they know, who your partners are), individual
capital <http://en.wikipedia.org/wiki/Individual_capital> (the talent you
directly employ, possibly in the form of contractors), and instructional
capital <http://en.wikipedia.org/wiki/Instructional_capital> (the processes in
place to increase individual capital).  So in a way I consider technology and
process almost the same thing in some cases, while the other "people" parts
create/influence each other (instructional capital increases the ability of
individual capital to increase social capital).

The maturity models and capital models directly relate to many controls that
specific audits employ today.  You see companies that are Question Marks
<http://en.wikipedia.org/wiki/BCG_growth-share_matrix> applying for venture
capital to increase their resources.  For those companies, a SAS 70 Type 1
audit is the normal first step.  Once the QM has secured the capital and begins
to move in the direction of a Star, it realizes the need to go public, and once
it does, it is subject to SOX.  As Stars burn out they become Cash Cows, which
are focused on a certain business type that will probably be around forever in
that vertical.  These verticals typically have their own compliance laws and
regulations, such as GLBA, HIPAA, et al.

On page 13, I first see a separation of roles/responsibilities for the meat of
the OWASP certification.  You list Technology Architects, Code Reviewers, and
Pen-Testers (and also leaders in the People and Process spaces).  In a CMM
system, I would rather see these defined as threat-modelers (instead of
Architects), software testers (instead of Pen-Testers), and software inspectors
(instead of Code Reviewers).  The reasoning behind this, for me, is that
architects are usually focused on building a product, while threat-modelers are
focused on breaking it apart in the design phase.  Pen-testers are focused on
breaking into applications after the fact.  This is not good enough for me; I
want to see continuous testing for security during four phases: 1) unit
testing, which should be continuous and inside <http://ct-eclipse.tigris.org/>
the IDE; 2) component testing, which should happen at every build, at least in
mock object <http://mockobjects.com/> form (think: DbUnit); 3) system
integration testing, which provides a live environment with no mock testing,
using a protocol driver for each section of the application (e.g. a JDBC spy
and an HTTP fault-injector); and 4) functional testing, which includes full
client-side testing using an application driver for each client-side technology
employed.  In other words, not Pen-Testers but QA testers.  Finally, the code
reviewer should be a software inspector, which should also be a continuous
process, in the IDE whenever possible (e.g. JDT rules, PMD, and FindBugs).
Code inspection should also be done manually, but I'm not sure of the benefit
of manual CFA or DFA when we do have quality tools available, even if they
don't cost $2k per developer like Fortify SCA.  Fagan inspection
<http://en.wikipedia.org/wiki/Fagan_inspection> and recursive grep techniques
based on smart cheat sheets that target coding errors should certainly be used.
The ability to reduce code complexity (NCSS) and report on code coverage (lots
to be said here, but I'm already pushing this paragraph too far) is the primary
goal for the code inspector.  The process person should be certified on the
capability to properly build release applications through continuous
integration models that include
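To make the "recursive grep techniques based on smart cheat sheets" idea
concrete, here is a minimal sketch of such an inspection pass. The rule names,
regexes, and sample code are entirely illustrative assumptions on my part, not
anything from the OWASP draft; a real cheat sheet would carry far more rules
and language-specific tuning:

```python
import re

# Hypothetical "cheat sheet": regexes flagging coding errors that deserve a
# manual look. These three rules are illustrative, not a complete rule set.
CHEAT_SHEET = {
    "sql-concatenation": re.compile(r"executeQuery\s*\(\s*\".*\"\s*\+"),
    "hardcoded-password": re.compile(r"password\s*=\s*\"[^\"]+\"", re.IGNORECASE),
    "generic-catch": re.compile(r"catch\s*\(\s*Exception\b"),
}

def inspect(source: str):
    """Return (line_number, rule_name, line_text) hits for one file's text."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule, pattern in CHEAT_SHEET.items():
            if pattern.search(line):
                hits.append((lineno, rule, line.strip()))
    return hits

# Hypothetical Java snippet with one hit per rule above.
sample = '''Statement s = conn.createStatement();
ResultSet r = s.executeQuery("SELECT * FROM users WHERE id=" + userId);
String password = "hunter2";
try { run(); } catch (Exception e) { }
'''

for lineno, rule, text in inspect(sample):
    print(f"line {lineno}: [{rule}] {text}")
```

The point of the sketch is that this kind of pass is cheap enough to run
continuously, inside the IDE or at every build, which is exactly where the
paragraph above wants the software inspector to live.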

I guess what you can take out of the above paragraph is that I don't feel
penetration testing is a necessary part of every process.  I'd rather see this
sort of "negative testing" (although if we call it negative testing, shouldn't
it be positive testing instead, because the threat-modeler put requirements
such as validators and other controls into the development process?) done
mostly in the system integration testing and functional testing phases of
software testing.  These results can then be reported later during inspection
(a sort of check and balance).  Doing the testing after the inspection would
prevent you from structuring the release of software around properly tested
code, as well as from reporting on the findings involved.  There's a lot
written on continuous testing and inspection that I think even exceeds (or
maybe augments?) the work that OWASP has done so far in the Guide, and that
Microsoft has done in their
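A small sketch of the "positive testing" framing above: instead of probing an
application after the fact, assert that a control the threat-modeler required
behaves as specified. The validator, its whitelist, and the payloads here are
my own hypothetical examples, not anything prescribed by the draft:

```python
import re

# Hypothetical threat-model requirement: usernames must match a whitelist.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{1,32}$")

def validate_username(value: str) -> bool:
    """The control the threat modeler required: a whitelist validator."""
    return bool(USERNAME_RE.fullmatch(value))

# Positive tests: verify the specified control, rather than hunting for
# holes the way a pen test would. Payloads below are illustrative.
ACCEPTED = ["alice", "bob_99"]
REJECTED = ["' OR '1'='1", "<script>alert(1)</script>", "a" * 33, ""]

results = {value: validate_username(value) for value in ACCEPTED + REJECTED}
for value, ok in results.items():
    print(f"{value!r}: {'accepted' if ok else 'rejected'}")
```

Checks like these slot naturally into the system integration and functional
testing phases, since they express the threat-model requirement as an ordinary
regression test.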

It appears you're already working in that direction, as I can see on page 16,
but I thought I'd review all of the points before you fill in the details.
I hope to see more of this later, when I can give even more feedback.

