[Owasp-webcert] Assurance Levels

Mark Curphey mark.curphey at sourceclear.com
Tue Jun 12 12:11:15 EDT 2007


Thanks Chris, Jeff and Ed. I have read all the responses and am digesting them slowly.
 
I think it's fair to say the following:
 
1. Everyone agrees that the concept of assurance levels will be a powerful tool
2. Mapping tools, techniques, etc. to those levels will be hard
 
Note: the criteria will extend far beyond the OWASP Top 10, which is just an awareness document.
 
I am in London now, but let me send out a strawman of how this could work by the middle of next week, and we can delve into the details.
 
I suspect that, as with building software, different ways of finding vulnerabilities may be cheaper, faster, or better, but probably not all three at once!

________________________________

From: Bellis, Ed [mailto:Ed.Bellis at orbitz.com]
Sent: Tue 6/12/2007 8:44 AM
To: Chris Wysopal; Mark Curphey; owasp-webcert at lists.owasp.org
Subject: RE: [Owasp-webcert] Assurance Levels

I think this makes a lot of sense and should likely be the way we look at incorporating assurance levels into the webcert standard. As discussed previously, this addresses the "bubble" thinking of the current PCI standard. By taking into account the vulnerability class and the threat that is being mitigated, you'll be able to pick the "right control for the job" to achieve the assurance level required for the application and the organization.

This is great work, and of course, the devil will always be in the details. Is there an existing matrix that maps vulnerability classes to analysis techniques (the OWASP Top 10 at a minimum)? I presume this would be something this group will need to come up with. Chris, do you have a head start in this area?

-Ed

________________________________

From: owasp-webcert-bounces at lists.owasp.org [mailto:owasp-webcert-bounces at lists.owasp.org] On Behalf Of Chris Wysopal
Sent: Monday, June 11, 2007 9:55 AM
To: Mark Curphey; owasp-webcert at lists.owasp.org
Subject: Re: [Owasp-webcert] Assurance Levels

I should have supplied more explanation behind the graphic that I sent to Mark, because it is a bit more complex than a simple ordering where automated static beats a questionnaire, automated dynamic beats automated static, and so on. Each analysis technique has a subset of all vulnerability classes that it can even attempt to detect. Try finding an authorization bypass with an automated tool, for instance. More generally, each analysis technique has a false negative rate for each vulnerability class. Measuring the false negative rates by vulnerability class for each analysis technique is a significant effort, but a valuable one. At Veracode we are "analysis technique neutral": we want to combine multiple techniques to give the best possible analysis for the amount of money appropriate for the assurance requirements. Security must make economic sense, after all.
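
To make that concrete, here is a minimal sketch of what such a false-negative-rate matrix might look like. Every class name and number here is an illustrative assumption for discussion, not measured Veracode data:

    # Illustrative false negative (FN) rates by vulnerability class and
    # analysis technique. All numbers are assumptions for discussion;
    # real values would have to be measured per tool and per tester.
    fn_rates = {
        "sql_injection": {"static": 0.20, "dynamic": 0.30, "manual": 0.05},
        "xss":           {"static": 0.25, "dynamic": 0.20, "manual": 0.05},
        "authz_bypass":  {"static": 0.95, "dynamic": 0.90, "manual": 0.05},
    }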

So back to the complexity. The idea behind measuring the capabilities of different analysis techniques is to ensure that the false negative rate for all the vulnerability classes you care about is low enough for the assurance level the application requires. So, for example, let's take the OWASP Top Ten as the vulnerability classes we care about for a certain application. If the application is high assurance, such as an online banking application, we have to make sure the false negative rate for all of these vulnerability classes is close to zero. A fully manual effort is not the most cost-effective approach, since manual testing is the most expensive. If we combine automated static and automated dynamic analysis, and then add manual testing for the classes that the first two techniques can't detect or detect with unacceptable false negative rates, then we can get to an acceptable FN rate across the complete OWASP Top Ten.
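
Continuing the fn_rates sketch above, the combination logic might look something like this. The big assumption is that techniques miss vulnerabilities independently, so the combined FN rate for a class is the product of the individual rates; in practice tools often share blind spots, so treat this as an optimistic bound:

    # Sketch of combining techniques, reusing the illustrative fn_rates
    # table above. Assumes (strongly) that techniques miss vulnerabilities
    # independently, so the combined FN rate for a class is the product
    # of the individual FN rates.
    def combined_fn(rates, techniques):
        fn = 1.0
        for t in techniques:
            fn *= rates[t]
        return fn

    TARGET = 0.05  # assumed FN threshold for a high-assurance application

    for vuln_class, rates in fn_rates.items():
        chosen = ["static", "dynamic"]
        if combined_fn(rates, chosen) > TARGET:
            chosen.append("manual")  # escalate only where automation falls short
        print(vuln_class, chosen, round(combined_fn(rates, chosen), 3))

With these made-up numbers, SQL injection and XSS are mostly covered by the automated pair, while authorization bypass only reaches the target once manual testing is added, which is exactly the "right control for the job" point.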

There are certainly things that automated static analysis can find better than manual pen testing. Integer overflows in C/C++ applications are a good example. So the choice of testing technique does depend on the vulnerability class you are looking for. It seems to me that manual pen testing is the best for many of the OWASP Top Ten, but automated dynamic and even automated static analysis do have a place in driving down costs for high-assurance applications. The automated techniques may also be good enough for medium-assurance applications, such as back-office applications that don't deal with high-value information.

-Chris

________________________________

From: owasp-webcert-bounces at lists.owasp.org [mailto:owasp-webcert-bounces at lists.owasp.org] On Behalf Of Mark Curphey
Sent: Monday, June 11, 2007 4:27 AM
To: owasp-webcert at lists.owasp.org
Subject: [Owasp-webcert] Assurance Levels

I propose to make assurance levels an integral part of the OWASP Web Certification Criteria and want your feedback on the concept.

In many ways it's one of those things that's so damn obvious when you see it described with clarity. Enlightenment came for me when Chris Wysopal <http://www.veracode.com/management-team.php#Wysopal> sent me the fantastic graphic attached describing Veracode's <http://www.veracode.com/> view of assurance levels. Of course it is nothing new; the basic concept of assurance (confidence) is as follows:

	Different testing techniques provide different levels of assurance (confidence) on claims about the security of a web site. 

An automated static analysis tool will provide a lower level of assurance than an automated dynamic analysis tool, which will in turn provide a lower level of assurance than a comprehensive manual code review. It also follows that an automated web application penetration test will provide a lower level of assurance than a manual penetration test. Both types of penetration testing will provide lower levels of assurance than code reviews. It also makes sense that if a company has demonstrated that security is an integral part of the DNA of their SDLC (define, design, develop, deploy and maintain), then there is a higher level of assurance that any test results will be consistent in the future.
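
To pin that ordering down, here is one way it could be written out. This is purely an illustrative sketch; the names and pairs are my reading of the paragraph above, not an agreed OWASP scale:

    # Illustrative encoding of the assurance ordering described above, as
    # (lower_assurance, higher_assurance) pairs. The names are assumptions.
    LOWER_ASSURANCE_THAN = [
        ("automated_static",   "automated_dynamic"),
        ("automated_dynamic",  "manual_code_review"),
        ("automated_pen_test", "manual_pen_test"),
        ("automated_pen_test", "manual_code_review"),
        ("manual_pen_test",    "manual_code_review"),
    ]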

So why wouldn't everyone just go for the approach that provides the highest level of assurance? It's very simple: cost. The appropriate level of assurance should be based on risk.

Of course, all of these things have a butterfly effect; no two tools are the same and no two testers are the same. Imagine a control panel with multiple dials, but where people want a single output display (not necessarily a single reading). I expect lots of people will argue that a specific tool or firm is as good as the next level up on the assurance scale, but we'll deal with that as well.

This also enables us to define what a web app firewall is good for and what it isn't, and to place it into an assurance-level bucket. More on that in a while.

By incorporating assurance levels into the criteria, industry sectors, business partners or regulators can require a level of security with an assurance level based on risk. This would be a significant step forward from where we are today with broken schemes like PCI DSS <http://securitybuddha.com/2007/03/23/the-problems-with-the-pci-data-security-standard-part-1/>.

So what do y'all think?
