[OWASP-TESTING] Ideas for WAVA Testing Overview & Procedures Checklist documents

Stephen Venter stephen.venter at gmail.com
Mon Apr 18 16:32:39 EDT 2005


Hi Jeff

I don't mean my ideas to be taken as prescriptive in terms of which
testing approaches or tools you choose, or how you choose to perform
or use them.

I fully agree with you. When it comes down to it, it is our duty as
professionals in the field to choose the most efficient and effective
techniques and technologies to achieve those goals.

I certainly would never encourage anyone to believe that there's only
one way to tackle the problem of trying to identify and eradicate
vulnerabilities.

My main aim is to get people to think of web application security more
as a "three dimensional" problem, instead of just a "two dimensional"
one.

Perhaps the use of any term containing the word "box" is NOT even
appropriate here for a WAVA. Perhaps my ideas would be better
summarised as:

•	The perspectives you SHOULD consider adopting when you plan to do a
vulnerability assessment:
o	Unauthorised, anonymous user
o	Authorised normal user
o	Auditor / Super user  - Full access to all source code, systems,
documentation, policies, procedures, etc

•	And the security assessment phases to be applied to whichever (or
ALL) of those approaches you adopt in testing your system:

o	Planning
o	Information gathering
o	Target identification
o	Service enumeration & service information mining (see the sketch below this list)
o	Automated testing procedures
o	Manual testing procedures
o	Feedback / report writing
o	(Others? Different wording for these?)
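
To make the service information mining idea a bit more concrete in a
web context, here is a minimal sketch (Python, using the third-party
requests library; the target URL is purely hypothetical) of the kind
of header harvesting I mean:

import requests  # assumes the third-party "requests" library is available

TARGET = "https://example.com/"  # hypothetical target URL

resp = requests.get(TARGET, timeout=10)
print("Status:", resp.status_code)

# Note what the server volunteers about itself in its response headers.
for header in ("Server", "X-Powered-By", "Set-Cookie"):
    if header in resp.headers:
        print(f"{header}: {resp.headers[header]}")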

None of this should be performed in isolation.  Information gained
from one testing approach / perspective SHOULD feed into the other
perspectives to ensure that all the angles are covered.

Regards
Steve

On 4/18/05, Jeff Williams <jeff.williams at owasp.org> wrote:
> I like the concept of a WAVA, as it does get away from the techniques that
> get used.  But I don't want to encourage anyone to think that a black box
> penetration test is a good way to get secure.  You don't learn anything
> useful from this kind of security testing.  And I don't think it's a good
> idea to promote the idea that you go from black-box to white-box.
> 
> I'd like to see us go in a direction where the testing guide encourages the
> use of the most cost-effective technique for detecting each type of problem
> (taking into account the particulars of the application being analyzed).
> 
> So, for example, if I think the most cost-effective way to find SQL
> injection in a particular application is to use static analysis, I break out my
> favorite tool and find all the database accesses.  If there's one that looks
> like a hole, I might use WebScarab to demonstrate it to the customer (lots
> of value in this).  On that same WAVA, I might also use scanners and some
> manual code review to find problems.  I use the most cost-effective
> technique where it makes sense for the current application.
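
Just to illustrate the "find all the database accesses" idea: even a
crude sketch like the following (Python; the file extensions and regex
are hypothetical, and a real static analysis tool does far more than
pattern matching) is often enough to build a worklist of suspect lines
for manual review:

import os
import re
import sys

# Flag lines that appear to build SQL from concatenated strings.
# Hypothetical pattern - a starting point for manual review, nothing more.
SQL_CONCAT = re.compile(r'\b(select|insert|update|delete)\b.*["\']\s*\+', re.IGNORECASE)

root = sys.argv[1] if len(sys.argv) > 1 else "."
for dirpath, _, filenames in os.walk(root):
    for name in filenames:
        if not name.endswith((".java", ".jsp", ".php", ".py")):
            continue
        path = os.path.join(dirpath, name)
        with open(path, errors="ignore") as handle:
            for lineno, line in enumerate(handle, 1):
                if SQL_CONCAT.search(line):
                    print(f"{path}:{lineno}: {line.strip()}")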
> 
> That's what makes sense to me.
> 
> --Jeff
> 
> ----- Original Message -----
> From: "Stephen Venter" <stephen.venter at gmail.com>
> To: <owasp-testing at lists.sourceforge.net>
> Sent: Monday, April 18, 2005 1:11 PM
> Subject: [OWASP-TESTING] Ideas for WAVA Testing Overview & Procedures
> Checklist documents
> 
> > Hi all
> >
> > I'd like to put forward another concept for consideration here. As
> > most people have been mentioning, there are two different perspectives
> > that you (as the application tester) normally adopt when you perform
> > your testing activities against the target system: black-box and
> > white-box.
> >
> > However, I feel there is a need to consider that perhaps these
> > perspectives (coined when security testing was all about
> > infrastructure reviewing / pentesting) don't adequately reflect /
> > encompass the approaches adopted during Web Application security
> > reviewing (and here's another phrase that a previous colleague of mine
> > came up with which I feel might be more appropriate, to help get away
> > from the word "pentest": Web Application Vulnerability Assessment, or
> > WAVA).
> >
> > My suggestion is that we consider defining an additional perspective:
> > "Translucent box" [or whatever name might be more appropriate here – I
> > was just trying to find one that sort of conveyed the concept of
> > fitting in-between black & white], and that some or all of the actual
> > testing techniques might be appropriate (and some not) depending on
> > which point of view you are approaching the application from – and of
> > course, some testing procedures may need to be performed differently
> > depending on this perspective.
> >
> > These testing perspective definitions would be best suited to
> > inclusion in the OWASPTesting_PhaseOne document (although I would
> > like to offer an alternative name for that too: something like the
> > "Testing Overview" or "approach" or "guideline" document, because, as
> > far as I see it, it actually covers a number of phases, not just one).
> > The Web App Testing Checklist table could then be expanded to include
> > a column noting, for each test, the testing phases from the Overview
> > document where it is appropriate or should be customised, etc. The
> > appendices to the checklist could then be the place to include
> > specific examples of how to carry out those tests and what the
> > expected results are, etc.
> >
> > I feel it would be appropriate to define the testing perspectives
> > something like:
> > 1. Black box - from the perspective of an anonymous, unauthenticated
> > user, with minimal knowledge about the target system
> > 2. Translucent box [or whatever name might be more appropriate here –
> > I was just trying to find one that sort of conveyed the concept of
> > fitting in-between black & white] – from the perspective of a "normal"
> > application user, where they have a login account (through whatever
> > means a "normal" user of the system gets assigned an "authorised"
> > login to it)
> > 3. White box – from the perspective of a person with full system
> > access, and thus full access to the application code and servers, etc
> > [typically the level of access the system admin, application author,
> > system auditor might have]
> >
> > And then you can discuss how each of the testing procedures /
> > techniques / steps fit in with each of the above perspectives,
> > together with how they might or might not be appropriate in the
> > different testing perspectives, or might need to be modified to suit
> > the particular perspective.
> >
> > In each perspective there are a certain number of activities to be
> > performed, which fall into common categories like:
> > · Planning
> > · Information gathering
> > · Target identification
> > · Service enumeration & service information mining
> > · Automated testing procedures
> > · Manual testing procedures
> > · Feedback / report writing
> >
> > So, for example, in the traditional black box pentest, the pentester
> > might just be given a server name or IP address as the target (i.e.
> > the anonymous user approach, with minimal information up front), and
> > he will be required to go through the usual testing steps like:
> > · Planning: understanding the security significance of the target and
> > why someone might want to compromise it;
> > · Information gathering: querying public resources, whois records,
> > mailing lists, for information that may help, etc;
> > · Target identification: ping sweeps, ICMP vs TCP, etc
> > · Service enumeration (portscans) & service information mining
> > (fingerprinting, harvesting service version info from what is returned
> > when you connect to them, etc; there is a small sketch of this after
> > the list)
> > · Automated testing using vulnerability scanners
> > · Manual testing procedures (weed out false positives, perform
> > additional checks to complement the automated scanners, etc)
> > · Feedback / report writing
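
For the service enumeration and service information mining steps, a
very small sketch of the idea (pure Python sockets; the target address
and port shortlist are hypothetical, and this is no substitute for a
proper scanner):

import socket

TARGET = "192.0.2.10"          # hypothetical target (TEST-NET address)
PORTS = [21, 22, 25, 80, 443]  # hypothetical shortlist of ports to probe

for port in PORTS:
    try:
        with socket.create_connection((TARGET, port), timeout=3) as sock:
            print(f"{port}/tcp open")
            sock.settimeout(2)
            try:
                banner = sock.recv(256)  # many services announce themselves on connect
                if banner:
                    print("  banner:", banner.decode(errors="replace").strip())
            except socket.timeout:
                pass  # e.g. HTTP waits for a request before saying anything
    except OSError:
        print(f"{port}/tcp closed or filtered")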
> >
> > In an application security vulnerability assessment (WAVA) scenario,
> > the black box approach could be where the tester has been given the
> > specific URL or web server address only. Again the testing techniques
> > / steps / procedures might include:
> > · Planning: understanding the security significance of the target and
> > why someone might want to compromise it;
> > · Information gathering: querying public resources, connecting to the
> > server with a browser and reading what information is presented, etc;
> > · Target identification: HTTP vs HTTPS ports, SSL certificate
> > information, etc. A lot of the work here could be considered more
> > traditional infrastructure pentest work, and we need to be wary of
> > being drawn too far into it when the primary target is the web app,
> > not the infrastructure [and yes, of course the infrastructure needs
> > to be secure, which is what pure pentests are there for…]
> > · Service enumeration & service information mining – observing what
> > web application specific information is obtainable, noting the input
> > fields and variables, including hidden fields, client side scripting,
> > etc.
> > · Manual testing procedures – I'd say that an application tester is
> > likely to start with manual testing, and only get on to automated
> > testing when he has a better understanding of what functions are
> > performed by the application - testing areas like user authentication
> > processes (session manipulation / user escalation), input validation,
> > code injection, etc.
> > · Automated testing – like sampling session management values (e.g.
> > cookies) to assess them for predictability (there is a sketch of this
> > after the list), brute-force password guessing, code injection testing
> > of input fields (e.g. SQL injection), etc
> > · Feedback / report writing
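
A minimal sketch of the session ID sampling idea (Python with the
third-party requests library; the login URL and cookie name are
hypothetical, and a real assessment would apply proper statistical
analysis rather than eyeballing a short run):

import requests  # assumes the third-party "requests" library

LOGIN_URL = "https://example.com/login"  # hypothetical login page
SAMPLES = 20

# Collect a run of freshly issued session identifiers and look for obvious
# patterns: sequential values, embedded timestamps, short or low-entropy tokens.
for i in range(SAMPLES):
    resp = requests.get(LOGIN_URL, timeout=10)
    session_id = resp.cookies.get("JSESSIONID")  # hypothetical cookie name
    print(f"{i:02d}: {session_id}")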
> >
> > In a pentesting situation, translucent box testing might be testing
> > from the position of the DMZ or corporate network, where you have
> > access behind firewalls, etc. Not quite administrative access, but
> > closer to the level of access that a normal (a.k.a. "corporate") user
> > might have.
> >
> > For the application (WAVA) tester, translucent box testing can be
> > seen as the case where the tester is given example user accounts with
> > different levels of privilege (from the application's perspective,
> > not the operating system perspective!). Your manual testing
> > activities then take on a whole new angle: you attempt to see if you
> > can call admin-level menu functions while only logged in as a
> > low-privileged user, see how cross-site scripting might allow a
> > low-privileged user to escalate their privileges to those of the
> > admin-level user, or even to gain unauthorised access to records of
> > other low-privileged users, etc (there is a small sketch of this kind
> > of check after the list below). For the translucent box perspective
> > you can take the manual testing procedures phase (as just one of the
> > items on its own) and expand it into sub-categories, like:
> > · User Authentication and Authorisation mechanisms - test a user's
> > ability to perform unauthorised tasks such as: Access data or
> > resources that they are not entitled to; Escalate their current
> > privileges; Transact as another user; Access resources after logoff /
> > session timeouts; etc
> > · Access Protection controls - test access protection mechanisms over
> > system resources, e.g. testing if a user can: Gain unauthorised access
> > to system files, data or other resources; Circumvent the access
> > protection controls using atypical methods such as alternate character
> > sets, or URL munging / manipulation / fuzzing, etc.
> > · Data Validation - test the application for its vulnerability to data
> > injection and insertion techniques (buffer overflows, string
> > formatting problems, etc), i.e. test if the application: Correctly
> > parses any data inputted by the user; Performs adequate bounds
> > checking; Performs adequate validation of any data inputs; etc
> > · Session Management - assess whether a user can: Steal or hijack
> > the credentials or session of another user; Change their current
> > identity to that of another user (account or role "hopping");
> > Manipulate or falsify the session management controls for any other
> > purpose; etc
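
The sketch mentioned above: a minimal illustration (Python with the
requests library; the base URL, test credentials, login form and admin
paths are all hypothetical) of checking whether a low-privileged
session can reach admin-level functions:

import requests  # assumes the third-party "requests" library

BASE = "https://example.com"  # hypothetical application
LOW_PRIV_LOGIN = {"username": "alice", "password": "test-account-password"}  # hypothetical test account
ADMIN_PATHS = ["/admin/users", "/admin/export", "/reports/all"]  # observed or guessed admin functions

session = requests.Session()
session.post(f"{BASE}/login", data=LOW_PRIV_LOGIN, timeout=10)  # authenticate as the low-privileged user

# Can the low-privileged session call admin-level functions directly?
for path in ADMIN_PATHS:
    resp = session.get(f"{BASE}{path}", timeout=10, allow_redirects=False)
    print(f"{path}: HTTP {resp.status_code}")
    # Anything other than a 401/403 (or a redirect back to the login page)
    # deserves a closer manual look.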
> >
> > Also, while doing translucent box testing, the WAVA tester will gather
> > new information that would be useful to try from the black-box WAVA
> > perspective – like trying to call internal web app functions before
> > you are authenticated with the server (e.g. I have encountered
> > situations where simply connecting to the login page / function and
> > being issued with a session ID / cookie allowed me to call a function
> > that lists the transaction records – in other words, the authorisation
> > procedures were not being invoked when that function was called; all
> > it checked was that a session ID in the correct format was included
> > within the user's POST request…)
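
To make that concrete, a minimal sketch of the kind of check I mean
(Python with the requests library; the URLs and the parameter name are
hypothetical):

import requests  # assumes the third-party "requests" library

BASE = "https://example.com"  # hypothetical application
session = requests.Session()

# Step 1: merely touching the login page is often enough to be issued a session cookie.
session.get(f"{BASE}/login", timeout=10)
print("Cookies issued before authenticating:", session.cookies.get_dict())

# Step 2: call an internal function directly, without ever logging in.
resp = session.post(
    f"{BASE}/account/transactions",   # hypothetical internal function
    data={"account": "12345"},        # hypothetical parameter
    timeout=10,
    allow_redirects=False,
)
print("HTTP", resp.status_code, "-", len(resp.content), "bytes returned")
# If transaction data comes back, the application is only checking that the
# session ID has the right format, not that the session was ever authenticated.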
> >
> > And then you get on to the white-box testing.
> >
> > In the pentest situation, I'd be more inclined to call this the
> > infrastructure hardening review / audit – with admin level access you
> > review operating system settings and installed applications; IT
> > department organisation reviews; change control procedures; disaster
> > recovery planning; and all the other normal audit / compliance
> > objectives…
> >
> > The white-box testing in the WAVA situation could be where you include
> > categories of activities like code reviews / audits, architectural
> > design reviews, interviewing developers, threat modelling, DRP, change
> > control procedures, etc
> >
> > What do you think?
> >
> > Regards,
> > Steve

