[OWASP-TESTING] finally!

Mauro Bregolin mauro.bregolin at gmail.com
Mon Apr 18 04:49:07 EDT 2005


Jeff,

I agree with the posting you refer to. What exactly do you mean by "we
should keep the techniques (scanning, manual pentest, static analysis,
manual code review) separate from the purpose (audit or test)"?

Judging by how people replied to my original post, it appears there is no
unified consensus right now.
Perhaps it is worth trying to get everybody in sync on this matter before
things get started?

Concerning port scanning, footprinting, etc., my point is essentially the
following. Of course they're part of a traditional pen test; but here we're
not talking about pen testing, we're talking about assessing a web app - which
may or may not be part of a pen test-like assignment. This means you might be
asked to "assess application X which you can access at name.domain",
without any explicit notion or expectation of port scanning and
fingerprinting any specific IT infrastructure. Nor is the testing guide
supposed to waste time on port scanning specifics, etc.
What I wanted to convey is that, in my opinion, there are a number of
activities whose purpose is "discovery", and these should be emphasized.
Unlike in a pen test, where the discovery process is well characterized
and can precede further analysis, discovery-related activities in a web
app assessment cannot be fully factored out at the beginning.

For example, determining architectural aspects of the application (such as
identifying components like web servers, application servers, DBMSs, etc.)
may in part require information that you gather during the actual analysis
of the application (think of gleaning info from error messages or other
diagnostics).

My comment was more about giving proper weight to a set of separate
activities (or pieces of information collected), realizing that semantically
they fulfill the purpose of "discovering" key aspects of the application,
than about restructuring the testing guide TOC. Again, the difficulty lies
in the fact that you can't group them neatly; but logically there exist a
number of steps in the assessment process whose purpose is (directly or as
a by-product) related to "discovery", which I believe is worth emphasizing.
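To make the "discovery as a by-product" idea concrete, here is a minimal,
hypothetical sketch of the kind of component identification described above:
matching leaked headers and error text against known signatures. The
signature table and example inputs are illustrative assumptions, not part
of any actual tool or guide content.

```python
# Hypothetical sketch: inferring web-stack components from response
# headers and leaked error text gathered during analysis.
# The signature table is illustrative, not exhaustive.

SIGNATURES = {
    "Apache-Coyote": "Apache Tomcat (application server)",
    "X-Powered-By: ASP.NET": "Microsoft IIS / ASP.NET",
    "ORA-": "Oracle DBMS (error code leaked in page body)",
    "JBoss": "JBoss (application server)",
}

def discover_components(headers: dict, body: str) -> list:
    """Return guessed components based on known signature strings."""
    # Flatten headers and body into one searchable blob.
    blob = "\n".join(f"{k}: {v}" for k, v in headers.items()) + "\n" + body
    return [component for marker, component in SIGNATURES.items()
            if marker in blob]

# Example: an error page that leaks both app server and DBMS information.
hdrs = {"Server": "Apache-Coyote/1.1"}
page = "HTTP Status 500 - ORA-00933: SQL command not properly ended"
print(discover_components(hdrs, page))
# → ['Apache Tomcat (application server)',
#    'Oracle DBMS (error code leaked in page body)']
```

The point is simply that this information surfaces opportunistically during
testing, not in a single up-front discovery phase.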

Mauro


-----Original Message-----
From: Jeff Williams [mailto:jeff.williams at aspectsecurity.com] 
Sent: sabato 16 aprile 2005 15.45
To: Harinath Pudipeddi; 'Keary, Eoin'; 'Mauro Bregolin'; 'Daniel Cuthbert';
owasp-testing at lists.sourceforge.net
Subject: Re: [OWASP-TESTING] finally!

I think we should keep the techniques (scanning, manual pentest, static 
analysis, manual code review) separate from the purpose (audit or test). 
There's some more on this in this thread on webappsec from a while back.

http://seclists.org/lists/webappsec/2005/Jan-Mar/0360.html

--Jeff

----- Original Message ----- 
From: "Harinath Pudipeddi" <harinath.pudipeddi at softrel.org>
To: "'Keary, Eoin'" <eoin.keary at ie.fid-intl.com>; "'Mauro Bregolin'" 
<mauro.bregolin at gmail.com>; "'Daniel Cuthbert'" <daniel.cuthbert at owasp.org>;
<owasp-testing at lists.sourceforge.net>
Sent: Saturday, April 16, 2005 12:33 AM
Subject: RE: [OWASP-TESTING] finally!


> Hello Eoin,
>
> I beg to differ with your first paragraph on Testing and Audit. Code
> audit and white box testing are two different approaches to ensuring the
> quality and stability of code. If you see white box testing as an audit
> of the code, then you are missing key ingredients in making your code
> "error free". We have many white box testing tools on the market today.
> Also, the approach to white box testing is quite different from auditing.
>
> Hari
>
> -----Original Message-----
> From: owasp-testing-admin at lists.sourceforge.net
> [mailto:owasp-testing-admin at lists.sourceforge.net] On Behalf Of Keary,
> Eoin
> Sent: Friday, April 15, 2005 7:28 PM
> To: 'Mauro Bregolin'; Daniel Cuthbert;
> owasp-testing at lists.sourceforge.net
> Subject: SPAM-LOW: RE: [OWASP-TESTING] finally!
>
> Personally we view white box as audit and black box as testing.
> In an audit we see, say, the source code, and review whether it conforms
> to internal policy and best practice.
> Testing is from a user perspective - what the user sees. No code exposed,
> just inputs and corresponding outputs.
>
> Regarding port scanning and footprinting, these are initial phases of a
> pen test - the assessment phase. And it seems correct to cover assessment
> tasks in their own section.
> Information leakage is also part of the assessment phase, but it is
> closely related to the attack phase, as a slight adjustment to the attack
> vector can lead to an exploit.
>
> Regarding patching and versions of the app server, this is related to the
> "secure code environment": this includes configuration and deployment,
> versioning, administration policy, and redundancy/failover.
>
>
> _______________________________________________
> owasp-testing mailing list
> owasp-testing at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/owasp-testing 
