[OWASP-TESTING] Checking in

Javier Fernandez-Sanguino jfernandez at germinus.com
Thu Nov 27 04:22:50 EST 2003


If you don't mind, I'll make some comments on your comments.

Mark Curphey wrote:

> Default Files
> 
> 1. In the description I think it may be worth accentuating the fact that
> there is only a security issue in default files when they have security
> vulns. One of my pet peeves is consultants' reports listing default files
> like documentation etc. where there are no security issues at all. It's a
> housekeeping issue (I guess) but not a security one in my opinion.

Default files like documentation do disclose internal information, which 
is a low-risk security issue, but a security issue after all. It's just 
like banner grabbing: it's not a direct attack, but it might disclose 
more information than you want. I would move the documentation files 
item to the end of the list, but would keep it there.

Notice that there is another item missing from the list which I'm not 
really sure how to describe. It's related to the way Lotus Domino runs 
its web server. In Domino, the web server might have default 
_databases_ enabled which might disclose sensitive information but are 
not really documentation per se.

I developed a plugin for Nessus (quite some time ago) to check these 
databases, so maybe the code speaks for itself:
http://cgi.nessus.org/plugins/dump.php3?id=10629
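
The plugin itself is written in NASL; as a rough idea of what it does, 
here is a minimal Python sketch under my own assumptions (the database 
list is only an illustrative subset and the target URL is a placeholder):

#!/usr/bin/env python3
"""Rough sketch (not the NASL plugin linked above): probe a Domino
server for a few well-known default databases and report any that
answer with an HTTP 200. The database list and the target URL are
illustrative assumptions only."""
import urllib.error
import urllib.request

DEFAULT_DATABASES = ["names.nsf", "catalog.nsf", "log.nsf",
                     "domlog.nsf", "admin4.nsf", "webadmin.nsf"]

def accessible_databases(base_url):
    """Return the default databases the server serves without an error."""
    found = []
    for db in DEFAULT_DATABASES:
        url = "%s/%s" % (base_url.rstrip("/"), db)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if resp.status == 200:
                    found.append(db)
        except urllib.error.HTTPError:
            pass    # 401/403/404: absent, or present but not openly readable
        except urllib.error.URLError:
            break   # host unreachable; stop probing
    return found

if __name__ == "__main__":
    for db in accessible_databases("http://target.example"):
        print("Openly accessible default database:", db)
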

> 2. In the "how to test" description I think it would be worthwhile
> explaining that you are looking for the instance of the file. To do that you
> need to see an HTTP 200 or equivalent server response, and this is
> essentially what tools like Nikto etc. do. Maybe an HTTP trace-type sequence
> might be a good visual example... That leads neatly onto...
> 
> 3. IMHO tools like Nikto and Nessus are really bad at finding default files.
> That is because they can't tell the difference between an HTTP 404 (or
> 400-series server response) and an HTTP 200 carrying an HTML 404. As an
> example, fire one of them off against OWASP.org and you'll find we have all
> the IIS files under the sun (on a Unix box!). The commercial scanners have
> solved this by allowing users to define a 404.

Nessus has a specific plugin (no404.nasl, see 
http://cgi.nessus.org/plugins/dump.php3?id=10386) which tries to avoid 
this by making some controlled tests and determining how the server 
answers requests for non-existent files. In my experience it determines 
issues better than some commercial scanners. It does not, however, make 
any assumption based on other results about whether those pages should 
be there or not (it does not say "Oh, this is Apache, I will not check X 
and Y"), since that is fundamentally flawed (there might be a reverse 
proxy in the middle, the banners might have been changed, etc.).

Nikto/Whisker do not attempt to do this at all. They are really more CGI 
scanners than vulnerability assessment scanners, so they usually turn up 
a lot of false positives.
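
For what it's worth, the idea behind no404.nasl can be sketched roughly 
like this (Python, not the actual NASL; the exact body comparison is a 
simplification of the plugin's fuzzier matching, and the candidate list 
and target are placeholders):

#!/usr/bin/env python3
"""Rough sketch of the no404.nasl idea, not its actual code: learn how
the server answers a request for a page that cannot exist, then use
that answer to separate real hits from 'HTTP 200 carrying an HTML 404'."""
import uuid
import urllib.error
import urllib.request

def fetch(url):
    """Return (status, body), treating HTTP error responses as answers."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read()
    except urllib.error.HTTPError as err:
        return err.code, err.read()

def learn_404(base_url):
    """Request a random, surely non-existent page and keep the answer."""
    bogus = "%s/%s.html" % (base_url.rstrip("/"), uuid.uuid4().hex)
    return fetch(bogus)

def really_there(base_url, path, not_found):
    """Count a path as present only if it returns 200 and the body does
    not match the learned 'not found' page."""
    status, body = fetch("%s/%s" % (base_url.rstrip("/"), path.lstrip("/")))
    nf_status, nf_body = not_found
    if status != 200:
        return False
    if nf_status == 200 and body == nf_body:
        return False    # custom HTML 404 served with a 200 status code
    return True

if __name__ == "__main__":
    base = "http://target.example"
    not_found = learn_404(base)
    for candidate in ["iissamples/", "manual/", "cgi-bin/test-cgi"]:
        if really_there(base, candidate, not_found):
            print("Candidate default file/directory:", candidate)
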

> 4. I am not sure white box and black box are good terms for this on
> reflection.

I think there is a difference. The 'white box' part should describe the 
auditor analysing the configuration of the web server (i.e. Apache's 
httpd.conf) and the file system structure directly, whereas the 'black 
box' part is trying to assess remotely, with a tool or manually, whether 
these files exist.
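
To make the 'white box' side concrete, a minimal sketch could read the 
web server configuration and look on disk for sample content, e.g. 
(assuming Apache, a default httpd.conf location and a hand-picked 
suspect list; Alias/ScriptAlias mappings are ignored):

#!/usr/bin/env python3
"""Minimal 'white box' sketch for the default-files case: read an
Apache configuration file, pull out the DocumentRoot, and report
sample/documentation content found underneath it. The config path and
the suspect list are assumptions, not a complete check."""
import os
import re

SUSPECT_PATHS = ["manual", "icons", "cgi-bin/printenv", "cgi-bin/test-cgi"]

def document_root(conf_file="/etc/httpd/conf/httpd.conf"):
    """Return the first DocumentRoot directive found in the config."""
    pattern = re.compile(r'^\s*DocumentRoot\s+"?([^"\s]+)', re.I)
    with open(conf_file) as conf:
        for line in conf:
            match = pattern.match(line)
            if match:
                return match.group(1)
    return None

if __name__ == "__main__":
    root = document_root()
    if root:
        for rel in SUSPECT_PATHS:
            path = os.path.join(root, rel)
            if os.path.exists(path):
                print("Default content present on disk:", path)
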

> 5. How about tools like IIS Lockdown that look for and remove sample files?
> Is there an Apache equivalent anyone knows about?

There is a lot of documentation that maybe should be added to the 
references, either for the document as a whole or for the references 
that are specific to a vendor's implementation (such as this one):

- 'Lotus White Paper: A Guide to Developing Secure Domino Applications' 
(December 1999)
- 'Apache Security Configuration Document' from Intersect Alliance, 
http://www.intersectalliance.com/projects/ApacheConfig/
- 'INTERNET INFORMATION SERVER 4.0 SECURITY Graded Security Configuration 
Document', also from Intersect Alliance, 
http://www.intersectalliance.com/projects/IIS4Config/index.html

NSA guides (available at http://nsa1.www.conxion.com/support/download.htm):
- Guide to the Secure Configuration and Administration of iPlanet Web 
Server, Enterprise Edition 4.1
- Guide to the Secure Configuration and Administration of Microsoft 
Internet Information Server 4.0
- Guide to the Secure Configuration and Administration of Microsoft 
Internet Information Server 4.0 (Checklist Format)
- Secure Configuration of the Apache Web Server, Apache Server Version 
1.3.3 on Red Hat Linux 5.1

> 
> Unreferenced Files

Just to add an example from a different web server (neither Apache nor 
IIS), it would be nice to add the following sample to 'Information 
obtained through server vulnerabilities and misconfiguration':

- Domino browsing through the ?open directive
(http://cgi.nessus.org/plugins/dump.php3?id=10057)

Also, in 'Use of publicly available information' I would include 
archive.org:

- Finally, the Internet Archive (www.archive.org) keeps snapshots of 
sites at different points in time that are publicly accessible through 
their "Wayback Machine". Older versions of a site can include references 
to content that is still available on the site but not linked from 
current pages; they can also give hints about older website structures 
and directories which might still be available.
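
As a hedged illustration only (it relies on Archive.org's CDX query 
interface, which is my assumption and not something described in the 
draft), listing the URLs the Wayback Machine has recorded for a domain 
could look like this:

#!/usr/bin/env python3
"""Hedged sketch: list URLs the Wayback Machine has archived for a
domain via Archive.org's CDX query interface (an assumption about the
service, with a placeholder domain). Old entries often point at
directories and files no longer linked from the current pages."""
import urllib.parse
import urllib.request

def archived_urls(domain, limit=200):
    """Yield original URLs recorded by the Wayback Machine for a domain."""
    query = urllib.parse.urlencode({"url": domain + "/*", "limit": limit})
    cdx = "http://web.archive.org/cdx/search/cdx?" + query
    with urllib.request.urlopen(cdx, timeout=30) as resp:
        for line in resp.read().decode("utf-8", "replace").splitlines():
            fields = line.split(" ")
            if len(fields) >= 3:
                yield fields[2]    # third field is the original URL

if __name__ == "__main__":
    for url in sorted(set(archived_urls("target.example"))):
        print(url)
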

> 
> 1. This write-up focuses solely on pen testing. How about comparing the
> deployed content to the CMS or app deployment tool? This is typically what
> happens in enterprises in my experience. Maybe we should say that this is
> really a pen testing technique only?

I don't think so. On some servers it should be trivial to assess, from a 
"white box" point of view, the website structure (as perceived by, for 
example, a web robot that follows the URLs) versus the filesystem 
structure. Any file which is not linked to, but resides in the 
filesystem, would be an unreferenced file. This is more difficult to do 
on servers with dynamic content, but there the auditor can use real 
website logs to determine which files/applications are being accessed 
and which files are just "sitting there".
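
A minimal sketch of that comparison, assuming a static document root and 
a deliberately naive same-host crawler (URL rewriting, directory indexes, 
dynamic content, etc. are ignored; paths and the start URL are 
placeholders):

#!/usr/bin/env python3
"""Minimal sketch of the comparison described above: everything found
under the document root minus everything a naive crawler can reach by
following links is reported as a candidate unreferenced file."""
import os
import re
import urllib.request
from urllib.parse import urljoin, urlparse

def files_on_disk(document_root):
    """Collect every file under the document root as a URL-style path."""
    paths = set()
    for dirpath, _dirs, names in os.walk(document_root):
        for name in names:
            full = os.path.join(dirpath, name)
            paths.add("/" + os.path.relpath(full, document_root))
    return paths

def crawl(start_url, limit=500):
    """Breadth-first crawl of same-host <a href> links; returns the set
    of paths reachable by following links from the start page."""
    host = urlparse(start_url).netloc
    link_re = re.compile(r'href=["\']([^"\'#]+)', re.I)
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        parsed = urlparse(url)
        path = parsed.path or "/"
        if parsed.netloc != host or path in seen:
            continue
        seen.add(path)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", "replace")
        except OSError:
            continue
        queue.extend(urljoin(url, link) for link in link_re.findall(body))
    return seen

if __name__ == "__main__":
    on_disk = files_on_disk("/var/www/html")
    linked = crawl("http://target.example/")
    for path in sorted(on_disk - linked):
        print("Possibly unreferenced:", path)
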

In the "white box" testing, the auditor can determine with more accuracy 
if the page is really unreferenced or not. For example, he can also make 
use of the filesystem timestamps: he can search for files that have not 
been accessed for a year, if he knows that the system was updated in 
date X he can try to look for files that were created before that date, 
etc. I would consider adding this into the "white box" part.
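
A small sketch of the timestamp idea (the document root path and both 
cutoffs are placeholder assumptions, and access times are only useful if 
the filesystem is not mounted with noatime):

#!/usr/bin/env python3
"""Sketch of the timestamp idea above: flag files under the document
root whose access time is more than a year old, or whose modification
time predates a known deployment date."""
import os
import time

YEAR = 365 * 24 * 3600

def stale_files(document_root, deployed_before=None):
    """Yield (path, reason) for files that look abandoned."""
    now = time.time()
    for dirpath, _dirs, names in os.walk(document_root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            if now - st.st_atime > YEAR:
                yield path, "not accessed in over a year"
            elif deployed_before and st.st_mtime < deployed_before:
                yield path, "older than the last known deployment"

if __name__ == "__main__":
    # Hypothetical last deployment date; adjust to what is known about the site.
    cutoff = time.mktime(time.strptime("2003-01-01", "%Y-%m-%d"))
    for path, reason in stale_files("/var/www/html", cutoff):
        print(path, "-", reason)
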

> 2. Under blind guessing might be worth mentioning the HTTP / HTML 404 again.
> Great stuff though!

I agree. It's very good stuff!

> 4. Might be worth showing an example of making an HTTP call to list the
> directory contents where no default page for that dir has been specified ?

I agree. That's a very common vector to retrieve this kind of information.
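
A rough illustration of that check (Python sketch; the "Index of /" and 
"Parent Directory" markers are Apache's usual listing text and only a 
heuristic, and the URLs are placeholders):

#!/usr/bin/env python3
"""Sketch of the directory-listing check: request a directory that has
no default page and see whether the server returns an auto-generated
index."""
import urllib.error
import urllib.request

def listing_enabled(url):
    """True if the response body looks like an automatic directory index."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", "replace")
    except urllib.error.URLError:
        return False
    return "Index of /" in body or "Parent Directory" in body

if __name__ == "__main__":
    for directory in ["http://target.example/images/",
                      "http://target.example/backup/"]:
        if listing_enabled(directory):
            print("Directory listing enabled:", directory)
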

Regards

Javi





