[OWASP-TESTING] OWASP WAPT submission

Roelof Temmingh roelof at sensepost.com
Thu Aug 11 08:22:03 EDT 2005


In terms of unlinked files... just a quick comment: the BackEnd miner
inside of Wikto (http://www.sensepost.com/research/wikto) does a splendid
job of finding directories and the files within those directories. It also
works nicely with friendly 404s (or 302s, or whatever the server returns
for missing resources).
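
For anyone scripting this kind of check by hand, the trick behind coping
with friendly 404s is to first fingerprint the response for a request that
cannot possibly exist, and then compare every probe against that baseline.
A minimal Python sketch, with a purely hypothetical target URL (nothing
Wikto-specific):

import hashlib, random, string, urllib.error, urllib.request

BASE = "http://www.example.com"   # hypothetical target

def fetch(path):
    # Return (status, body hash) so two responses can be compared.
    try:
        with urllib.request.urlopen(BASE + path) as r:
            return r.status, hashlib.md5(r.read()).hexdigest()
    except urllib.error.HTTPError as e:
        return e.code, hashlib.md5(e.read()).hexdigest()

# Baseline: what does the server do for a file that cannot exist?
junk = "/" + "".join(random.choices(string.ascii_lowercase, k=12)) + ".html"
baseline = fetch(junk)

def probably_exists(path):
    # Anything that does not look like the "not found" baseline is a hit,
    # whether that baseline is a real 404, a friendly page or a redirect.
    return fetch(path) != baseline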

Don't know if this is relevant, or if you mentioned it in the doc... or if
there is a separate tools section.

my 2c,
Roelof

PS: we have recently released a couple of other web application testing
tools - whoever is writing the section on app testing (input validation,
brute forcing, state tracking), it would be nice if you spoke to me when
looking at testing tools.

=====================
Roelof Temmingh
+27 12 460 0880
GMT+2
=====================
On Thu, 11 Aug 2005, Javier Fernandez-Sanguino wrote:

> Mauro Bregolin wrote:
>
> > Daniel and all,
> >
> > please find attached my submission.
> > Feel free to review it and send comments, particularly if you have
> > relevant information regarding bibliographic references not mentioned in
> > the text.
>
> With respect to the "Old, backup and unrefereced (sic) files" section, I
> want to point out (again) that Dafydd Stuttard wrote an excellent text on
> this matter that could be reused; please check the CVS or the attached
> file.
>
> As for the text itself, here are some personal recommendations for improvement:
>
> Unreferenced application files are not only dangerous because they can
> expose internal source code, but because they might themselves be
> vulnerable! Imagine a web administrator taking a file that has an SQL
> injection, making a backup copy of it, editing the original to remove the
> injection, and leaving it at that. If he forgets to remove the backup file
> _and_ the backup file's extension is still executed by the web server
> (think original: login.jsp, backup: login.old.jsp), then you might think
> you are safe when you are actually just as vulnerable as before; it's just
> that nobody knows the file exists.
>
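> A quick way to hunt for that case is to take every known dynamic page and
> request a handful of common backup variants of it. A rough Python sketch;
> the target URL, page list and variant list are only illustrative:
>
> import urllib.error, urllib.request
>
> BASE = "http://www.example.com"              # hypothetical target
> KNOWN = ["/login.jsp", "/admin/users.jsp"]   # pages found while spidering
>
> # Common ways admins "back up" a page while keeping a runnable extension.
> def variants(path):
>     stem, dot, ext = path.rpartition(".")
>     yield path + ".bak"
>     yield path + ".old"
>     yield path + "~"
>     if dot:
>         yield stem + ".old." + ext           # login.jsp -> login.old.jsp
>         yield stem + ".bak." + ext
>         yield stem + "_old." + ext
>
> for page in KNOWN:
>     for candidate in variants(page):
>         try:
>             with urllib.request.urlopen(BASE + candidate) as r:
>                 print("possible backup:", candidate, r.status)
>         except urllib.error.HTTPError:
>             pass    # a real 404; use a friendly-404 baseline otherwise
>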
> You also fail to mention files that do not belong on the web server at
> all but are typically generated by upload tools (think WS_FTP.log files).
> These files are "unreferenced" in the sense that they should not be there,
> and they contain sensitive information (like the username used to FTP into
> the web server) that might be valuable to an attacker.
>
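> These are easy to sweep for once you have a directory list: just request
> a short list of well-known leftover names in every directory you have
> found. A tiny sketch; the target, directories and file names are only
> examples:
>
> import urllib.error, urllib.request
>
> BASE = "http://www.example.com"        # hypothetical target
> DIRS = ["/", "/images/", "/admin/"]    # directories found so far
>
> # Artifacts that upload tools and editors tend to leave behind.
> LEFTOVERS = ["WS_FTP.LOG", "ws_ftp.log", ".listing", "Thumbs.db"]
>
> for d in DIRS:
>     for name in LEFTOVERS:
>         try:
>             urllib.request.urlopen(BASE + d + name)
>             print("leftover file:", d + name)
>         except urllib.error.HTTPError:
>             pass
>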
> Finally, many people are unaware of, or dismiss, the problems with these
> files because they think that the default index.html page prevents all of
> them from being seen (security through obscurity). They say "why care?
> only an internal user with knowledge could know about them", and then the
> next web server security bug comes around and makes all the directories
> browsable because a module fails to load the default index.{asp,html} page
> when fed malicious input, or the web server is not properly configured and
> serves directory listings ("I expected everyone to go through the
> redirection that sends you to /login/login.asp; I didn't think anybody
> could access /login/ directly and get a listing of that directory!").
>
>
> Some white box strategies for detecting such files, which I believe are
> easier to implement and will produce fewer false positives than the
> scripted methods you suggest (a small sketch combining both appears after
> this list):
>
> - Look for files that have not been accessed in a long time; they might
> be backup or unreferenced files. Rationale: if this is a web server, most
> of the files will have their "last access" timestamp updated every time
> the web server reads them to serve them to an end user or run them through
> an application engine. If a file has not been accessed in, say, a year, it
> should be analysed. Of course, you need to be using a filesystem that
> stores access times (most modern filesystems do).
>
> - Correlate the web server logs with the actual files. Rationale: you can
> go through the web server access logs, retrieve the list of files requested
> by clients, and then compare that with the actual files on the server.
> Files that are _not_ in the web server logs (provided the logs go back far
> enough) are potential unreferenced files.
>
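> In Python, both checks can be combined in a few lines. A rough sketch; the
> document root and log path are hypothetical, and the log-to-URL mapping
> ignores rewrite rules and query strings:
>
> import os, re, time
>
> DOCROOT = "/var/www/html"                    # hypothetical document root
> ACCESS_LOG = "/var/log/apache2/access.log"   # hypothetical log location
> YEAR = 365 * 24 * 3600
>
> # 1. Collect every path that clients actually requested, from the logs.
> requested = set()
> with open(ACCESS_LOG) as log:
>     for line in log:
>         m = re.search(r'"(?:GET|POST|HEAD) ([^ ?"]+)', line)
>         if m:
>             requested.add(m.group(1))
>
> # 2. Walk the document root and flag files that were never requested
> #    and/or not read (atime) for over a year.
> now = time.time()
> for root, _, files in os.walk(DOCROOT):
>     for name in files:
>         full = os.path.join(root, name)
>         url = "/" + os.path.relpath(full, DOCROOT).replace(os.sep, "/")
>         stale = now - os.stat(full).st_atime > YEAR
>         never_seen = url not in requested
>         if stale or never_seen:
>             print(full, "stale" if stale else "",
>                   "never-requested" if never_seen else "")
>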
> In black box testing you are missing some useful input sources (a rough
> sketch for the robots.txt and HTML comment checks follows the list):
>
> - Archive.org, better than Google since it does _not_ expire pages over
> time, so you can actually see the old files that were on the web server
> three years ago.
>
> - robots.txt files. Some people believe they can "hide" files using that
> file; they forget that robots.txt is only honored by, yes, robots.
>
> - HTML comments. Either commented-out active content (think server side
> includes which have been disabled) or forms whose action has changed and
> that keep, in a comment, the CGI the action previously pointed to.
>
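> The last two are straightforward to script. A minimal Python sketch with a
> hypothetical target; the Archive.org lookup is left out, but the robots.txt
> and comment checks look roughly like this:
>
> import re, urllib.request
>
> BASE = "http://www.example.com"   # hypothetical target
>
> # robots.txt: every Disallow line is a path somebody wanted "hidden".
> robots = urllib.request.urlopen(BASE + "/robots.txt").read().decode(
>     "utf-8", "replace")
> for line in robots.splitlines():
>     if line.lower().startswith("disallow:"):
>         print("robots.txt entry:", line.split(":", 1)[1].strip())
>
> # HTML comments: dig out anything that looks like an old link or action.
> page = urllib.request.urlopen(BASE + "/").read().decode("utf-8", "replace")
> for comment in re.findall(r"<!--(.*?)-->", page, re.S):
>     for ref in re.findall(r"""(?:href|src|action)=["']?([^"'\s>]+)""",
>                           comment):
>         print("commented-out reference:", ref)
>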
> I would suggest you add these, as well as review the attached text (the
> one I mentioned before), and merge many of its suggestions and comments
> into your text.
>
> Regards
>
> Javier
>
