[Owasp-topten] T10 RC2
Jeremiah Grossman
jeremiah at whitehatsec.com
Fri May 25 17:24:24 EDT 2007
Here is my feedback (rewrite) of the “VERIFYING SECURITY” sections of
the OWASP Top 10 2007. Sorry that this comes at such a late stage.
Some quick things:
1) The Automated approaches should be separated into Automated
penetration testing tools and Automated source code scanning tools,
which are already separate from the manual approach. This seems to
make the most sense, as the two classes of tools have different
capabilities and shouldn't be lumped together.
2) For the automated penetration testing tools, we should include
generic language at the top of the document on which the guidance is
predicated. For instance: the scanner must be configured to obtain a
complete crawl of the website and its application functionality, must
be able to maintain login state during all stages of testing, and
must be able to parse the URLs to find the injection points.
3) My text covers only the automated penetration testing tools,
as this is where my expertise is, and it should be sanity checked
by at least one other similar vendor. It could also use more content
cleanup and examples.
4) On the automated source code scanning tools, someone else
familiar with the technology should be enlisted to write that text.
I'm not intimate with how far that technology has come.
A1: Automated penetration testing tools are adept at identifying
reflected XSS vulnerabilities through exhaustive parameter injection
testing. The same tools perform only marginally at identifying
persistent XSS, for a variety of reasons: the output may not be
accessible at the scanner's current authentication level, may only be
echoed back at some point in the future, or may appear on another,
seemingly unrelated website. Automated penetration testing tools also
have a challenging time identifying DOM-based XSS because of the
difficulty of automatically navigating and executing the browser DOM.
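
For illustration, the reflected case boils down to something like the
following Python sketch (example.com, the endpoint, and the parameter
name are made up; a real scanner uses many more payloads and
context-aware reflection analysis):

    import urllib.parse
    import urllib.request

    PAYLOAD = '<script>alert(1337)</script>'

    def reflects_payload(base_url, param):
        # Inject the probe into the named parameter and fetch the page.
        url = base_url + '?' + urllib.parse.urlencode({param: PAYLOAD})
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode('utf-8', errors='replace')
        # If the payload comes back unencoded, the parameter is a
        # likely reflected XSS candidate.
        return PAYLOAD in body

    if reflects_payload('http://example.com/search', 'q'):
        print('q looks vulnerable to reflected XSS')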
A2: Automated penetration testing tools are adept at identifying many
forms of Injection Flaws, including SQL, LDAP, XML, and XPath
Injection, through the use of the detailed error messages generated
by the system. If, however, the application suppresses server errors,
as it should, automated tools can be significantly hampered, even
though the code may still be at risk. In recent years tools have
become better at testing for these issues blind, without the use of
error messages, but the false positive and false negative rates are
still considered high.
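
In the error-message case, detection can be as crude as injecting a
metacharacter and grepping the response for known database error
strings. A sketch (the target URL, parameter, and signature list are
illustrative only):

    import urllib.parse
    import urllib.request

    # A few well-known database error signatures; real tools carry
    # far larger lists.
    ERROR_SIGNATURES = [
        'You have an error in your SQL syntax',  # MySQL
        'unclosed quotation mark',               # Microsoft SQL Server
        'ORA-01756',                             # Oracle
        'pg_query()',                            # PostgreSQL via PHP
    ]

    def looks_injectable(base_url, param):
        # A single quote is the classic probe for breaking out of a
        # SQL string literal.
        url = base_url + '?' + urllib.parse.urlencode({param: "'"})
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode('utf-8', errors='replace')
        return any(sig in body for sig in ERROR_SIGNATURES)

    print(looks_injectable('http://example.com/product', 'id'))

When errors are suppressed, blind techniques substitute timing or
boolean-differential checks for the string match above, which is
where the false positive and false negative rates climb.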
A3: Vulnerability scanning tools do a fairly decent job of
identifying the majority of vulnerable parameter values used in file
includes. They do suffer from a notable false negative rate in
circumstances where the results of command or file execution are not
apparent from the resulting response.
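
A file include probe follows the same request-and-inspect pattern,
and it also shows where the false negatives come from: if the
included file's contents never make it into the response, the string
match below has nothing to find. (Sketch; target and payload are
illustrative.)

    import urllib.parse
    import urllib.request

    def probe_file_include(base_url, param):
        # Point the parameter at a well-known local file; 'root:' is
        # a telltale string from /etc/passwd on Unix systems.
        payload = '../../../../etc/passwd'
        url = base_url + '?' + urllib.parse.urlencode({param: payload})
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode('utf-8', errors='replace')
        return 'root:' in body

    print(probe_file_include('http://example.com/page', 'template'))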
A4: Vulnerability scanning tools will have difficulty identifying
which parameters are susceptible to manipulation or whether the
manipulation worked.
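
To see why, consider the obvious automated approach: tamper with an
identifier and compare the responses (a sketch with a made-up invoice
endpoint). Even when the tampered request returns a valid-looking
page, the tool has no way to know whether this user was entitled to
see it.

    import urllib.request

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            return resp.getcode(), resp.read()

    status_a, body_a = fetch('http://example.com/invoice?id=1001')
    status_b, body_b = fetch('http://example.com/invoice?id=1002')
    if status_b == 200 and body_b != body_a:
        # A different, plausible record came back; only a human (or
        # knowledge of the business logic) can say whether that is a
        # vulnerability.
        print('id accepts manipulation; authorization is unclear')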
A5: Vulnerability scanning tools are largely unable to detect CSRF
vulnerabilities without some form of human assistance. The challenge
is that every "important" feature of a website must be checked, and
technology has a difficult time understanding importance in context.
Further, should a CSRF attack be successful, it's hard for a scanner
to generically determine what the results should look like for
vulnerability identification.
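
The most a tool can usually do unassisted is a heuristic, for example
flagging state-changing (POST) forms that carry no hidden,
unpredictable token. A sketch (the token name hints are illustrative,
and this only proves the presence of a token, not its quality):

    from html.parser import HTMLParser

    TOKEN_HINTS = ('csrf', 'token', 'nonce')

    class FormChecker(HTMLParser):
        # Flags POST forms that lack a hidden anti-CSRF-looking field.
        def __init__(self):
            super().__init__()
            self.in_post_form = False
            self.has_token = False
            self.action = None
            self.flagged = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == 'form' and attrs.get('method', '').lower() == 'post':
                self.in_post_form = True
                self.has_token = False
                self.action = attrs.get('action', '?')
            elif tag == 'input' and self.in_post_form:
                name = (attrs.get('name') or '').lower()
                if attrs.get('type') == 'hidden' and any(
                        hint in name for hint in TOKEN_HINTS):
                    self.has_token = True

        def handle_endtag(self, tag):
            if tag == 'form' and self.in_post_form:
                if not self.has_token:
                    self.flagged.append(self.action)
                self.in_post_form = False

    checker = FormChecker()
    checker.feed('<form method="post" action="/transfer">'
                 '<input type="text" name="amount"></form>')
    print(checker.flagged)  # ['/transfer']: no anti-CSRF token found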
A6: Vulnerability scanning tools do a good job of identifying
information leakage and improper error handling in many cases. They
can spot internal IP addresses, social security and credit card
numbers, memory stack traces, debug messages, etc. However, there are
other types of data and data formats that websites should not be
revealing that are difficult for scanners to detect, for instance
software revision numbers, intellectual property, or passwords. If
the data is encoded or even slightly obfuscated, even with Base64 or
ROT13, the job is made that much harder for the scanner.
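
That detection is mostly pattern matching over the response body,
which is also why light obfuscation defeats it: once the data is
Base64- or ROT13-encoded, the regexes below simply stop matching. A
sketch (simplified patterns; real scanners use much more robust
checks):

    import re

    LEAK_PATTERNS = {
        'internal IP': re.compile(r'\b10\.\d{1,3}\.\d{1,3}\.\d{1,3}\b'),
        'SSN':         re.compile(r'\b\d{3}-\d{2}-\d{4}\b'),
        'credit card': re.compile(r'\b(?:\d[ -]?){13,16}\b'),
        'stack trace': re.compile(r'\bat [\w.$]+\(\w+\.java:\d+\)'),
    }

    def find_leaks(body):
        return [label for label, pattern in LEAK_PATTERNS.items()
                if pattern.search(body)]

    print(find_leaks('Exception at com.acme.Db.query(Db.java:42) '
                     'from host 10.0.3.7'))
    # ['internal IP', 'stack trace']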
A7: Vulnerability scanning tools have a very difficult time
identifying vulnerabilities in custom authentication and session
management schemes. They can, however, perform rigorous mathematical
analysis on session tokens and credentials by sampling hundreds of
them, comparing them, and looking for patterns of predictability.
Areas of predictability, or a lack of sufficient entropy, can be
telltale signs of weakness requiring further review.
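
A crude version of that sampling analysis, assuming the tokens have
already been collected (the simulated sample is made up; real tools
run far deeper statistics over live cookies):

    import math
    from collections import Counter

    def positional_entropy(tokens):
        # Shannon entropy of the character seen at each position
        # across the whole sample; near-zero values mark fixed or
        # highly predictable structure.
        length = min(len(t) for t in tokens)
        entropies = []
        for i in range(length):
            counts = Counter(t[i] for t in tokens)
            total = sum(counts.values())
            entropies.append(-sum((c / total) * math.log2(c / total)
                                  for c in counts.values()))
        return entropies

    # Simulated sample: a constant prefix plus a slowly counting value.
    sample = ['SESS2007' + format(n, '08x') for n in range(1000, 1100)]
    print(positional_entropy(sample))
    # The first eight positions score 0.0: a telltale fixed prefix.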
A8: Vulnerability scanning tools have very limited or zero visibility
into the cryptographic storage mechanisms of a website or web
application.
A9: Vulnerability scanning tools can verify that SSL is used on the
front end, check the level of encryption supported, and find many
SSL-related flaws. However, they do not possess visibility into the
backend connections between the web server, application server, and
database to verify that those are secure.
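
The front-end half of that is straightforward to automate, e.g.
connecting and reporting the negotiated protocol and cipher suite (a
sketch using Python's standard ssl module; the hostname is a
placeholder):

    import socket
    import ssl

    def tls_info(host, port=443):
        # Negotiate TLS as a normal client and report what was agreed.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version(), tls.cipher()

    print(tls_info('example.com'))
    # e.g. ('TLSv1.3', ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256))

The backend hops between the web server, application server, and
database never touch the scanner's connection, which is why they stay
invisible to this kind of check.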
A10: Vulnerability scanning tools are able to guess the names of
hundreds or even thousands of pages on a website. They'll often
uncover orphaned files and directories, including backup files,
source code, intellectual property, and other data and functionality
that should not be present on the website. What they cannot do is
assess the value of this data/functionality or determine whether a
user at a particular access level should be able to access it in
accordance with the expected business logic of the website.
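
The guessing itself is a simple wordlist loop (a sketch; the
candidate names are illustrative, and the value judgment described
above is exactly what this code cannot make):

    import urllib.error
    import urllib.request

    CANDIDATES = ['backup.zip', 'db.sql', 'admin/', '.svn/entries',
                  'index.php.bak', 'test.jsp']

    def forced_browse(base_url):
        found = []
        for name in CANDIDATES:
            try:
                with urllib.request.urlopen(base_url + '/' + name) as resp:
                    if resp.getcode() == 200:
                        found.append(name)
            except urllib.error.HTTPError:
                pass  # 404 and friends: nothing there (probably)
        return found

    print(forced_browse('http://example.com'))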
On May 9, 2007, at 3:07 PM, Andrew van der Stock wrote:
> Hi folks,
>
> At long last, here's the OWASP Top 10 2007 RC2. We're going to push
> this out
> next week at the conference, so it's edits and emergency changes only.
> Please download and read.
>
> PDF: (980 kb)
> http://www.owasp.org/images/f/f2/OWASP_Top_10_RC2.pdf
>
> Word: (520 kb)
> http://www.owasp.org/images/e/e7/OWASP_Top_10_2007_RC2.doc
>
> Translators, you can start now as I'll use tracking changes between
> RC2 and
> Final so you can update just those few changes.
>
> Thanks,
> Andrew
>
>