[OWASP-LEADERS] Prior Art

Mark Curphey mark at curphey.com
Mon Dec 1 14:56:33 EST 2003


RE: Description of Prior Art Needed to Invalidate Sanctum Patent


United States Patent No. 6,584,569 to Reshef et al. and assigned to Sanctum
Ltd. ("the Sanctum patent") discloses a scanner for automatically detecting
potential application-level vulnerabilities or security flaws in a web
application. The independent claims of the Sanctum patent generally relate
to a scanner that (1) traverses a web application in order to discover and
actuate the links therein, (2) analyzes messages that flow or would flow
between an authorized client and a web server in order to discover elements
of the web application's interface with external clients and attributes of
these elements (such as links, fill-in forms, fields, fixed fields, hidden
fields, menu options, etc.), (3) generates unauthorized client requests in
which these elements are mutated, sends the mutated client requests to the
web server, receives server responses to the unauthorized client requests,
and (4) evaluates the results thereof.

Under United States patent law, a patent claim is said to be "anticipated,"
and will be deemed invalid, if each and every element of the claim was
described in a single prior art reference published more than one year prior
to the date the patent application was filed. Because the application for the
Sanctum patent was filed on March 3, 2000, invalidating a claim of the patent
requires a single prior art reference, published on or before March 2, 1999,
that describes each and every element of that claim. Accordingly, the prior
art reference should describe a web
scanner that:

(1) traverses a web application in order to discover and actuate the links
therein;

* Also called a "web crawler," such a scanner retrieves and examines the
HTML source of every page on a website and discovers all the links, or
URLs, contained on the website.
* The scanner then actuates each link found on the website to generate
HTTP requests for transmission to the web server (that is, it exercises
the links).
* If a discovered link requires user input, such as when it leads to a
fill-in form, the scanner supplies fictitious values as input based on the
field or data type (a rough code sketch of this traversal step follows
this list).
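
For illustration only, a minimal Python sketch of this traversal step might
look like the following. It is a sketch under stated assumptions, not a
description of the patented method or of any actual product: the names
crawl(), LinkCollector(), and base_url are invented for this example, and
form handling with fictitious input values is omitted for brevity.

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collects the href targets of anchor tags found on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(base_url, max_pages=50):
        """Traverse the site: fetch each page, extract its links, and
        actuate (request) every link that stays within the site."""
        seen, queue, pages = set(), [base_url], {}
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                body = urlopen(url).read().decode("utf-8", "replace")
            except OSError:
                continue
            pages[url] = body            # keep the response for later analysis
            collector = LinkCollector()
            collector.feed(body)
            for href in collector.links:
                absolute = urljoin(url, href)
                if absolute.startswith(base_url):  # stay on the same site
                    queue.append(absolute)
        return pages

Every page fetched and every link followed here is an "authorized" request
in the sense used above, since the crawler only actuates links that the
application itself exposes.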

(2) analyzes messages that flow or would flow between an authorized
client and a web server in order to discover elements of the web
application's interface with external clients and attributes of these
elements (such as links, fill-in forms, fixed fields, hidden fields, menu
options, etc.);

* Here, the scanner sends the HTTP requests generated above for each of
the discovered links and receives the associated responses from the web
server.
* The responses are then analyzed, in the same manner in which the
original website was analyzed, to discover all of the links contained
therein. The responses are also scanned for other application interface
elements, such as data parameters, and for their attributes (links,
fill-in forms, fixed fields, hidden fields, menu options, etc.).
* Up to this point, the scanner essentially explores and exercises all
of the links on a website by sending authorized requests, then analyzes
the responses for more links and interface elements; in other words, it
explores multiple layers of the web application (a sketch of this analysis
step follows this list).
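
As a rough illustration of this analysis step, the Python sketch below
parses a single server response for links, forms, and fields (including
hidden fields). The names InterfaceElementParser and discover_elements()
are assumptions made for this example only, not terms from the patent.

    from html.parser import HTMLParser

    class InterfaceElementParser(HTMLParser):
        """Records the interface elements found in one HTML response."""
        def __init__(self):
            super().__init__()
            self.links = []    # href targets of anchor tags
            self.forms = []    # (action, method) pairs
            self.fields = []   # (name, type, default value) triples

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "a" and a.get("href"):
                self.links.append(a["href"])
            elif tag == "form":
                self.forms.append(
                    (a.get("action", ""), a.get("method", "get")))
            elif tag in ("input", "select", "textarea"):
                # hidden and fixed fields show up here with type="hidden"
                # and a preset value attribute
                self.fields.append((a.get("name"), a.get("type", "text"),
                                    a.get("value", "")))

    def discover_elements(response_body):
        """Return the links, forms, and fields found in a response body."""
        parser = InterfaceElementParser()
        parser.feed(response_body)
        return parser.links, parser.forms, parser.fields

Running discover_elements() over every response collected by the crawler
yields the inventory of interface elements and attributes that the next
step will mutate.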

(3) generates unauthorized client requests in which these elements are
mutated, sends the mutated client requests to the web server, receives
server responses to the unauthorized client requests; and

* At this point, the scanner switches from sending authorized requests
to creating and sending unauthorized, mutated requests (also called
"exploits"). Thus, rather than exercising the features of a website to
ensure it works properly, the scanner sends unauthorized or improper
values to the web server to verify that an error message is received in
response.
* The scanner creates a mutated request for each interface element
discovered above, and the mutation applied depends on the type of
interface element at issue. For example, if the element is a numeric
field, the scanner creates a mutated request that contains text as input;
if the element is a link, it creates a mutated request that appends ".bak"
to the link's path (a sketch of this mutation step follows this list).
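
The Python sketch below illustrates one way such mutations might be built
and sent, under the assumptions noted in the comments. The helpers
mutate_field(), mutate_link(), and send_mutated_form() are hypothetical
names chosen for this example; a real scanner would draw on a much larger
catalogue of mutations.

    from urllib.parse import urlencode, urljoin
    from urllib.request import Request, urlopen

    def mutate_field(name, field_type, value):
        """Return an improper value for a field based on its type."""
        if field_type in ("number", "numeric"):
            return "not-a-number"      # text where a number is expected
        if field_type == "hidden":
            return value + "0"         # tamper with a fixed or hidden value
        return "A" * 1024              # overlong input for an ordinary field

    def mutate_link(url):
        """Append ".bak" to the link's path to probe for backup files."""
        return url + ".bak"

    def send_mutated_form(base_url, action, method, fields):
        """Build an unauthorized request from a discovered form and send it."""
        data = urlencode({name: mutate_field(name, ftype, value)
                          for name, ftype, value in fields if name})
        target = urljoin(base_url, action)
        if method.lower() == "post":
            request = Request(target, data=data.encode())
        else:
            request = Request(target + "?" + data)
        return urlopen(request)        # the response is evaluated in step (4)

The hidden-field case corresponds to the price-tampering example given at
the end of this note: a fixed value the application assumes cannot change
is altered before the request is sent back.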

(4) evaluates the results thereof.

* Finally, the scanner evaluates the response to the mutated request to
determine whether the web server accepted the unauthorized input value.
* One example of such an evaluation would be to look for responses
containing keywords such as "error," "sorry," or "not found." If such
words are not returned, the scanner would conclude that the mutated
request was accepted and that the web application is vulnerable to attack,
i.e., that the website contains a security flaw (a sketch of this
evaluation step follows this list).
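
A minimal keyword-based evaluation of that kind might look like the Python
sketch below. The function name is_vulnerable() and the keyword tuple are
assumptions for this example, mirroring the keywords mentioned above; real
scanners rely on stronger signals than simple keyword matching.

    from urllib.error import HTTPError
    from urllib.request import urlopen

    REJECTION_KEYWORDS = ("error", "sorry", "not found")

    def is_vulnerable(mutated_request):
        """Return True if the server appears to have accepted the mutated
        input, i.e. the response is neither an HTTP error nor a page
        containing one of the rejection keywords."""
        try:
            body = urlopen(mutated_request).read().decode("utf-8", "replace")
        except HTTPError:
            return False  # the server rejected the request outright
        lowered = body.lower()
        return not any(word in lowered for word in REJECTION_KEYWORDS)

Chained together, the four sketches (crawl, discover, mutate, evaluate)
give the overall shape of the scanner described above: any mutated request
for which is_vulnerable() returns True would be reported as a potential
security flaw.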

A prior art reference that describes all of the above elements will likely
be introduced as a "security scanner," "web application tester,"
"vulnerability checker," or the like. Several references have already been
identified that describe the first two elements listed above. These "web
crawlers" exercise all the embedded links on a website to either create a
map of the web application or to test the web application for broken or
invalid links. What is needed, however, is a single prior art reference
describing a scanner that tests not only whether the links and fields on a
website are functioning properly when activated in an authorized manner, but
also whether the links and fields are properly designed such that the web
application will reject requests created in an unauthorized manner. In this
regard, the prior art reference would be aimed at assuring a website
administrator that the website is not vulnerable to attack by a computer
hacker seeking to manipulate its fields to, for example, alter the purchase
price of an item offered for sale there.





