[Owasp-dotnet] (no subject)
alexjmackey at gmail.com
Thu Aug 4 04:29:26 EDT 2005
Well the issue is that at the moment we have to deal with a
traditional web interface which is very limiting and doesn't allow for
a powerful and effective user experience.
- The issue is that at present there is one developer working on this
project in his spare time, and as such functionality has to be
prioritised. Spending many hours developing the UI when core
functionality is missing or buggy is a strange move. I think the
mistake I made initially when I started on this project was spending
too much time on the UI before the application was fleshed out. As I
added new functionality I then continually had to go back and change
the UI.
I don't think that it is that fluid, and you will have a massive
scalability problem. I deal with websites that have thousands of pages
(although not thousands of unique forms :) which need to be tested and
processed. You will have big problems trying to handle this amount of
data via a traditional web interface.
- Over the last month 70% of the application has been rewritten. I can
still think of many things I want to add. I agree, and that's why
scans need to be threaded or have a command line interface.
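To give an idea of what I mean by threading the scans (nothing like
this exists in the code base yet; the ScanWorker name and the URL list
are placeholders for illustration only), something along these lines
would let a scan fan its requests out over the ThreadPool:

// Rough sketch only: fan individual test requests out over the
// ThreadPool so a large scan doesn't run serially or block the UI.
// Class, method and URL names below are made up for this example.
using System;
using System.Net;
using System.Threading;

class ScanWorker
{
    static void Main()
    {
        // Placeholder targets, purely for illustration.
        QueueScan(new string[] { "http://localhost/a.aspx",
                                 "http://localhost/b.aspx" });
        Thread.Sleep(5000); // crude wait so the pool threads can finish
    }

    public static void QueueScan(string[] urls)
    {
        foreach (string url in urls)
        {
            ThreadPool.QueueUserWorkItem(new WaitCallback(RunTest), url);
        }
    }

    static void RunTest(object state)
    {
        string url = (string)state;
        try
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("{0} -> {1}", url, (int)response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("{0} -> {1}", url, ex.Status);
        }
    }
}

The real thing would obviously need to cap the number of outstanding
requests and collect results properly rather than writing to the
console, but that's the general shape.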
I also don't think that it is a major issue if this UI code is
thrown away (hopefully because a better one was developed). Don't
forget that AJAX is more of an 'architecture' (i.e. 'system design')
than a hard-coded API (the only standards are HTTP, HTML and
JavaScript).
What I like about having an AJAX 'browser based' approach early on in
the project (and there is nothing stopping somebody else developing a
Windows GUI app for this, or a Java, Python, Ruby, C++, etc. one) is
that it will make us create a server-side API that will handle all
client requests (the clients are only responsible for GUI stuff) and a
database structure to support it.
But how will you get this 'command line interface' to work on a client
computer when the main engine is running on another server?
- This is all great stuff, and in that scenario some sort of web
service, as you suggest, would probably be the best approach. Initially
I see the application and the client running on the same machine. For
the reasons given above (limited time and resources) I'm not going to
devote time to developing a web service tier. Until we have decent
functionality, why would anyone want to plug into the application?
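When we do get to it, though, I imagine the tier could stay very thin;
roughly a single .asmx service that all clients (browser, command line,
a future Windows GUI) talk to. The sketch below is purely illustrative;
every name in it is made up and nothing like it exists yet:

// Hypothetical .asmx code-behind for a thin web service tier: clients
// only talk to this and the scan engine stays on the server. The
// in-file Hashtable stands in for the real engine, for illustration.
using System;
using System.Collections;
using System.Web.Services;

[WebService(Namespace = "http://example.org/beretta/")]
public class ScanService : WebService
{
    static readonly Hashtable scans = Hashtable.Synchronized(new Hashtable());

    [WebMethod]
    public string StartScan(string targetUrl)
    {
        // Record the request and hand back an id the client can poll.
        string id = Guid.NewGuid().ToString();
        scans[id] = "queued: " + targetUrl;
        return id;
    }

    [WebMethod]
    public string GetStatus(string scanId)
    {
        object status = scans[scanId];
        return status == null ? "unknown scan id" : (string)status;
    }
}

The point is only that every kind of client would go through the same
couple of methods, so the GUI work stays decoupled from the engine.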
I think that we should be trying to add tests ASAP for known
vulnerabilities such as:
 - Unprotected ASP.NET ViewState
 - ASP.NET XSS vulnerability in UrlDecode (I think it is this one)
 - ASP.NET DoS issue in a Web Service function (recently disclosed by
one of the security companies)
 - Blind SQL injection, where detailed error messages are NOT sent
to the client
 - ASP.NET Forms Authentication vulnerability (the '\' one)
 - Detailed ASP.NET errors sent to the client (and Trace information)
 - Session information stored in cookies and forms
 - Web services exposed to anonymous users
 - Potential authorization blind spots (cases where there is a huge
amount of client-side security validation)
 - Open services and ports (we should add a basic port scanner to
this, which would find basic mistakes like SQL Server's 1433, RDP and
RPC ports being left open)
- Agreed, these are good things to be testing. Not sure about the port
scanner; there are many very effective and mature programs out there
already. I think we would be better off concentrating on web
application-level tests.
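To make one of these concrete, the unprotected ViewState test could
start out as small as the sketch below. The regex and the 'printable
ratio' heuristic are just illustrative; nothing like this is written
yet:

// Rough sketch of an "unprotected ViewState" style check: pull
// __VIEWSTATE out of a page and see whether it base64-decodes to
// mostly readable text (encrypted ViewState decodes to what looks
// like random bytes). Heuristic only; names here are placeholders.
using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

class ViewStateCheck
{
    static void Main(string[] args)
    {
        string url = args[0]; // page to check, passed on the command line

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        string html;
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            html = reader.ReadToEnd();
        }

        Match m = Regex.Match(html,
            "name=\"__VIEWSTATE\"[^>]*value=\"([^\"]*)\"");
        if (!m.Success)
        {
            Console.WriteLine("No __VIEWSTATE field found.");
            return;
        }

        byte[] raw;
        try
        {
            raw = Convert.FromBase64String(m.Groups[1].Value);
        }
        catch (FormatException)
        {
            Console.WriteLine("__VIEWSTATE is not plain base64.");
            return;
        }
        if (raw.Length == 0)
        {
            Console.WriteLine("__VIEWSTATE is empty.");
            return;
        }

        // Count printable ASCII bytes; serialized but unencrypted
        // ViewState contains plenty of readable control and property names.
        int printable = 0;
        foreach (byte b in raw)
            if (b >= 0x20 && b < 0x7f) printable++;

        double ratio = (double)printable / raw.Length;
        Console.WriteLine(ratio > 0.6
            ? "ViewState looks unencrypted (readable once decoded)."
            : "ViewState looks encrypted or compressed.");
    }
}

Checking for a missing MAC is harder from the client side (you really
have to tamper with the field, resubmit and watch for a validation
error), so a first pass like this only flags ViewState that decodes to
readable text.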
But for this tool to be used in a 'real life' scenario it needs to be
able to run tests on remote machines :)
- Which it does? ;)
What is the current performance of Beretta for brute-force requests?
That is, how many requests can it handle per second?
- I haven't done any measurements as yet, given that we only just have
workable functionality! Functionality rather than speed has been the
priority. The number of variables here is enormous: what data are you
transmitting, where to, what's the spec of the machine?
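When I do get round to measuring it, the first cut will probably be
something as crude as the snippet below (the target URL and request
count are placeholders), just to get a baseline requests-per-second
figure:

// Crude throughput measurement sketch: fire N sequential requests at
// one URL and report requests per second. Numbers and names are
// placeholders, purely for illustration.
using System;
using System.Net;

class Throughput
{
    static void Main(string[] args)
    {
        string url = args[0];
        int count = 100;

        DateTime start = DateTime.Now;
        for (int i = 0; i < count; i++)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                // Disposing the response releases the underlying connection.
            }
        }
        double seconds = (DateTime.Now - start).TotalSeconds;

        Console.WriteLine("{0} requests in {1:F1}s = {2:F1} req/s",
            count, seconds, count / seconds);
    }
}

Anything more meaningful would have to vary payload size, target and
concurrency, which is exactly the point about the number of variables.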
Last time I played with managed HttpRequests I found them far too slow
and not very scalable (mainly a threading problem). I think that we
might have to go to raw sockets and build our own HTTP requests, so
that we get the speed we need to be able to pull off some of the
brute-force tests.
- Again, this would be great, but the reason I used the Microsoft HTTP
object was all the functionality it provides and the time it would
take to develop the raw sockets code. I'd imagine that, along with the
cookie handling code, would take a while to get right.
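For what it's worth, the sketch below shows the sort of thing the
managed classes give us for free and that we would have to rebuild by
hand on raw sockets (the URLs are placeholders):

// What HttpWebRequest/HttpWebResponse handle for us already: cookies
// carried across requests via a CookieContainer, plus redirects,
// chunked responses and keep-alive. URLs below are placeholders.
using System;
using System.IO;
using System.Net;

class CookieDemo
{
    static void Main()
    {
        CookieContainer jar = new CookieContainer();

        // First request picks up any session cookie the site sets.
        HttpWebRequest login =
            (HttpWebRequest)WebRequest.Create("http://localhost/login.aspx");
        login.CookieContainer = jar;
        using (HttpWebResponse response = (HttpWebResponse)login.GetResponse())
        {
            Console.WriteLine("Cookies received: " + response.Cookies.Count);
        }

        // Second request automatically re-sends whatever is in the container.
        HttpWebRequest page =
            (HttpWebRequest)WebRequest.Create("http://localhost/secure.aspx");
        page.CookieContainer = jar;
        using (HttpWebResponse response = (HttpWebResponse)page.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd().Length + " bytes");
        }
    }
}

Redirect handling and connection reuse come along for free as well, so
there is a fair amount to reimplement before raw sockets would pay off.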