[OWASP_PHPSEC] Finding and indexing logs
sven at rtbg.de
Sun Jul 28 10:14:10 UTC 2013
Well, I think it has been discussed on the mailing list or somewhere
whether it would be a good solution to simply let the developer pass
in a logger that conforms to the PSR-3 logger interface definition.
Any viable existing log framework supports all the features you are asking
for out of the box. For example, I am a fan of Log4PHP, and it would
really annoy me if I had to use a different logger just for this library
- and in the worst case, if I were forced to use it and could not disable it.
Allowing me to pass a logger into the elements I want to use lets me
integrate the lib into my already configured logging setup.
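To make the injection idea concrete, here is a minimal sketch of what such a library class could look like. The class name, method, and validation rule are purely illustrative (not part of any actual library), and the interface is a cut-down stand-in for `Psr\Log\LoggerInterface` so the example is self-contained:

```php
<?php
// Minimal stand-in for Psr\Log\LoggerInterface (the real interface
// also defines per-level convenience methods like warning(), error()).
interface LoggerInterface
{
    public function log($level, $message, array $context = array());
}

// Hypothetical library class: any PSR-3-compatible logger can be
// injected; passing null disables logging entirely.
class InputValidator
{
    private $logger;

    public function __construct(LoggerInterface $logger = null)
    {
        $this->logger = $logger;
    }

    public function validate($input)
    {
        $ok = preg_match('/^[a-z0-9]+$/i', $input) === 1;
        if (!$ok && $this->logger !== null) {
            // The library never cares where this ends up - file,
            // syslog, mail: that is the injected logger's business.
            $this->logger->log('warning', 'Rejected input', array('input' => $input));
        }
        return $ok;
    }
}
```

This way Log4PHP, Monolog, or any other PSR-3 adapter plugs in without the library knowing about it.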
Apart from that: any existing log framework allows sending messages to
multiple targets, mail being one of them - typically in two or three
different kinds of implementation:
1. Send mail immediately.
2. Send one mail at the end of the script.
3. Send one mail only if the log level of any message exceeds a
threshold (like the FingersCrossedHandler of Monolog).
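The third strategy is the most interesting one, so here is a self-contained sketch of the "fingers crossed" idea: buffer every message, and only deliver the whole buffer (e.g. by mail) once a message at or above a threshold arrives. This is what Monolog's FingersCrossedHandler does for real; the class below is an illustration only, with a `$send` callable standing in for an actual mail() call, and numeric levels where a higher number means more severe:

```php
<?php
// Sketch of strategy 3 ("fingers crossed"): hold all messages back
// until one of them reaches the threshold, then flush everything -
// including the earlier low-level context - to the target.
class FingersCrossedBuffer
{
    private $buffer = array();
    private $threshold;
    private $send;              // callable receiving the buffered messages
    private $triggered = false;

    public function __construct($threshold, $send)
    {
        $this->threshold = $threshold;
        $this->send = $send;
    }

    public function log($level, $message)
    {
        $this->buffer[] = array($level, $message);
        if (!$this->triggered && $level >= $this->threshold) {
            $this->triggered = true;
        }
        if ($this->triggered) {
            // Once triggered, flush immediately - the first flush also
            // carries all the debug context collected before the error.
            call_user_func($this->send, $this->buffer);
            $this->buffer = array();
        }
    }
}
```

The point: a script that never hits the threshold sends no mail at all, while a script that fails delivers one mail containing the full story leading up to the failure.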
If you want to build a log framework yourself, have a look at the
feature descriptions of other log frameworks to get the idea.
On 28.07.2013 12:02, rahul chaudhary wrote:
> Yes, true... but we were also thinking of adding functions to email
> critical logs to the admins. For this I need to have something to search
> all the logs for critical events. Or there is another way: as soon as a
> critical event is generated, send that event in the mail.
> On Sun, Jul 28, 2013 at 5:59 AM, Sven Rautenberg <sven at rtbg.de> wrote:
>> I wonder if it should be the task of this security-centered library to
>> provide a fully sophisticated log framework complete with log searching.
>> There are already tools to efficiently browse logfile output. Either the
>> logs go to syslog and are monitored by automated tools, or there are
>> tools like "Logstash" and "Graylog2" in place that do all the indexing
>> and browsing.
>> Or the logfile is considered small enough that "grep" will be a working
>> tool for the situation.
>> On 28.07.2013 11:15, rahul chaudhary wrote:
>>> Hello All,
>>> In logs,
>>> Logs will be generated and the files will be huge, so searching in log
>>> files would be very heavy.
>>> A typical log entry will look like this:
>>> [message] [WARNING] [HIGH] [filename] [line no] [time and
>>> Now, to make a function that indexes logs according to some element - say
>>> we want all "WARNING" logs.
>>> To do this, I am thinking of creating a function like this:
>>> *findLogs($filename, $orderBy)*, where *filename* would tell the location
>>> of the original log file and *orderBy* would tell by which element you
>>> would like to sort (for now, say "WARNING").
>>> So, this function will create a file inside the temp folder; it will go
>>> through that log file and index all the entries according to *orderBy*.
>>> Then searching in that temp file would be faster.
>>> But the issue is that the temp files can only be used once, as the log
>>> files constantly get updated.
>>> Am I right, or do I need to do something else?
>>> OWASP_PHP_Security_Project mailing list
>>> OWASP_PHP_Security_Project at lists.owasp.org