[Owasp_project_leader_list] [Owasp-leaders] Feedback needed on Project Review tool
enrico.branca at owasp.org
Fri May 30 22:54:14 UTC 2014
Thank you for your email, and yes, we are indeed missing something critical.
We need a name for the tool :D
If this is going to be an OWASP tool for OWASP projects, we think the name
should be decided by the community. Any ideas on how to do this? :)
Also, in the past few days we have received a lot of feedback, and it
seems there is indeed a real need for an automated tool capable of
analyzing code and files and generating reliable statistics.
Wouldn't it be better to create a project for this, so we avoid spamming
everyone on the list and have a space for people interested in
contributing? Or should we put this as a tool under Project Review QA?
Your thoughts?
In terms of feature requests, the majority are related to files and issue
tracking; for the moment we have decided to tackle file analysis, as that
is the easiest part for us.
For each file we are going to extract the following information:
* metadata analysis (mime type, encoding, size, mac times, acl)
* string detection (presence, absence, position, repetition)
* pattern detection (exact and partial match, sequence, occurrence)
* entropy analysis (entropy, chi2, compression, frequency)
* fuzzy hashing (file similarity)
* crypto hashing (file uniqueness)
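As a rough sketch of what the per-file pass could look like (illustrative Python only, not the tool's actual code; the field names are our own):

```python
import hashlib
import math
import os
from collections import Counter

def analyze_file(path):
    """Collect a few of the per-file measurements listed above:
    size and mtime (metadata), Shannon entropy, and a crypto hash.
    The returned field names are illustrative, not a fixed schema."""
    with open(path, "rb") as f:
        data = f.read()
    size = len(data)
    # Shannon entropy in bits per byte; 0.0 for an empty file
    entropy = 0.0
    if size:
        entropy = -sum((n / size) * math.log2(n / size)
                       for n in Counter(data).values())
    st = os.stat(path)
    return {
        "size": size,
        "mtime": st.st_mtime,  # one of the "mac times"
        "entropy": round(entropy, 4),
        "sha256": hashlib.sha256(data).hexdigest(),  # file uniqueness
    }
```

Fuzzy hashing for file similarity would need an extra library (for example ssdeep bindings), so it is left out of this sketch.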
And in terms of global project metrics related to files:
* files added (total, 6/3/1 months, last 7 days)
* files changed (total, 6/3/1 months, last 7 days)
* files unchanged (total, 6/3/1 months, last 7 days)
* files deleted (total, 6/3/1 months, last 7 days)
* file comparison (one to one, many to one, one to many)
* for each commit:
- files changed (added,deleted,rewritten,renamed,modified)
- file content attribution (who modified which part)
- author tracking and action attribution (who did what and when)
* a graph to represent global project activity
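To give an idea of how the per-commit numbers could be pulled out of a repository, here is a rough Python sketch built around `git log --name-status` (the pretty-format string and record fields are our own choices, not a settled design):

```python
import subprocess

def parse_git_log(text):
    """Parse `git log --name-status --pretty=format:@%H|%an|%aI` output
    into one record per commit: author, ISO date, and files grouped by
    status letter (A=added, M=modified, D=deleted, R=renamed)."""
    commits = []
    for line in text.splitlines():
        if line.startswith("@"):
            sha, author, date = line[1:].split("|", 2)
            commits.append({"sha": sha, "author": author,
                            "date": date, "files": {}})
        elif line.strip():
            # rename lines carry "old<TAB>new"; kept as-is in this sketch
            status, _, path = line.partition("\t")
            commits[-1]["files"].setdefault(status[0], []).append(path)
    return commits

def commit_file_changes(repo_path):
    """Run git against a local clone and parse its output."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--name-status",
         "--pretty=format:@%H|%an|%aI"],
        capture_output=True, text=True, check=True).stdout
    return parse_git_log(out)
```

Author tracking and action attribution then fall out of grouping these records by the author field.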
At the moment we are targeting GIT repositories, and we are going to
devise a way to track each file that ever existed in a repository in
order to extract and parse it. The idea is to achieve data granularity
with minimum information loss over time, even if GIT itself deliberately
"forgets" some information.
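One way to do that tracking (a sketch only; `git rev-list --all` and `git ls-tree -r` are standard git plumbing, but this overall approach is just one option we are considering):

```python
import subprocess

def parse_ls_tree(text):
    """Parse `git ls-tree -r <commit>` output lines of the form
    "<mode> <type> <sha>\t<path>" into (blob_sha, path) pairs."""
    pairs = set()
    for line in text.splitlines():
        meta, _, path = line.partition("\t")
        mode, otype, sha = meta.split()
        if otype == "blob":
            pairs.add((sha, path))
    return pairs

def all_blobs_ever(repo_path):
    """Every (blob, path) that appears in any commit on any ref.
    Each blob can then be extracted with `git cat-file blob <sha>`
    and analyzed, even if it was later deleted or rewritten."""
    def run(*args):
        return subprocess.run(
            ["git", "-C", repo_path, *args],
            capture_output=True, text=True, check=True).stdout
    seen = set()
    for commit in run("rev-list", "--all").split():
        seen |= parse_ls_tree(run("ls-tree", "-r", commit))
    return seen
```

Since blobs are content-addressed, the (sha, path) pairs also give file comparison across history essentially for free.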
The other requests we received are the following:
* how many bugs have been reported (total, 6/3/1 months, last 7 days)
* how serious the bugs are (high, medium, low, feature request, ...)
* how quickly bugs are closed (total/average/median/95th percentile)
* for each bug track:
- first touch time
- open time
- average open time
- closure time
- average closure time
- confirmation time
- reopened time
- average reopened time
- time from first touch to close
- time from first touch to reopened
- time from first touch to confirmed
- open/close rate
- close/reopen rate
* how many bugs are in which state (new,open,working,closed,reopen)
* bugs by time (oldest, newest, most/least touched, age)
* a graph for open/closed issues
* measure code duplication
* measure code cyclomatic complexity
* an automated project dashboard
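The closure-timing statistics above all reduce to order statistics over a list of durations, so they are cheap to compute once the tracker data is collected. A minimal sketch (nearest-rank percentile; the input format is our assumption):

```python
import math
import statistics

def close_time_stats(durations):
    """Count/average/median/95th-percentile time-to-close, where
    `durations` is a non-empty list of numbers all in one unit
    (e.g. hours from bug opened to bug closed)."""
    ordered = sorted(durations)
    n = len(ordered)
    # nearest-rank definition of the 95th percentile
    p95 = ordered[min(n - 1, math.ceil(0.95 * n) - 1)]
    return {
        "count": n,
        "average": sum(ordered) / n,
        "median": statistics.median(ordered),
        "p95": p95,
    }
```

The same function would serve first-touch, confirmation, and reopen times, since they only differ in which pair of timestamps the duration is taken from.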
Please review the list and check if there is anything else that needs to
be counted/measured, and we will see what can be done to align the tool
with it.
We are working on files at the moment, and I will send an update as soon
as we have something new on the demo website.
On 28/05/2014 12:54, johanna curiel curiel wrote:
> Hi Enrico
> One of our tools is ohloh.net; I think much of the information and graphs
> provided are a starting point for measuring activity. However, I noticed
> they are not accurate and do not measure activity properly, so ohloh is
> not reliable.
> I'm very glad about this initiative; please let me know how I can
> contribute to push it forward.
> On Tue, May 27, 2014 at 7:43 PM, Enrico Branca <enrico.branca at owasp.org>wrote:
>> That was also our idea, an OWASP tool to measure OWASP projects using
>> metrics defined by OWASP leaders :)
>> Yes, it could be easily automated depending on the code repository.
>> At the moment we are using github and we are looking into Python code;
>> for other services we will need to check each API and build a proper
>> client to parse the data, but again doable once a scope is defined.
>> And for the dashboard we will definitely need help, as I am sure you
>> have noticed that our pages are far from pretty and there is almost no
>> graphic design to it. If you have ideas on which kind of data is needed
>> to populate a dashboard we can work to make one for projects using
>> github and once we have a proof of concept we can see how to develop it.
>> So if you have examples or screenshots of dashboards you like we can
>> look into it and decide graphs and data visualization. We are open to
>> ideas and together we can try to build the ideal OWASP dashboard. If
>> people are willing to help we can give it a try. ;)
>> On 28/05/2014 01:18, Josh Sokol wrote:
>>> I absolutely love the idea of using an OWASP tool to measure the quality
>>> of other OWASP tools. Could we scale this to be able to automatically run
>>> periodic assessments (ideally nightly) for all code-based OWASP projects?
>>> Creating an automated dashboard would be so so so amazingly awesome. Thank
>>> you for your efforts!
>>> On Tue, May 27, 2014 at 6:03 PM, Tobias <tobias.gondrom at owasp.org> wrote:
>>>> Hi Enrico and team,
>>>> thanks a lot.
>>>> I think these are quite interesting stats and could be useful as
>>>> one of the data points in our overall project monitoring.
>>>> One question: is it automated enough to maintain this stat tool across
>>>> several projects without too much effort for you guys? ;-)
>>>> Cheers, Tobias
>>>> On 27/05/14 23:30, Enrico Branca wrote:
>>>>> To contribute to the community effort on project rating and quality
>>>>> assurance, we at the "OWASP Python Security Project" have decided to
>>>>> support it by building a tool to collect quantitative data.
>>>>> Reference: "Project Reviews Quality Assurance approach"
>>>>> This tool will be able to generate as much data as needed by scanning
>>>>> github repositories and analysing files, allowing customization of
>>>>> metrics, reports and also of data sources.
>>>>> We have run the tool against our project repository and generated some
>>>>> statistics expressed as text and tables; in the future there will
>>>>> be graphs and infographics as needed.
>>>>> DEMO SITE --> http://www.pythonsecurity.org/stats
>>>>> Is the data produced useful?
>>>>> Did we miss anything critical?
>>>>> Anything wrong that has to be removed?
>>>>> Ideas on what needs to be added or changed?
>>>>> We are not really experts on software metrics and we are open to new
>>>>> ideas; any feedback or criticism is welcome and warmly encouraged.
>>>>> OWASP-Leaders mailing list
>>>>> OWASP-Leaders at lists.owasp.org