[Owasp-leaders] Feedback Documentation Evaluation Proposal

johanna curiel curiel johanna.curiel at owasp.org
Mon Aug 18 14:02:19 UTC 2014


Hi Gary and everyone who gave feedback on this,

Based on your comments, I have summarized the most important objections to
some of the criteria:

*Clarifications*
*Relevance to the audience:*
I see that some people have confused the 'Relevance to the audience'
criterion, where I proposed the use of hit counts, with 'relevance to the
subject'. The number of hits (views) is just an indicator of the subjects
people are reading most on OWASP pages. It does not mean that if people are
not reading your project page, your project gets fewer points. Example: SQL
injection is the number one most viewed topic; if your project handles this
subject, it means the project covers a topic that is very relevant to the
public looking for information about it. It is an indicator, but not a main
factor in promoting a project.
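
As an illustration only (the weighting below is my own assumption, not part
of the proposal), here is a minimal Python sketch of how view counts could
act as a small bonus on top of the quality criteria rather than as a gate:

    # Illustrative sketch: hit count as a minor indicator, not a main factor.
    # The 10% weight and the max_views cap are hypothetical.

    def project_score(quality_points, monthly_page_views, max_views=50000):
        """Quality criteria dominate; page views add at most a 10% bonus."""
        view_bonus = min(monthly_page_views / max_views, 1.0) * 0.10
        return quality_points * (1.0 + view_bonus)

    # A project nobody views yet still keeps its full quality score:
    print(project_score(quality_points=80, monthly_page_views=0))      # 80.0
    # A heavily viewed project gets only a modest boost:
    print(project_score(quality_points=80, monthly_page_views=50000))  # 88.0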

*Accessibility:*
With this criterion we mean that the document should be easily available to
internet users, whether as a wiki, a downloadable PDF, or a text file, but
without people needing to fill in forms before they can download the document
or read it right away. Also, if the project leaders decide to make the
document available through a repository such as GitHub or the wiki, they are
being proactive about maintenance and version control. That is all this
criterion involves, so please do not 'extend' the interpretation of the word
'accessible'; there is no need to make it more complex than what is described
here.

*Formatting and branding:*
Project leaders and volunteers are not forced to deliver an 'OWASP branded'
document. Once their documentation has reached a certain status, such as
flagship, it can be rebranded by the OWASP graphic designer. I admit that
this costs money, and we will probably need to set a budget for documentation
to be rebranded and laid out like that. But please keep in mind that this is
not an obligation for becoming Lab or Flagship; it is just a proposal to set
a budget for these projects so they are all nicely branded in an OWASP
format.


*Objections*


   1. *Using automated tools to check plagiarism/grammar:* Objections:
   trusting the results, difficulty applying them to OWASP documents, etc.

The use of automated tools does not mean that we will not analyze the
results. All results will be properly analyzed, and once that is done we will
first discuss them with the project leaders. What we want to avoid is people
copy/pasting from existing copyrighted books. These tools are capable of
showing the source. If we notice a copy/paste from an existing OWASP
document, then we need to verify who wrote it first. Keep in mind this will
be used just to avoid copyright infringement, where people copy/paste
existing books and literature from authors (especially Safari/O'Reilly), and
this is quite easy to verify. That is the main focus, not whether people
copy/paste existing OWASP docs or other open source documentation. That just
makes a document less original, but not per se illegal.
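
For illustration only (not part of the proposal, and real checkers also
search the web for sources), here is a minimal Python sketch of how such a
tool can flag overlapping passages, using simple word n-gram overlap between
a submitted document and a known text; the file names and the 20% threshold
are my own assumptions:

    # Illustrative sketch: naive plagiarism detection via word n-gram overlap.

    def ngrams(text, n=8):
        """Set of all n-word phrases in the text, lowercased."""
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap_ratio(submission, reference, n=8):
        """Fraction of the submission's n-grams also found in the reference."""
        sub, ref = ngrams(submission, n), ngrams(reference, n)
        return len(sub & ref) / len(sub) if sub else 0.0

    draft = open("draft.txt").read()              # hypothetical file names
    source = open("known_source.txt").read()
    if overlap_ratio(draft, source) > 0.20:
        print("Possible copied passages; review manually, then contact the author.")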

   - *Accessibility*: different interpretations of this criterion:

I hope the clarifications above have made this clear.

   - *Formatting and branding*: are projects forced to use it?

No project is forced to use branding. OWASP as an organization will brand,
and pay for branding, the documents once they reach flagship status.

   - *Open source*: if the researchers or writers use proprietary data they
   cannot disclose, this should not be a problem, nor should this data have
   to be open source.

This is about trust. If researchers cannot disclose the source of their data,
you as a reader are the one judging whether the results presented in the
document are valid. I personally don't think we can include this criterion,
though yes, undisclosed data makes a document harder to trust. For this
reason, the people who gave feedback agreed that the 'open source' criterion
should not be included.

*Final comment:*

Please keep in mind that OWASP does not have staff to review projects at a
high level. We are trying to create criteria that cover at least a minimum
level of quality; our credibility and reputation depend on how up to date and
well written these docs are. People take OWASP very seriously, and if we do
not take care of the quality of our projects we lose reputation and
credibility. I have met various security experts (big names) who gave me
heavy, and not that positive, feedback on OWASP documentation. We want to be
the best we can; let's think of ways to make this possible, not ways to make
it impossible ;-P. *Don't focus on the problem, focus on the solution.*

On Sun, Aug 17, 2014 at 5:54 AM, Gary Robinson <gary.robinson at owasp.org>
wrote:

> Hi Johanna,
>
> Thanks for pushing this discussion, as we discussed at AppSec EU the
> accuracy (correctness) of the documentation content is something we have to
> get right to maintain our credibility.  I think we have to 'see the wood
> for the trees' here: we within OWASP know about lab, flagship, incubator,
> etc. levels of project (documents or wiki pages), but the people viewing our
> content do not. They just google for a subject, see an OWASP wiki page,
> and view it or download it - if that document is inaccurate (or just plain
> wrong because it's not been reviewed) they won't reason "Oh well, it was
> just a lab project"; they will lose faith in OWASP.  This is my worry.
>
> Question: Should a wiki page/document be put on the OWASP site (and
> therefore available for download, representing OWASP) if it hasn't been
> checked for basic accuracy?
>
> I believe we need to address the correctness/accuracy points earlier,
> before a document becomes a link on the OWASP site.  Another option would
> be for incubator/lab documents/pages to have a large warning, in plain
> English, stating that the page/document is not finished (or beta form).
>
> Specific comments on your proposal document:
>
> * I really like the point on page 3 about the maintainability of the
> project, I'd like to see this explored and rolled out.
> * Agree that it's hard to find good reviewers who can spend time
> reviewing... it would be good to put some thought cycles into finding a
> solution; is the Summit an option?
> * Credibility and Accuracy are mentioned in bullet points in the 1st
> phase, but not really addressed until the last phase; in terms of the
> readability of this proposal document, I found this misleading.
> * For the second phase, can you put more wording/description in, as I'm not
> sure I see how evaluating which pages/docs get the most hits indicates
> credibility - I know of the most popular newspapers in the UK, but I
> wouldn't hold them up as a beacon of accuracy and credibility.
> * In the third phase there's a sentence "In order to measure this part we
> will execute a research to find discussions of readers about the document"
> - I'm not sure what this means or how it measures accuracy?
> * For relevance to subject we're measuring the page hits; again I'm not
> sure how this measures relevance.  If I write a wiki page on "Risk of
> writing awkward code for maintainability" I'll get some hits; if I name
> that article after some sex scandal with a pop/movie star it'll get a lot
> more hits.  Shouldn't relevance be a subjective measure by OWASP
> members/experts rather than a popularity contest?  The Code Review Guide
> will never win the X-Factor, but it's relevant to computer security.
> * Page 4 mentions "top level contributors", can you expand on what this
> means?
>
> Thanks again for putting this together, I think it's important!
>
> Gary
>
>
> On Fri, Aug 15, 2014 at 8:50 AM, Munir Njiru <munir.njiru at owasp.org>
> wrote:
>
>> Hi Johanna,
>> Just got to read the draft, and most of the points seem valid, especially
>> the part about us seeming a bit inadequate in that respect. It is actually
>> a good proposal which, if we put it to good use, can make things richer in
>> information, awareness, and proper benchmark standards.
>>
>>
>> On Fri, Aug 15, 2014 at 12:42 AM, johanna curiel curiel <
>> johanna.curiel at owasp.org> wrote:
>>
>>> Well, only 3 people actually gave feedback on this proposal:
>>> 2 agreed, 1 disagreed.
>>>
>>> Participation is clearly low, and that worries me since, let's keep in
>>> mind, we are trying to develop a methodology to review document projects.
>>>
>>> People, keep in mind that *no staff is reviewing projects at OWASP*,
>>> just a few volunteers like me, and maybe 2 more occasionally. Staff member
>>> Kait-Disney has helped us clear the wiki of inactive projects, but
>>> reviewing projects, for instance so that they at least build or have a
>>> working open source repository, has been done by a couple of volunteers.
>>>
>>> It seems like the community has no interest in having a proper review
>>> mechanism?
>>>
>>> Does it make sense for me to keep pushing project reviews when it seems
>>> that only 3 people want them?
>>>
>>>
>>> On Sun, Aug 10, 2014 at 6:37 PM, johanna curiel curiel <
>>> johanna.curiel at owasp.org> wrote:
>>>
>>>> Hi Larry
>>>> Thank you for the feedback. I think others agree with you regarding the
>>>> OWASP Top Ten, and that's why we need people providing feedback.
>>>>
>>>> First of all, the proposal is a combination of things I researched, but
>>>> it also includes other members' ideas and comments.
>>>> I just want to clarify that the proposal does not contain only my
>>>> ideas.
>>>>
>>>> These are my clarifications regarding your comments:
>>>>
>>>>
>>>> On Sunday, August 10, 2014, Larry Conklin <larry.conklin at owasp.org>
>>>> wrote:
>>>>
>>>>> Johanna, thanks for putting this together. I think as a draft this is
>>>>> a good start. My suggestion would be to scale back the scope and
>>>>> implement it in a step-by-step approach.
>>>>>
>>>>> What is going to be your approach to evaluating feedback?
>>>>>
>>>>> It doesn't look like people are really commenting on this. This
>>>>> is an important proposal that needs feedback from the entire community.
>>>>> Below is my feedback on this proposal.
>>>>>
>>>>> Larry Conklin, CISSP
>>>>>
>>>>> ----------------------------------------------------------------------
>>>>>
>>>>> *Qualitative and Quantitative Content Audit Feedback*
>>>>>
>>>>>
>>>>>
>>>>> I would like to divide my feedback into two sections. The second
>>>>> section is more general feedback on the Content Audit proposal, and the
>>>>> first is more about what is and is not a flagship project, centered on
>>>>> the OWASP Top Ten Project.
>>>>>
>>>>>
>>>>>
>>>>> I have read several emails concerning the OWASP Top Ten project, saying
>>>>> that the project does not release enough of its own and vendor-supplied
>>>>> data, and that it takes away from other good projects. The second
>>>>> concern I find to be contrary to common sense
>>>>> <http://ninjawords.com/common%2520sense>. I see the OWASP Top Ten
>>>>> mentioned on different web sites and in PDF research papers and books (
>>>>> http://pluralsight.com/training/Courses/TableOfContents/web-security-owasp-top10-big-picture,
>>>>> The Tangled Web: A Guide to Securing Modern Web Applications, Systematic
>>>>> Techniques for Finding and Preventing Script Injection Vulnerabilities),
>>>>> just to mention a few references. All of these references give the
>>>>> reader a chance to learn more about OWASP and the different projects
>>>>> inside of OWASP. I have even seen a SANS presentation that talked about
>>>>> ZAP as a tool to find injection issues, where the main subject of the
>>>>> presentation was the OWASP Top Ten.  My company uses the OWASP Top Ten
>>>>> as part of its annual secure coding standards for all developers.
>>>>> Something so prevalent and mentioned so many times has to be good for
>>>>> OWASP, if for nothing else than to make application security relevant,
>>>>> open, and visible. If the OWASP Top Ten were off the mark, it simply
>>>>> would not be referenced as many times as it is. So for that reason, no
>>>>> matter what the Content Audit policy is, I feel strongly that OWASP
>>>>> needs to keep the OWASP Top Ten as a flagship project. On the issue of
>>>>> publishing the data that the OWASP Top Ten gathers, both vendor-supplied
>>>>> and privately gathered, I feel I don't have enough information to make
>>>>> an intelligent decision or comment.
>>>>>
>>>>>
>>>>>
>>>>> On a more general note about the proposed content audit policy, I do
>>>>> have some comments.
>>>>>
>>>>>
>>>>>
>>>>> *Preliminary*
>>>>>
>>>>>
>>>>>
>>>>> 1.     On “Qualitative content audit methodology by Martin &
>>>>> Hannington”: this is a book that covers everything, including eye
>>>>> tracking of where a user looks in a document.  I would like to
>>>>> understand exactly what should be in this policy without having to
>>>>> reference a book. Maybe we could reference the exact sections we want
>>>>> to include instead of the entire book.
>>>>>
>>>> Yes, I will give you the exact pages.
>>>>
>>>>>  2.     The document says it will include automated systems. What
>>>>> systems? OWASP ones, third-party applications, etc.?
>>>>>
>>>> Grammar checking and plagiarism detection.
>>>>
>>>>>  3.     Plagiarism checker? Which one, and who is paying for it: OWASP
>>>>> and/or the project? This can get costly. If the code review team takes
>>>>> content out of a cheat sheet and puts it into the Code Review Guide, is
>>>>> this plagiarism? Plagiarism is about ownership of content. I don't want
>>>>> us to get into an ownership battle. Consider the turn of events: if
>>>>> someone takes content out of the Code Review Guide, publishes it in a
>>>>> research paper or blog, and then we update the Code Review Guide and run
>>>>> the plagiarism checker, how would this situation be resolved? Remember,
>>>>> everything OWASP does should be open license, hence free to copy. Free
>>>>> copying causes plagiarism false positives.
>>>>>
>>>> OWASP will pay for this plagiarism checker. Keep in mind that we as
>>>> reviewers use these tools, but in the end we still need to check the
>>>> results and confirm the information.
>>>>
>>>>> 4.     Open source: there are many open source licenses (
>>>>> http://opensource.org/licenses), Apache, GNU, etc. Are we saying all
>>>>> projects have to use the same open source license? This is something I
>>>>> would encourage. It just makes life easier.
>>>>>
>>>> So far, project leaders decide what kind of license they want to
>>>> use. Maybe something to think about for the future.
>>>>
>>>>>  5.     Be careful about using the word “Accessibility” (
>>>>> http://en.wikipedia.org/wiki/Web_accessibility). Where I work, we have
>>>>> to make sure our public-facing web sites meet the US accessibility
>>>>> criteria, and I really don't want to go there with OWASP. Oh, and please
>>>>> don't use yellow in your documents. People like me who are colorblind
>>>>> find it very inaccessible.  Funny but true.
>>>>>
>>>>> a.     Mailing lists/feedback? So we want the public to use the wiki
>>>>> for a feedback form, or Google Docs? Download content from the wiki,
>>>>> then go to Google Docs to fill in a feedback form? I would prefer one
>>>>> feedback form where a user can select a project from a drop-down list,
>>>>> along with a few other common questions. The same for mailing lists:
>>>>> allow one mailing list that can be used for every project to provide
>>>>> feedback, instead of a mailing list for each project.
>>>>>
>>>> Right now, feedback per project can be provided through Open Hub,
>>>> formerly known as Ohloh. I prefer this system since it is easier to
>>>> administer.
>>>>
>>>>>  6.     I think I need a simple check-off list that shows, for each
>>>>> task, what points a project receives.
>>>>>
>>>> Agree
>>>>
>>>>>
>>>>>
>>>>> *Second phase.*
>>>>>
>>>>> OK, this doesn't need to be in a proposal; it just sounds like
>>>>> something we should do.
>>>>>
>>>>>
>>>>>
>>>>> *Third phase. *
>>>>>
>>>>> 1.     Relevance: so a project gets released, and then based on
>>>>> popularity (downloads) it gets flagship status. Isn't that what some
>>>>> people are complaining about with the OWASP Top Ten?
>>>>>
>>>> That is not the only criterion for becoming flagship. As explained in
>>>> the proposal, the project needs to meet various requirements.
>>>>
>>>>>  2.     Formatting/Branding: sounds good, but this isn't easy.
>>>>> Shouldn't this be another project? I suspect you want something like
>>>>> the O'Reilly “Head First” series.  I think there are a lot of good
>>>>> suggestions in this proposal, but it suffers from project creep. I
>>>>> strongly suggest we start small and grow the content policy. This would
>>>>> need to be available to all projects before it can become part of any
>>>>> criteria used to judge a project.
>>>>>
>>>>
>>>> Agree; this is more a matter of budget and money.
>>>>
>>>>>
>>>>>
>>>>> 3.     Hiring experts. Isn't that us? I understand the intent, but
>>>>> this language can be very offensive to the community.
>>>>>
>>>>>
>>>>
>>>>> IEEE hires experts to review documentation because people won't do it
>>>>> for free. This is my point. The reviewers can be from our community,
>>>>> but if no one takes the time to review, then how are we going to do
>>>>> this?
>>>>>
>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Aug 3, 2014 at 8:56 PM, johanna curiel curiel <
>>>>> johanna.curiel at owasp.org> wrote:
>>>>>
>>>>>> Leaders
>>>>>>
>>>>>> The following attachment contains a proposed evaluation methodology
>>>>>> for OWASP documentation projects.
>>>>>>  After brainstorming with some community members, we have set out this
>>>>>> idea in the attached draft document.
>>>>>>
>>>>>> We strongly encourage all project leaders to go through and read the
>>>>>> document.
>>>>>>
>>>>>> Also, I have added a feedback form. Your opinion counts. We are
>>>>>> trying to develop a methodology and this is a proposal. Your input
>>>>>> would be valuable for the final version.
>>>>>>
>>>>>>
>>>>>> Access to the form:
>>>>>>
>>>>>> https://docs.google.com/a/owasp.org/forms/d/1VlIFrGxogpuy_Sb-wsbXI3ToQZJw-5fQgZFy09sJQ00/viewform?c=0&w=1&usp=mail_form_link
>>>>>>
>>>>>>
>>>>>>
>>>>>> Cheers
>>>>>>
>>>>>> Johanna Curiel
>>>>>> Lead Project Task Force
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>
>>>
>>>
>>
>>
>> --
>> Munir Njenga,
>> OWASP Chapter Leader (Kenya) || Information Security Consultant ||
>> Developer
>> Mob   (KE) +254 (0) 734960670
>>
>> =============================
>> Chapter Page: www.owasp.org/index.php/Kenya
>> Email: munir.njiru at owasp.org
>> Facebook: https://www.facebook.com/OWASP.Kenya
>> Mailing List: https://lists.owasp.org/mailman/listinfo/owasp-Kenya
>>
>>
>>
>>
>
>

