[Owasp-board] 2016 ideas

Josh Sokol josh.sokol at owasp.org
Tue Nov 24 22:55:25 UTC 2015


This sounds like a reasonable approach to me. As long as there is a
reasonable cap on the bounty program, I would likely vote in favor of it.

~josh

On Tue, Nov 24, 2015 at 4:06 PM, johanna curiel curiel <
johanna.curiel at owasp.org> wrote:

> Hi Michael, Jim
>
> Thank you for the support.
> Some highlights from the discussion which I think are important to make
> sure we implement later:
>
>
>    - Create a draft plan with the project leader
>    - Do a static code analysis in SWAMP or a similar tool
>    - Verify the results with the project leader
>    - If no major issues are found, deploy the library in a dummy site to
>    test the scoped issue(s) (a rough sketch of such a test endpoint follows
>    below)
>    - Do a vulnerability scan using ZAP/Burp to see if they are able to
>    catch the scoped vulnerability (in this case a CSRF attack)
>    - If no bug is found, proceed to set up a 'scoped' bounty program for
>    finding specific CSRF vulnerabilities (crypto bypass / missing server-side
>    CSRF validation)
>    - Run the bounty for a maximum period of 3 months, or less if the
>    scoped issues are found sooner
>    - If an issue is found, we proceed to log it in the project's GitHub
>    - The leader must post a warning about the bug on the project's wiki
>    and GitHub pages
>    - Allow the leader to work on fixing the issue for a maximum period of
>    3 months
>    - If the issue is not fixed by then, the project should be set as
>    inactive or 'in progress' ==> either way, users must be made aware of the
>    security issues
>    - In case the library proves 'unbreakable' after 3 months, no money has
>    been spent in this period and we have effectively QA'd the library until
>    a further new version
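>
> To make the dummy-site step concrete, here is a minimal sketch of the kind
> of state-changing endpoint we could deploy behind CSRFGuard for the scoped
> testing. The servlet name and parameters are purely illustrative, and the
> CSRFGuard wiring itself (its usual filter and properties configuration) is
> assumed rather than shown:
>
>     // Hypothetical state-changing endpoint for the dummy test site.
>     // If the scoped CSRF protections fail (predictable token, missing
>     // server-side validation), a researcher should be able to drive
>     // this POST from an attacker-controlled page.
>     import java.io.IOException;
>     import javax.servlet.http.HttpServlet;
>     import javax.servlet.http.HttpServletRequest;
>     import javax.servlet.http.HttpServletResponse;
>
>     public class TransferServlet extends HttpServlet {
>         @Override
>         protected void doPost(HttpServletRequest req, HttpServletResponse resp)
>                 throws IOException {
>             // Deliberately simple "money transfer" action used only as a
>             // CSRF target; it echoes the parameters so a tester can see
>             // whether a forged request went through.
>             String account = req.getParameter("account");
>             String amount = req.getParameter("amount");
>             resp.setContentType("text/plain");
>             resp.getWriter().println("Transferred " + amount + " to " + account);
>         }
>     }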
>
> My idea is to support and execute the management of the bounty, including
> all the technical work involved in this. Jim has volunteered to support
> this process too, and all of you are invited to provide feedback.
>
> All results of all vulnerability scans will be published through the
> research web portal (right now jowasp.github.io).
> Updates will be published through the mailing lists (community/leaders) and
> we will ask for this to be placed in the Connector too.
>
>
>
> regards
>
> Johanna
>
> On Tue, Nov 24, 2015 at 5:03 PM, Jim Manico <jim.manico at owasp.org> wrote:
>
>> Awesome. I really appreciate this, Michael. I'll do my best to do the same
>> when I'm on the other side of the coin.
>>
>> Aloha,
>> --
>> Jim Manico
>> Global Board Member
>> OWASP Foundation
>> https://www.owasp.org
>> Join me in Rome for AppSecEU 2016!
>>
>> On Nov 24, 2015, at 3:51 PM, Michael Coates <michael.coates at owasp.org>
>> wrote:
>>
>> This is a long thread so I may have missed a few items. But, I still
>> support experimenting with this on a small scale. Let's try new things in a
>> reasonable way, learn and iterate.
>>
>> If we scope it appropriately and have project buy-in, let's try it!
>>
>> On Monday, November 23, 2015, johanna curiel curiel <
>> johanna.curiel at owasp.org> wrote:
>>
>>> Josh
>>>
>>> >>A "pilot" means that you are testing something on a smaller scale to
>>> see if it works at a larger scale.  If you are asking for funds for
>>> CSRFGuard to check if crypto was implemented properly, and have agreement
>>> from the project leaders to fix any findings, then that sounds like money
>>> well spent, but it is not a "pilot" because the rules of engagement do not
>>> scale.
>>>
>>> Yes, they do scale. The idea is that if it works for ONE library, then we
>>> can make it work for MORE libraries.
>>> Running the pilot on ONE project is a way to test how effectively we can
>>> run a bounty program as *exhaustive testing* for security libraries, and
>>> maybe consider it for OTHER security libraries. Of course, conditions
>>> should apply for participation in all cases.
>>>
>>> All I hear is a lot of critique and no constructive arguments on how
>>> this should be run. Automated scans are done, so what else?
>>>
>>> You mentioned there should be automated vulnerability scans. I even
>>> attached the ones we did last year and one done by the leader himself. He
>>> has been actively making fixes and working with volunteers. If you read the
>>> report, there were no major issues found by the automated Checkmarx tool.
>>> Code Dx had false positives.
>>>
>>> If you read my other emails, I also laid out, step by step, how this
>>> process should be handled.
>>>
>>> As Jim mentioned: this is not a "high volume bounty program" like a
>>> traditional bug bounty program. This is targeted at a small number of
>>> specific issues that we want researchers hunting for in a small number of
>>> defensive libraries.
>>>
>>> The budget is limited; it is just a way to test and confirm that the
>>> CSRFGuard library has been tested extensively. Any security specialist
>>> knows that trusting an automated vulnerability scan for this is wrong.
>>>
>>> I have taken the time to write a detailed DRAFT proposal, and the idea is
>>> to work out a solution for effectively testing FLAGSHIP and LAB security
>>> libraries, not just running some 'automated scans' with open source tools.
>>>
>>> If OWASP sells security but we don't even support more rigorous QA
>>> testing of the security libraries we so proudly expose as Flagship
>>> projects, then it makes no sense to do these kinds of projects. If you have
>>> a better, cheaper and effective way to test security libraries, please
>>> feel free to create a proposal. We all know that automated scans are
>>> second best to human testing:
>>> http://www.acunetix.com/blog/articles/scanning-vs-pen-testing/
>>>
>>> The idea is to have more extensive testing, not per se to run a bounty
>>> program. A bounty program on HackerOne or a similar platform is just a
>>> management tool and *a cheaper way to pay for exhaustive QA testing of
>>> these libraries*. This is the purpose of the pilot; a pilot is, first of
>>> all, A STUDY or EXPERIMENT.
>>>
>>> REPORTS OF THE AUTOMATED VULN SCANS ARE ATTACHED
>>>
>>> *A Pilot program, also called a feasibility study or experimental trial,
>>> is a small-scale, short-term experiment that helps an organisation learn
>>> how a large-scale project might work in practice.*
>>>
>>> Regards
>>>
>>> Johanna
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Nov 23, 2015 at 8:45 PM, johanna curiel curiel <
>>> johanna.curiel at owasp.org> wrote:
>>>
>>>> Matt , Josh
>>>>
>>>> I just want to remind you that this is a pilot, and at the same time it
>>>> can help us check the actual issues with the libraries.
>>>> I think you do not need to agree to invest in 3 projects; 1, such as
>>>> CSRF Guard, will be OK.
>>>>
>>>> It is narrow, it has a very defined scope and it will only be for
>>>> specific issues found.
>>>>
>>>> The purpose is to verify the actual situation of the libraries that
>>>> participate, and I don't think this effort will be wasted; on the contrary.
>>>>
>>>> Again, doing security scans with SWAMP tools does not verify the
>>>> functional security part of the library, as these SAST/DAST tools only
>>>> find bugs in code, with a lot of false positives.
>>>>
>>>> I'm attaching the scans and reports we did for CSRFGuard, and the scan
>>>> done by Azzedine using Checkmarx last year. I think CSRFGuard will
>>>> benefit quite a lot from a Bountysource program, and so will OWASP, by
>>>> verifying that its flagship tools are the best they can be.
>>>>
>>>> Regards
>>>>
>>>> Johanna
>>>>
>>>> On Mon, Nov 23, 2015 at 7:19 PM, Jim Manico <jim.manico at owasp.org>
>>>> wrote:
>>>>
>>>>> Matt,
>>>>>
>>>>> I just want to say, one more time, that you, Josh and Michael have
>>>>> expressed big concerns about an open bounty program. I agree with all of
>>>>> your concerns. It would be a huge waste to do such a thing.
>>>>>
>>>>> Thanks very kindly for reconsidering this in the context of a
>>>>> "narrowly scoped bounty program".
>>>>>
>>>>> Aloha,
>>>>> --
>>>>> Jim Manico
>>>>> Global Board Member
>>>>> OWASP Foundation
>>>>> https://www.owasp.org
>>>>> Join me in Rome for AppSecEU 2016!
>>>>>
>>>>> On Nov 24, 2015, at 1:15 AM, Matt Konda <matt.konda at owasp.org> wrote:
>>>>>
>>>>> Jim,
>>>>>
>>>>> I have read the doc. It wasn't at all clear to me what the full scope
>>>>> of the bounties would be; I thought those were just a few examples.
>>>>>
>>>>> I'll stew on it some more ...
>>>>>
>>>>> Matt
>>>>>
>>>>>
>>>>> On Mon, Nov 23, 2015 at 4:42 PM, Jim Manico <jim.manico at owasp.org>
>>>>> wrote:
>>>>>
>>>>>> > Otherwise, you end up paying bounties for things that are really
>>>>>> easy to find and look bad along the way.
>>>>>>
>>>>>> Fair comments, Matt.
>>>>>>
>>>>>> Please read Johanna's doc. The various bounties I'm suggesting are
>>>>>> very narrowly focused and scoped.
>>>>>>
>>>>>> I agree with your comments if we were to conduct an "open bounty" but
>>>>>> that is not what is being scoped.
>>>>>>
>>>>>> For example, for CSRF Guard I suggest we bounty:
>>>>>> 1) Is the token generation using weak crypto (predictable, etc.)? (a
>>>>>> rough illustration follows below)
>>>>>> 2) Is the server-side token verifier failing to properly verify requests
>>>>>> on an active and properly configured endpoint?
>>>>>>
>>>>>> etc.
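>>>>>>
>>>>>> To make item 1 concrete, here is a rough, purely illustrative sanity
>>>>>> check of the kind a researcher might start from. generateToken() is a
>>>>>> hypothetical stand-in for the library's real token generator, and
>>>>>> passing this check is necessary but nowhere near sufficient evidence
>>>>>> that the tokens are unpredictable:
>>>>>>
>>>>>>     import java.security.SecureRandom;
>>>>>>     import java.util.Base64;
>>>>>>     import java.util.HashSet;
>>>>>>     import java.util.Set;
>>>>>>
>>>>>>     public class TokenSanityCheck {
>>>>>>         private static final SecureRandom RNG = new SecureRandom();
>>>>>>
>>>>>>         // Hypothetical stand-in for the token generator under test.
>>>>>>         static String generateToken() {
>>>>>>             byte[] buf = new byte[32];
>>>>>>             RNG.nextBytes(buf);
>>>>>>             return Base64.getEncoder().encodeToString(buf);
>>>>>>         }
>>>>>>
>>>>>>         public static void main(String[] args) {
>>>>>>             // Generate a large batch of tokens and flag any repeats;
>>>>>>             // a real analysis would also look at length and entropy.
>>>>>>             Set<String> seen = new HashSet<>();
>>>>>>             for (int i = 0; i < 1_000_000; i++) {
>>>>>>                 if (!seen.add(generateToken())) {
>>>>>>                     System.out.println("Collision after " + i + " tokens");
>>>>>>                     return;
>>>>>>                 }
>>>>>>             }
>>>>>>             System.out.println("No collisions in 1,000,000 tokens");
>>>>>>         }
>>>>>>     }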
>>>>>>
>>>>>> By working together with project owners and security leaders, we
>>>>>> should be able to define very clear rules of play so our financial
>>>>>> expenditures on these issues are clear and useful.
>>>>>>
>>>>>> --
>>>>>> Jim Manico
>>>>>> Global Board Member
>>>>>> OWASP Foundation
>>>>>> https://www.owasp.org
>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>
>>>>>> On Nov 23, 2015, at 5:44 PM, Matt Konda <matt.konda at owasp.org> wrote:
>>>>>>
>>>>>> I have also run bounty programs and I would take what Michael said
>>>>>> and go a bit further.
>>>>>>
>>>>>> I am skeptical of this approach for OWASP at this time.  Generally, I
>>>>>> support bounty programs when an organization or project is *mature*
>>>>>> and basic controls have already been applied.  This implies that tools have
>>>>>> been run, code has been reviewed and there are things in the SDLC that
>>>>>> allow you to respond in a controlled manner (e.g. regular releases,
>>>>>> resources available for quick response and a solid bug triage process).
>>>>>>
>>>>>> Otherwise, you end up paying bounties for things that are really easy
>>>>>> to find and look bad along the way.  Also, we'll spend a lot of time
>>>>>> dealing with communications and researchers who are only sometimes good to
>>>>>> work with.  I believe that if we spent the time (and money) on the project
>>>>>> controls instead of communicating with researchers, we would likely have
>>>>>> better initial results.
>>>>>>
>>>>>> I don't have access to edit or comment upon the doc, but I would
>>>>>> definitely push for:
>>>>>>
>>>>>>    - Limits on scanner findings
>>>>>>    - Limits on DoS category findings
>>>>>>
>>>>>> In summary, I think there are alternative approaches that we should
>>>>>> take on before doing a bounty program - including systematically engaging
>>>>>> with volunteers to do security audits, engaging with vendors to get their
>>>>>> feedback, and building security checkpoints into the project maturity
>>>>>> process.
>>>>>>
>>>>>> I applaud the general goal and aspiration of ensuring rigorous
>>>>>> security in our projects.
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>> On Mon, Nov 23, 2015 at 7:15 AM, johanna curiel curiel <
>>>>>> johanna.curiel at owasp.org> wrote:
>>>>>>
>>>>>>> Hi All,
>>>>>>>
>>>>>>> The idea is to finalize working out the scoped items for each
>>>>>>> project. As mentioned in the proposal, I think it is a good idea to take 3
>>>>>>> projects (Flagship, Lab, Incubator) and work through a series of items for
>>>>>>> each one.
>>>>>>>
>>>>>>> With Jim's experience in defender libraries, we can help define clear
>>>>>>> scopes, and with the project leaders' input we need to define the most
>>>>>>> important issues that would make the libraries weak on protection.
>>>>>>>
>>>>>>> A set of steps here:
>>>>>>>
>>>>>>>    - Finalise the items within the scope for each project that is
>>>>>>>    part of the pilot
>>>>>>>    - Touch base with the project leaders regarding their
>>>>>>>    availability for feedback
>>>>>>>    - Deploy a couple of websites (simple ones) that have the
>>>>>>>    libraries configured
>>>>>>>    - Deploy on a VM server and host a couple of domains for the
>>>>>>>    researchers to test, for example:
>>>>>>>       - crsfguard.owasp.org
>>>>>>>       - xxxx.owasp.org
>>>>>>>    - Set up and configure an account on hackerone.com
>>>>>>>    - Set up and configure an account on bountysource.com
>>>>>>>
>>>>>>> From there on it is management of the process, as researchers will
>>>>>>> submit issues and we need to re-test and validate them. For this part:
>>>>>>>
>>>>>>>    - Researcher submits an issue
>>>>>>>    - Retest and verify the severity (Jim, me, project leader)
>>>>>>>    - Exchange feedback with Jim and the project leaders regarding the
>>>>>>>    severity of the issue; all reports are available in HackerOne as the
>>>>>>>    hacker/researcher submits the issue, and if possible more people should
>>>>>>>    retest to confirm the bug
>>>>>>>    - Determine together whether a bug has indeed been confirmed (Jim,
>>>>>>>    the project leader and me)
>>>>>>>    - Once confirmed, log the bug in the project's GitHub
>>>>>>>    - Make the payment to the researcher with the approval of 3 members
>>>>>>>    (Jim, the project leader and me)
>>>>>>>    - Publicise the bug through the project wiki page and the GitHub
>>>>>>>    issues section; if it is high risk, the leader will have to set a
>>>>>>>    warning on the GitHub repository
>>>>>>>    - The project leader must provide feedback on his availability, and
>>>>>>>    that of the project's volunteers, for fixing the issue
>>>>>>>    - If the issue is not fixable or is too complex for the leader to
>>>>>>>    fix, we attempt a bountysource.com fix
>>>>>>>    - If someone fixes the issue through bountysource.com, it must
>>>>>>>    be retested and confirmed
>>>>>>>    - The specific bug fix can be run again through HackerOne to verify
>>>>>>>    it after deployment (payment amount to be determined)
>>>>>>>    - The fix must be merged into the master branch of the
>>>>>>>    repository and incorporated into a new release
>>>>>>>    - The project is updated with the fix
>>>>>>>
>>>>>>> *Conditions for the participating OWASP project:*
>>>>>>> It must have an active leader we can exchange feedback with regarding
>>>>>>> the issues found. I think this is essential for the completeness of the
>>>>>>> project's definition of bugs and later fixes.
>>>>>>>
>>>>>>> These are just some draft steps that are part of the proposal and
>>>>>>> process; if you have any advice regarding this, please let us know.
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> Johanna
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Nov 23, 2015 at 3:48 AM, Jim Manico <jim.manico at owasp.org>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Thanks Michael and Josh for your astute feedback. Johanna already
>>>>>>>> has a decent proposal on the block and I've provided several edits and
>>>>>>>> issues to consider.
>>>>>>>>
>>>>>>>> If you have time, would love to have you both mark up the doc
>>>>>>>> directly with your comments and thoughts. :)
>>>>>>>>
>>>>>>>> Thanks and Aloha,
>>>>>>>> --
>>>>>>>> Jim Manico
>>>>>>>> Global Board Member
>>>>>>>> OWASP Foundation
>>>>>>>> https://www.owasp.org
>>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>>>
>>>>>>>> On Nov 23, 2015, at 1:42 AM, Michael Coates <
>>>>>>>> michael.coates at owasp.org> wrote:
>>>>>>>>
>>>>>>>> Awesome stuff. I've run teams on the receiving end of bug bounties
>>>>>>>> at both Mozilla and Twitter. Happy to provide any feedback if helpful. Key
>>>>>>>> items are good and fast responses to researchers and actually closing valid
>>>>>>>> issues in a timely manner. If we can't commit to those items we'd
>>>>>>>> want to reconsider our approach.
>>>>>>>>
>>>>>>>> On Sunday, November 22, 2015, Jim Manico <jim.manico at owasp.org>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> We're working on a proposal and plan right now. More soon.
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Jim Manico
>>>>>>>>> Global Board Member
>>>>>>>>> OWASP Foundation
>>>>>>>>> https://www.owasp.org
>>>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>>>>
>>>>>>>>> On Nov 22, 2015, at 12:57 PM, Josh Sokol <josh.sokol at owasp.org>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> I like the concept, but have some questions before the Board
>>>>>>>>> approves something like this:
>>>>>>>>>
>>>>>>>>>    1. Is there an actual proposal to fund a Bug Bounty?  If so,
>>>>>>>>>    what is the dollar amount that the Board would be authorizing here?
>>>>>>>>>    2. A bug bounty program is more than just a dollar amount,
>>>>>>>>>    it's a process.  Have we created a process for handling any submissions
>>>>>>>>>    that come in for bugs?
>>>>>>>>>    3. Once you have a submission, are we just throwing it in a
>>>>>>>>>    database somewhere or is there an expectation that someone will fix it?
>>>>>>>>>    Who is responsible for that?
>>>>>>>>>    4. If the answer to #3 is the project team, then what happens
>>>>>>>>>    if they do not fix it in a timely manner?  Is the project demoted?  If the
>>>>>>>>>    bug is serious enough, do we halt all downloads of the project until it is
>>>>>>>>>    fixed?  Do we attempt to warn users?
>>>>>>>>>
>>>>>>>>> In short, I think it's great to say "We want a bug bounty program
>>>>>>>>> like Hackerone", but there are way more details that need to be hashed out
>>>>>>>>> here.  I recommend putting together a team to assess how this would work as
>>>>>>>>> part of an actual process for OWASP.  I wouldn't be comfortable authorizing
>>>>>>>>> any funds until I had that information.
>>>>>>>>>
>>>>>>>>> ~josh
>>>>>>>>>
>>>>>>>>> On Sun, Nov 22, 2015 at 12:12 AM, johanna curiel curiel <
>>>>>>>>> johanna.curiel at owasp.org> wrote:
>>>>>>>>>
>>>>>>>>>> To run a program like HackerOne we will need to verify the bugs
>>>>>>>>>> found.
>>>>>>>>>> We could start with a pilot for the CSRFGuard and Dependency Check
>>>>>>>>>> projects.
>>>>>>>>>> I volunteer to manage the program for these projects.
>>>>>>>>>> I can set a plan to determine the scope of the program with the
>>>>>>>>>> project leaders and make sure we verify the
>>>>>>>>>> veracity of the reported bugs.
>>>>>>>>>>
>>>>>>>>>> What does the Board need from me in order to approve my proposal?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Saturday, November 21, 2015, Michael Coates <
>>>>>>>>>> michael.coates at owasp.org> wrote:
>>>>>>>>>>
>>>>>>>>>>> "I would say that for the existing Flagship & LABS (libraries
>>>>>>>>>>> or code) we should run a program through Hackerone or Bugbounty. (Of course
>>>>>>>>>>> insecure applications such as WebGoat are out of scope ;-))"
>>>>>>>>>>>
>>>>>>>>>>> Yes. This would generate awareness, create opportunities for
>>>>>>>>>>> new volunteers and put better controls around our prominent code.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>> Michael Coates | @_mwc
>>>>>>>>>>> <https://twitter.com/intent/user?screen_name=_mwc>
>>>>>>>>>>> OWASP Global Board
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sat, Nov 21, 2015 at 8:34 AM, johanna curiel curiel <
>>>>>>>>>>> johanna.curiel at owasp.org> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi Jim & Board
>>>>>>>>>>>>
>>>>>>>>>>>> 'Developers come to us'... is indeed a moderate approach. I
>>>>>>>>>>>> just finalised security reviews of projects developed by very serious
>>>>>>>>>>>> companies in the EU, and it amazes me that they were using CSRFGuard and
>>>>>>>>>>>> even ESAPI.
>>>>>>>>>>>>
>>>>>>>>>>>> There is a dependency, and that is the reason why the PHPSEC users
>>>>>>>>>>>> were angry at OWASP: they were using the project for some serious
>>>>>>>>>>>> development of financial applications and counting on OWASP to secure them.
>>>>>>>>>>>>
>>>>>>>>>>>> Since OWASP cannot offer a QA process review of its own
>>>>>>>>>>>> projects, we should be careful here; indeed, the approach of helping
>>>>>>>>>>>> improve existing frameworks is more realistic and carries fewer risks of
>>>>>>>>>>>> reputation damage to OWASP's image.
>>>>>>>>>>>>
>>>>>>>>>>>> I would say that for the existing Flagship & LABS (libraries or
>>>>>>>>>>>> code) we should run a program through Hackerone or Bugbounty. (Of course
>>>>>>>>>>>> insecure applications such as WebGoat are out of scope ;-))
>>>>>>>>>>>>
>>>>>>>>>>>> Again, maybe the focus should shift away from trying to create
>>>>>>>>>>>> libraries, as Tim said, and towards working on existing
>>>>>>>>>>>> frameworks.
>>>>>>>>>>>>
>>>>>>>>>>>> The reality is that creating security libraries is VERY hard,
>>>>>>>>>>>> and it has a lot of consequences for OWASP's image if serious issues are
>>>>>>>>>>>> found, as in the case of PHPSEC.
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>> Johanna
>>>>>>>>>>>>
>>>>>>>>>>>> On Sat, Nov 21, 2015 at 11:55 AM, Jim Manico <
>>>>>>>>>>>> jim.manico at owasp.org> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Folks,
>>>>>>>>>>>>>
>>>>>>>>>>>>> I'm feeling a bit more clarity on suggesting technical
>>>>>>>>>>>>> resource hires for 2016. Paul, these are just ideas to trigger strategic
>>>>>>>>>>>>> planning discussions. I agree that the final decisions around
>>>>>>>>>>>>> these hires are "all you". I hope this email is taken in the spirit of
>>>>>>>>>>>>> "ideas to consider".
>>>>>>>>>>>>>
>>>>>>>>>>>>> 1) Wiki experts (previously discussed)
>>>>>>>>>>>>> 2) Web design expert (previously discussed)
>>>>>>>>>>>>> 3) Technical contractor or bounties to help augment the
>>>>>>>>>>>>> security of common software frameworks (big potential here)
>>>>>>>>>>>>> 4) Security assurance contractors or bounties to help review
>>>>>>>>>>>>> OWASP defensive projects
>>>>>>>>>>>>>
>>>>>>>>>>>>> The whole "developers, come to us" approach is only modestly
>>>>>>>>>>>>> effective. "Developers, we want to help and will come to you" is a much
>>>>>>>>>>>>> more effective movement, IMO.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Thinking a bit out of the box here... If we spent significant
>>>>>>>>>>>>> funds on helping improve common software frameworks for security, we could
>>>>>>>>>>>>> really have a massive impact on the world at large. I'd love to see serious
>>>>>>>>>>>>> investment in this area....
>>>>>>>>>>>>>
>>>>>>>>>>>>> Aloha,
>>>>>>>>>>>>> --
>>>>>>>>>>>>> Jim Manico
>>>>>>>>>>>>> Global Board Member
>>>>>>>>>>>>> OWASP Foundation
>>>>>>>>>>>>> https://www.owasp.org
>>>>>>>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>>
>>>>>>>> --
>>>>>>>> Michael Coates | @_mwc
>>>>>>>> <https://twitter.com/intent/user?screen_name=_mwc>
>>>>>>>> OWASP Global Board
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>> --
>>
>> --
>> Michael Coates | @_mwc <https://twitter.com/intent/user?screen_name=_mwc>
>> OWASP Global Board
>>
>>
>>
>>
>>
>>
>
>
>