[Owasp-board] 2016 ideas

Jim Manico jim.manico at owasp.org
Mon Nov 23 22:57:43 UTC 2015


Sadly, you cannot just throw a scanner at a web app and get real assurance. Folks who do this are called "scanner jockeys" by the industry (i.e., they ride a scanner and call it assurance - an approach that failed 10 years ago, let alone today). ZAP's automated scanner is nice, but ZAP is best used in the hands of an expert doing manual testing.
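
For concreteness, "pointing a scanner at it" amounts to roughly the following - a minimal sketch using the python-owasp-zap-v2.4 client against a locally running ZAP instance (the API key and target URL are placeholders, not a real deployment). It finds the low-hanging fruit, which is exactly why it is not real assurance on its own:

    import time
    from zapv2 import ZAPv2  # pip install python-owasp-zap-v2.4

    # Assumes ZAP is running locally as a proxy on port 8080.
    zap = ZAPv2(apikey='changeme',
                proxies={'http': 'http://127.0.0.1:8080',
                         'https': 'http://127.0.0.1:8080'})
    target = 'http://csrfguard-demo.example.org'  # hypothetical demo site

    zap.urlopen(target)                       # put the site in ZAP's tree
    spider_id = zap.spider.scan(target)       # crawl it
    while int(zap.spider.status(spider_id)) < 100:
        time.sleep(2)

    ascan_id = zap.ascan.scan(target)         # unauthenticated active scan
    while int(zap.ascan.status(ascan_id)) < 100:
        time.sleep(5)

    for alert in zap.core.alerts(baseurl=target):
        if alert['risk'] in ('High', 'Medium'):
            print(alert['risk'], alert['alert'], alert['url'])

Everything this turns up is what a bounty would otherwise pay for; anything that requires understanding the library's intended behaviour still needs manual expert review.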

Manual expert review is tough for a web app and even more difficult for a defensive library. Targeted expert review is key here. Bounties are one way to approach this, but we're open to other suggestions...

--
Jim Manico
Global Board Member
OWASP Foundation
https://www.owasp.org
Join me in Rome for AppSecEU 2016!

> On Nov 23, 2015, at 9:31 PM, Josh Sokol <josh.sokol at owasp.org> wrote:
> 
> I think the problem is that we're looking at adding an additional maturity step to a process that currently has little to no maturity in this area.  In my opinion, a project should NEVER graduate from incubator status if the code isn't secure.  Yet we have no process or enforcement for that today.  Let's say we skip this step and go straight to the bug bounties.  What happens?  We get reports of a bunch of XSS, CSRF, SQLi, and other similar issues that could have been found by a scanner.  Now we pay out a bounty to the person who found it (when we have tools that could have done it if we had only taken the time), and we still have to figure out who is responsible for fixing the issue.  Without a commitment from the project team to fix the issues that are found, this effort goes nowhere.  I would like to support this.  Truly, I would, but I think we're missing some pre-steps here.  You want concrete actions?  Fair enough:
> 1. Require project teams to scan their applications in the SWAMP (or other resource deemed acceptable by the Project Committee) within the next 90 days.
> 2. Require that any high or critical issues that are found get remediated within the next 90 days.
> 3. Demote all projects that do not follow steps 1 & 2 within the next 90 days back to incubator status.  Once complete, they can get their old status back.
> Once a project has completed the requirements above, then we can start talking about bug bounties.  This should be an "opt-in" model for the projects, where they commit to fixing issues within a certain timeframe.  Most of the technical vulnerabilities that matter will already be fixed, and the "hackers" will have to actually work to find issues, rather than just point a scanner at the project - something we can do ourselves.
> At that point, I trust the process that you and Jim are working on putting together, but I feel like, for the reasons you outlined below, if this gets put on one person's shoulders it is doomed to failure.  OWASP can provide tools, a platform, and associated processes to support these activities, but we need a firm commitment from the project leaders to fix things in order for this to be successful.  Let's cover the basics first and then figure out how to do more.
> 
> ~josh
> 
> 
>> On Mon, Nov 23, 2015 at 11:44 AM, johanna curiel curiel <johanna.curiel at owasp.org> wrote:
>> 
>> 
>>> On Mon, Nov 23, 2015 at 1:33 PM, johanna curiel curiel <johanna.curiel at owasp.org> wrote:
>>> > I went back to look at the different project levels (incubator, labs, and flagship) and nowhere in there do I see a metric of "Code has been scanned for security flaws and all Critical and High level issues have been fixed"
>>> 
>>> Exactly.  However, let me remind you that this has happened in the past: I ran a scan using the SWAMP when we did a major review back in 2014.  It was never set as a requirement for libraries, and I agree that it needs to become part of the criteria (see the attached screenshot).
>>> 
>>> BTW, when I did the scan I found issues that I submitted to CSRFGuard, and Azzedine did his own scan as well, using some heavyweight static analysis tools.  He also made those fixes, but this does not compare to a heavier bounty program.
>>> 
>>> Please keep in mind this was a volunteer effort that I did in conjunction with a paid tester when we did a major review of Flagship and Labs projects back in 2014.  But again, this was on a volunteer basis, and if I don't do it, no one else is enforcing or doing it either.  Very few volunteers take the time to review at this level.  Timo Goosen was one of the few who helped me at this level (downloading, checking the deployment, etc.).
>>> 
>>> You will depend on volunteers to do this security scanning; so far this initiative has not been sustainable because no one is doing it except me ;-P.  But I think we can begin setting it as a requirement (at least an automated scan with a static analysis tool) and require a report.  But yes... who is going to follow up on that with the leaders?  You need a plan for this.
>>> 
>>> I feel you all have some valid arguments but do not come with concrete actions.  The problem with volunteer efforts is that they are not sustainable in the long run.  I stopped using the SWAMP - who is using it and reporting the issues now?  As far as I know, only me, in the past.
>>> 
>>> My plan is concrete, and everything I have proposed so far is concrete, from idea to action.  To make any program sustainable, I think - and I agree with Jim - that you need more technical people who understand the projects and do this work regularly or as part of an initiative.
>>> 
>>> Regards
>>> 
>>> Johanna
>>> 
>>>> On Mon, Nov 23, 2015 at 1:15 PM, Josh Sokol <josh.sokol at owasp.org> wrote:
>>>> I think that Matt makes a lot of good points here, and it helps to solidify the concerns that I had with this.  I went back to look at the different project levels (incubator, labs, and flagship) and nowhere in there do I see a metric of "Code has been scanned for security flaws and all Critical and High level issues have been fixed".  If, as an organization, our mission is to make software more secure, then shouldn't we start by getting our own house in order first and foremost?  Personally, I think this should be a mandatory criterion for all labs and flagship projects.  Even something as simple as "scan with ZAP" could help to identify issues in projects.  But it doesn't stop there: once issues are identified, we need a commitment from the project leaders to address them in a timely manner.  If we can show that this process is in place and working, then I feel like the next step in our maturity could be some sort of a bug bounty program.
>>>> 
>>>> ~josh
>>>> 
>>>>> On Mon, Nov 23, 2015 at 9:44 AM, Matt Konda <matt.konda at owasp.org> wrote:
>>>>> I have also run bounty programs and I would take what Michael said and go a bit further.
>>>>> 
>>>>> I am skeptical of this approach for OWASP at this time.  Generally, I support bounty programs when an organization or project is mature and basic controls have already been applied.  This implies that tools have been run, code has been reviewed, and there are things in the SDLC that allow you to respond in a controlled manner (e.g., regular releases, resources available for quick response, and a solid bug triage process).
>>>>> 
>>>>> Otherwise, you end up paying bounties for things that are really easy to find and look bad along the way.  Also, we'll spend a lot of time dealing with communications and researchers who are only sometimes good to work with.  I believe that if we spent the time (and money) on the project controls instead of communicating with researchers, we would likely have better initial results.
>>>>> 
>>>>> I don't have access to edit or comment upon the doc, but I would definitely push for: 
>>>>> - Limits on scanner findings
>>>>> - Limits on DoS-category findings
>>>>> In summary, I think there are alternative approaches that we should take on before doing a bounty program - including systematically engaging with volunteers to do security audits, engaging with vendors to get their feedback, and building security checkpoints into the project maturity process.
>>>>> 
>>>>> I applaud the general goal and aspiration of ensuring rigorous security in our projects.
>>>>> 
>>>>> Matt
>>>>> 
>>>>> 
>>>>>> On Mon, Nov 23, 2015 at 7:15 AM, johanna curiel curiel <johanna.curiel at owasp.org> wrote:
>>>>>> Hi All,
>>>>>> 
>>>>>> The idea is to finalise the scope items for each project.  As mentioned in the proposal, I think it is a good idea to take three projects (Flagship, Lab, Incubator) and work through a series of items for each one.
>>>>>> 
>>>>>> Drawing on Jim's experience with defensive libraries, we can help define clear scopes, and with the project leaders' input we need to define the most important issues that would make the libraries weak on protection.
>>>>>> 
>>>>>> A set of steps:
>>>>>> - Finalise the items within the scope for each project that is part of the pilot
>>>>>> - Touch base with the project leaders regarding their availability for feedback
>>>>>> - Deploy a couple of simple websites that have the libraries configured
>>>>>> - Deploy them on a VM server and host a couple of domains for the researchers to test (a quick readiness check is sketched after this list), for example:
>>>>>>   - csrfguard.owasp.org
>>>>>>   - xxxx.owasp.org
>>>>>> - Set up and configure an account on hackerone.com
>>>>>> - Set up and configure an account on bountysource.com
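>>>>>>
>>>>>> As a rough idea of the readiness check mentioned above, something like the following could confirm the demo sites are up and emitting a token before we invite researchers. It is only a sketch; the URLs are placeholders for whatever demo deployments we end up hosting, and the token check is just a heuristic for CSRFGuard-style pages:
>>>>>>
>>>>>>     import urllib.request
>>>>>>
>>>>>>     # Hypothetical demo deployments for the pilot (placeholder URLs).
>>>>>>     DEMO_SITES = [
>>>>>>         'https://csrfguard-demo.example.org/login',
>>>>>>         'https://xxxx-demo.example.org/',
>>>>>>     ]
>>>>>>
>>>>>>     def smoke_test(url: str) -> bool:
>>>>>>         """Return True if the demo site responds and appears to emit a CSRF token."""
>>>>>>         try:
>>>>>>             with urllib.request.urlopen(url, timeout=10) as resp:
>>>>>>                 ok = resp.status == 200
>>>>>>                 body = resp.read().decode('utf-8', errors='replace')
>>>>>>         except OSError:
>>>>>>             return False
>>>>>>         # Heuristic: CSRFGuard injects a request token into rendered forms.
>>>>>>         return ok and 'csrf' in body.lower()
>>>>>>
>>>>>>     if __name__ == '__main__':
>>>>>>         for site in DEMO_SITES:
>>>>>>             print(site, 'OK' if smoke_test(site) else 'NOT READY')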
>>>>>> 
>>>>>> From there on it is a matter of managing the process: researchers will submit issues, and we need to retest and validate them.  The steps (a rough code sketch of this workflow follows the list):
>>>>>> 1. A researcher submits an issue.
>>>>>> 2. We retest and verify the severity (Jim, the project leader, and me).
>>>>>> 3. We discuss the severity with Jim and the project leaders; all reports are available in HackerOne as soon as a researcher submits an issue, and if possible more people should retest to confirm the bug.
>>>>>> 4. We determine together whether a bug has indeed been confirmed (Jim, the project leader, and me).
>>>>>> 5. Once confirmed, we log the bug in the project's GitHub.
>>>>>> 6. We make the payment to the researcher with the approval of the three members (Jim, the project leader, and me).
>>>>>> 7. We publicise the issue through the project's wiki page and GitHub issues section; if it is high risk, the leader will have to post a warning on the GitHub repository.
>>>>>> 8. The project leader must provide feedback on his availability, and that of the project's volunteers, for fixing the issue.
>>>>>> 9. If the issue is not fixable by the leader, or too complex, we attempt a fix through bountysource.com.
>>>>>> 10. If someone fixes the issue through bountysource.com, the fix must be retested and confirmed.
>>>>>> 11. We can run the specific bug through HackerOne again to verify the fix after deployment (payment amount to be determined).
>>>>>> 12. The fix must be merged into the master branch of the repository and incorporated into a new release.
>>>>>> 13. The project is updated with the fix.
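>>>>>>
>>>>>> To make the hand-offs above concrete, here is a minimal sketch of the workflow as a small state machine in Python.  It is purely illustrative: the names (BountyReport, Status, NEXT) are hypothetical and do not refer to any existing OWASP or HackerOne tooling; the point is only that a report cannot be paid before it is verified, and cannot be released before the fix is retested:
>>>>>>
>>>>>>     from dataclasses import dataclass, field
>>>>>>     from enum import Enum, auto
>>>>>>
>>>>>>     class Status(Enum):
>>>>>>         SUBMITTED = auto()        # 1. researcher submits the issue
>>>>>>         VERIFIED = auto()         # 2-4. severity retested, bug confirmed
>>>>>>         LOGGED = auto()           # 5. logged in the project's GitHub
>>>>>>         PAID = auto()             # 6. bounty paid with three approvals
>>>>>>         FIX_IN_PROGRESS = auto()  # 8-9. project team or bountysource.com fix
>>>>>>         FIX_VERIFIED = auto()     # 10-11. fix retested / re-run in HackerOne
>>>>>>         RELEASED = auto()         # 12-13. merged to master and released
>>>>>>
>>>>>>     # Allowed transitions, mirroring the ordered steps above.
>>>>>>     NEXT = {
>>>>>>         Status.SUBMITTED: Status.VERIFIED,
>>>>>>         Status.VERIFIED: Status.LOGGED,
>>>>>>         Status.LOGGED: Status.PAID,
>>>>>>         Status.PAID: Status.FIX_IN_PROGRESS,
>>>>>>         Status.FIX_IN_PROGRESS: Status.FIX_VERIFIED,
>>>>>>         Status.FIX_VERIFIED: Status.RELEASED,
>>>>>>     }
>>>>>>
>>>>>>     @dataclass
>>>>>>     class BountyReport:
>>>>>>         title: str
>>>>>>         severity: str                      # as agreed by the three approvers
>>>>>>         status: Status = Status.SUBMITTED
>>>>>>         history: list = field(default_factory=list)
>>>>>>
>>>>>>         def advance(self) -> None:
>>>>>>             """Move the report to the next stage; refuse to skip steps."""
>>>>>>             if self.status not in NEXT:
>>>>>>                 raise ValueError(f"{self.title} is already {self.status.name}")
>>>>>>             self.history.append(self.status)
>>>>>>             self.status = NEXT[self.status]
>>>>>>
>>>>>> For example, a report can only be paid once it has been verified and logged, which is exactly the gating we want the three approvers to enforce.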
>>>>>> 
>>>>>> Conditions for the participating OWASP project:
>>>>>> The project must have an active leader we can communicate with regarding the issues found.  I think this is essential for properly defining the project's bugs and, later, the fixes.
>>>>>> 
>>>>>> These are just some draft steps that are part of the proposal and process; if you have any advice regarding them, please let us know.
>>>>>> 
>>>>>> regards
>>>>>> 
>>>>>> Johanna
>>>>>> 
>>>>>> 
>>>>>>> On Mon, Nov 23, 2015 at 3:48 AM, Jim Manico <jim.manico at owasp.org> wrote:
>>>>>>> Thanks Michael and Josh for your astute feedback. Johanna already has a decent proposal on the block and I've provided several edits and issues to consider.
>>>>>>> 
>>>>>>> If you have time, would love to have you both mark up the doc directly with your comments and thoughts. :)
>>>>>>> 
>>>>>>> Thanks and Aloha,
>>>>>>> --
>>>>>>> Jim Manico
>>>>>>> Global Board Member
>>>>>>> OWASP Foundation
>>>>>>> https://www.owasp.org
>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>> 
>>>>>>>> On Nov 23, 2015, at 1:42 AM, Michael Coates <michael.coates at owasp.org> wrote:
>>>>>>>> 
>>>>>>>> Awesome stuff.  I've run teams on the receiving end of bug bounties at both Mozilla and Twitter, so happy to provide feedback if helpful.  The key items are good, fast responses to researchers and actually closing valid issues in a timely manner.  If we can't commit to those, we'd want to reconsider our approach.
>>>>>>>> 
>>>>>>>>> On Sunday, November 22, 2015, Jim Manico <jim.manico at owasp.org> wrote:
>>>>>>>>> We're working on a proposal and plan right now. More soon.
>>>>>>>>> 
>>>>>>>>> --
>>>>>>>>> Jim Manico
>>>>>>>>> Global Board Member
>>>>>>>>> OWASP Foundation
>>>>>>>>> https://www.owasp.org
>>>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>>>> 
>>>>>>>>>> On Nov 22, 2015, at 12:57 PM, Josh Sokol <josh.sokol at owasp.org> wrote:
>>>>>>>>>> 
>>>>>>>>>> I like the concept, but I have some questions before the Board approves something like this:
>>>>>>>>>> 1. Is there an actual proposal to fund a bug bounty?  If so, what is the dollar amount that the Board would be authorizing here?
>>>>>>>>>> 2. A bug bounty program is more than just a dollar amount; it's a process.  Have we created a process for handling any bug submissions that come in?
>>>>>>>>>> 3. Once we have a submission, are we just throwing it in a database somewhere, or is there an expectation that someone will fix it?  Who is responsible for that?
>>>>>>>>>> 4. If the answer to #3 is the project team, then what happens if they do not fix it in a timely manner?  Is the project demoted?  If the bug is serious enough, do we halt all downloads of the project until it is fixed?  Do we attempt to warn users?
>>>>>>>>>> In short, I think it's great to say "We want a bug bounty program like HackerOne", but there are way more details that need to be hashed out here.  I recommend putting together a team to assess how this would work as part of an actual process for OWASP.  I wouldn't be comfortable authorizing any funds until I had that information.
>>>>>>>>>> 
>>>>>>>>>> ~josh
>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>>>>> On Sun, Nov 22, 2015 at 12:12 AM, johanna curiel curiel <johanna.curiel at owasp.org> wrote:
>>>>>>>>>>> To run a program like HackerOne's we will need to verify the bugs found.
>>>>>>>>>>> We could start with a pilot for the CSRFGuard and Dependency Check projects.
>>>>>>>>>>> I volunteer to manage the program for these projects.
>>>>>>>>>>> I can set up a plan to determine the scope of the program with the project leaders and make sure we verify the veracity of the reported bugs.
>>>>>>>>>>>
>>>>>>>>>>> What does the Board need from me in order to approve my proposal?
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>>> On Saturday, November 21, 2015, Michael Coates <michael.coates at owasp.org> wrote:
>>>>>>>>>>>> "I would say that for the existing Flagship & LABS (libraries or code) we should run a program through Hackerone or Bugbounty.(off course insecure applications as WebGoat are out of scope ;-))"
>>>>>>>>>>>> 
>>>>>>>>>>>> Yes.  This would generate awareness, create opportunities for new volunteers, and put better controls around our most prominent code.
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> --
>>>>>>>>>>>> Michael Coates | @_mwc
>>>>>>>>>>>> OWASP Global Board
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>>> On Sat, Nov 21, 2015 at 8:34 AM, johanna curiel curiel <johanna.curiel at owasp.org> wrote:
>>>>>>>>>>>>> Hi Jim & Board
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 'Developers, come to us'... is indeed a moderate approach.  I just finished a security review of projects developed by some very serious companies in the EU, and it amazed me that they were using CSRFGuard and even ESAPI.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> There is a real dependency, and that is why the PHPSEC users were angry at OWASP: they were using the project for serious development of financial applications and counting on OWASP to secure them.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Since OWASP cannot offer a QA review process for its own projects, we should be careful here.  Indeed, the approach of helping to improve existing frameworks is more realistic and carries fewer reputational risks for OWASP's image.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> I would say that for the existing Flagship & Labs projects (libraries or code) we should run a program through HackerOne or Bugbounty (of course, insecure applications such as WebGoat are out of scope ;-)).
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Again, maybe we should stop focusing on trying to create libraries, as Tim said, and instead focus our efforts on working on existing frameworks.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> The reality is that creating security libraries is VERY hard, and it has a lot of consequences for OWASP's image if serious issues are found, as was the case with PHPSEC.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> regards
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Johanna
>>>>>>>>>>>>> 
>>>>>>>>>>>>>> On Sat, Nov 21, 2015 at 11:55 AM, Jim Manico <jim.manico at owasp.org> wrote:
>>>>>>>>>>>>>> Folks,
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> I'm feeling a bit more clarity on suggesting technical resource hires for 2016.  Paul, these are just ideas to trigger strategic planning discussions.  I agree that the final decisions around these hires are "all you".  I hope this email is taken in the spirit of "ideas to consider".
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> 1) Wiki experts (previously discussed)
>>>>>>>>>>>>>> 2) Web design expert (previously discussed)
>>>>>>>>>>>>>> 3) Technical contractor or bounties to help augment the security of common software frameworks (big potential here)
>>>>>>>>>>>>>> 4) Security assurance contractors or bounties to help review OWASP defensive projects
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> The whole "developers, come to us" approach is only modestly effective.  "Developers, we want to help and will come to you" is a much more effective movement, IMO.
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> Thinking a bit outside the box here... If we spent significant funds on helping improve common software frameworks for security, we could really have a massive impact on the world at large.  I'd love to see serious investment in this area....
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> Aloha,
>>>>>>>>>>>>>> --
>>>>>>>>>>>>>> Jim Manico
>>>>>>>>>>>>>> Global Board Member
>>>>>>>>>>>>>> OWASP Foundation
>>>>>>>>>>>>>> https://www.owasp.org
>>>>>>>>>>>>>> Join me in Rome for AppSecEU 2016!
>>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>> 
>>>>>>>> 
>>>>>>>> --
>>>>>>>> Michael Coates | @_mwc
>>>>>>>> OWASP Global Board
>>>>>>> 
>>>>>> 
>>>>>> 
>>>>> 
>>>>> 
> 
> _______________________________________________
> Owasp-board mailing list
> Owasp-board at lists.owasp.org
> https://lists.owasp.org/mailman/listinfo/owasp-board


More information about the Owasp-board mailing list