[Owasp-leaders] OWASP Mobile Top Ten 2014 - M10 Datapoints

Jim Manico jim.manico at owasp.org
Wed Nov 5 09:14:34 UTC 2014


So, if the attacker modifies the pinned certificate in their own copy of a 
mobile app, what do they accomplish? The inability to use that web service. 
What is accomplished from a security point of view? Nothing....
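
For concreteness, here is roughly what a pinned client looks like. This is 
only a sketch, assuming OkHttp's CertificatePinner; the host name and pin 
value below are placeholders, not real values:

    import com.squareup.okhttp.CertificatePinner;
    import com.squareup.okhttp.OkHttpClient;

    public class PinnedClient {

        // The "hardcoded data" in question is just a hash of the server's
        // *public* key. That is public information, not a secret.
        private static final String PIN =
                "sha1/AAAAAAAAAAAAAAAAAAAAAAAAAAA=";  // placeholder pin

        public static OkHttpClient build() {
            CertificatePinner pinner = new CertificatePinner.Builder()
                    .add("api.example.com", PIN)  // placeholder host
                    .build();
            OkHttpClient client = new OkHttpClient();
            client.setCertificatePinner(pinner);
            return client;
        }
    }

If the attacker patches PIN in their own copy of the app, either the app can 
no longer talk to the legitimate server, or it now trusts a proxy the 
attacker already controls. Neither outcome compromises anyone but themselves.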

- Jim

On 11/5/14 4:38 PM, Jonathan Carter wrote:
> In that particular case, the attacker will perform static analysis, 
> identify the sensitive code associated with the hardcoded data, and 
> then modify the actual data values.
>
> On Tue, Nov 4, 2014 at 11:41 PM, Jim Manico <jim.manico at owasp.org> wrote:
>
>     Certificate pinning does not hard-code *secrets*; it hard-codes the
>     *public* SSL/TLS key. This is a significant difference, Jonathan.
>
>     --
>     Jim Manico
>     @Manicode
>     (808) 652-3805
>
>     On Nov 5, 2014, at 11:38 AM, Jonathan Carter
>     <jonathan.carter at owasp.org> wrote:
>
>>     While M10 does touch on digital rights management, it goes far
>>     beyond that. Here's an easy example: certificate pinning.
>>     Certificate pinning is a classic coding technique that relies
>>     upon hardcoded data.  This security control carries an inherent
>>     set of related binary vulnerabilities that would allow an
>>     attacker to completely bypass or disable your flawlessly written
>>     code.  You must make it as difficult as possible for someone to
>>     modify that hardcoded data.  If they succeed, your certificate
>>     pinning control becomes completely irrelevant.  This is what M10
>>     is touching on, and it's something that OWASP really doesn't
>>     like to talk about or acknowledge.
>>
>>     On Tue, Nov 4, 2014 at 7:12 PM, Tim <tim.morgan at owasp.org> wrote:
>>
>>
>>         Hi Leaders,
>>
>>         I have brought up my concerns about M10 before, and I have done a
>>         fair bit of thinking about this since then.  I think it would be
>>         useful to re-frame the discussion with some more subtle
>>         distinctions:
>>
>>
>>         0. Are all software security risks also considered business
>>            risks?
>>
>>            Yes, I would say so.  It is hard to find a computer security
>>            risk that doesn't pose some kind of business risk.
>>
>>
>>         1. Are all business risks considered security risks?
>>
>>            No, I definitely don't think so.  There are plenty of things
>>            outside of the realm of software security that are very real
>>            business risks (e.g. employees running over a business
>>            partner in the parking lot by accident).
>>
>>
>>         2. Is binary modification/repackaging a real business risk to
>>            intellectual property?
>>
>>            Yes!  It is happening already.  An attacker could repackage
>>            your app, redistribute it, and reap benefits from app stores
>>            based on your hard work.
>>
>>
>>         3. Is mobile reverse engineering and/or repackaging a security
>>            risk?
>>
>>            Yes, specifically:
>>
>>            A) Reverse engineering can expose crypto keys and any other
>>               secrets that are foolishly embedded in the app.
>>
>>            B) Repackaging can be used to try to fool users into installing
>>               the wrong version of an application, one with malicious
>>               intent.  Very similar to phishing.
>>
>>
>>         4. Does mobile app obfuscation/monitoring/anti-reverse-engineering
>>            technology help solve a *business* risk?
>>
>>            Yes, in that it raises the cost of reusing the compiled version
>>            of the software.  Raise the cost enough, and the attacker might
>>            as well write their own app.  Even if you don't raise the cost
>>            *that* high, you reduce the number of people willing to target
>>            your app specifically.
>>
>>
>>         5. Does mobile app obfuscation/monitoring/anti-reverse-engineering
>>            technology help solve a *security* risk?
>>
>>            No, I don't think so.
>>
>>            Regarding (3A)-- If crypto keys/credentials/etc. are valuable,
>>            it doesn't take a whole lot of effort to decode an obfuscated
>>            binary and get at them (see the sketch below).  Definitely worth
>>            the minimal effort.
>>
>>            Regarding (3B)-- If cloning apps like this is effective against
>>            users, then it's just as easy to copy the images from the
>>            company's website, slap them on a "hello world" app, add a login
>>            form, and poof: you have users' credentials.  You don't need to
>>            clone a whole app to fool users.
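>>
>>            To make (3A) concrete, here is a minimal sketch of the kind of
>>            embedded secret I mean; the class name and key value are made
>>            up:
>>
>>                public class ApiConfig {
>>                    // A name-mangling obfuscator renames classes and
>>                    // fields, but it does not rewrite constant values, so
>>                    // this made-up key survives intact in the compiled app.
>>                    static final String API_SECRET =
>>                            "not-a-real-key-0123456789abcdef";
>>                }
>>
>>            Run any decompiler over the package and that value is right
>>            there, obfuscation or not.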
>>
>>
>>
>>
>>         I think many folks on each side of the discussion are correct in
>>         what they are saying, but they are talking about different things.
>>         Look at the issue at a slightly higher resolution, particularly in
>>         the context of which attacks are actually applicable, and it all
>>         becomes much clearer:  Remove M10.  (After all, OWASP is primarily
>>         about computer security, not digital rights management.)
>>
>>
>>         Cheers,
>>         tim
>>
>>
>>     _______________________________________________
>>     OWASP-Leaders mailing list
>>     OWASP-Leaders at lists.owasp.org
>>     https://lists.owasp.org/mailman/listinfo/owasp-leaders
>
>
