The curious tale of a fake Carrier.app

(googleprojectzero.blogspot.com)

154 points | by mfrw 1054 days ago

7 comments

  • alin23 1054 days ago
    I wish I had the expertise to do such in-depth reverse engineering of firmware blobs.

    The DCP is actually the thing that's stopping me from providing native brightness control on the HDMI port of the newer Macs inside Lunar (https://lunar.fyi). Users have to either switch to a Thunderbolt port to get native brightness control for their monitor, or use a software dimming solution like Gamma Table alteration.
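
    The gamma fallback is basically just scaling the display transfer function through the public Quartz Display Services API. A minimal sketch (not Lunar's actual implementation, just the general idea):

        import CoreGraphics

        // Scale the maximum output of each channel; the backlight stays untouched,
        // which is why this loses contrast compared to real DDC brightness control.
        let display = CGMainDisplayID()
        let scale: Float = 0.6   // perceived "brightness" of roughly 60%

        let err = CGSetDisplayTransferByFormula(
            display,
            0, scale, 1.0,   // red:   min, max, gamma
            0, scale, 1.0,   // green: min, max, gamma
            0, scale, 1.0    // blue:  min, max, gamma
        )
        // The table resets when the process exits
        // (or explicitly via CGDisplayRestoreColorSyncSettings()).
        print(err == .success ? "dimmed" : "CGSetDisplayTransferByFormula failed: \(err)")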

    It's not clear what's going on, but it seems that the HDMI port of the 2018+ Macs has an MCDP29xx chip behind it, which converts the internal DisplayPort signal to HDMI so that Apple only has to drive DisplayPort out of the SoC. (That is also why even the newest MacBook Pro and Mac Studio only have HDMI 2.0: it's the most the converter chip supports [0].)

    When sending DDC commands through the IOAVServiceWriteI2C call, monitors connected to the HDMI port lose video signal, flicker, or crash completely and need a power cycle to come back.

    The Thunderbolt ports, however, send the DDC command as expected when IOAVServiceWriteI2C is called.
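
    For reference, the call that works over Thunderbolt but wedges HDMI monitors looks roughly like this. This is only a hedged sketch: the IOAVService symbols are private, and the signatures below come from community reverse engineering (e.g. the m1ddc project), so they're assumptions rather than a documented API.

        import Foundation

        // Assumption: these private symbols are reachable from the IOKit framework
        // binary, with the reverse-engineered signatures below; both may change
        // between macOS releases.
        typealias IOAVServiceCreateFn = @convention(c) (UnsafeRawPointer?) -> UnsafeMutableRawPointer?
        typealias IOAVServiceWriteI2CFn = @convention(c) (UnsafeMutableRawPointer?, UInt32, UInt32, UnsafeRawPointer?, UInt32) -> Int32

        guard let ioKit = dlopen("/System/Library/Frameworks/IOKit.framework/IOKit", RTLD_NOW),
              let createSym = dlsym(ioKit, "IOAVServiceCreate"),
              let writeSym = dlsym(ioKit, "IOAVServiceWriteI2C") else {
            fatalError("private IOAVService symbols not found")
        }
        let create = unsafeBitCast(createSym, to: IOAVServiceCreateFn.self)
        let writeI2C = unsafeBitCast(writeSym, to: IOAVServiceWriteI2CFn.self)

        guard let service = create(nil) else {   // nil allocator = default
            fatalError("no IOAVService (is an external display connected?)")
        }

        // DDC/CI "Set VCP Feature": brightness (VCP 0x10) = 50.
        // Checksum = XOR of DDC destination (0x6E), source (0x51) and the payload.
        var packet: [UInt8] = [0x84, 0x03, 0x10, 0x00, 50]
        packet.append(packet.reduce(UInt8(0x6E ^ 0x51)) { $0 ^ $1 })

        // 0x37 = DDC/CI I2C chip address, 0x51 = DDC source/offset byte.
        let status = packet.withUnsafeBytes { buf in
            writeI2C(service, 0x37, 0x51, buf.baseAddress, UInt32(buf.count))
        }
        print("IOAVServiceWriteI2C returned \(status)")

    Over Thunderbolt this kind of write goes through as expected; over the HDMI port the same packet is what makes monitors drop signal.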

    After @marcan42 from Asahi Linux pointed out [1] the DCPAVFamilyProxy kexts, I looked into them and found several different writei2c methods and some MCDP29xx-specific code, but no clue on how to call them from userspace.

    I guess I'll have to look into how the analysed exploit uses the RPC, and also check the methods' assembly inside the firmware blob itself. I wasn't aware that most userspace methods are now shims that remotely call the embedded code.

    [0] https://www.kinet-ic.com/mcdp2900/

    [1] https://twitter.com/marcan42/status/1483283365407371265

    • saagarjha 1054 days ago
      Are there user clients exposed for those kexts?
  • cmeacham98 1054 days ago
    Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.

    In reality it costs the bad guys $299 to bypass this limitation, while your average user is locked out of this feature.

    • joshstrange 1054 days ago
      Installing an enterprise app requires the user to trust the cert (with a scary warning shown). Also this makes a better case for not allowing sideloading since the sandboxing isn't perfect but the app store review process makes it harder to sneak one by.

      > In reality it costs the bad guys $299 to bypass this limitation

      And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.

      • cmeacham98 1054 days ago
        > Also this makes a better case for not allowing sideloading since the sandboxing isn't perfect but the app store review process makes it harder to sneak one by.

        The point I'm trying to make is that Apple isn't consistent here. If they actually believed this to be true there would be no way to get a sideloading cert that worked on all devices.

        "Sideloading is dangerous because users could install malware" and "We'll let any iPhone sideload your app if you jump through some hoops and pay us" are incompatible statements, but Apple makes both of them.

        > And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.

        Obviously not, but the linked post demonstrates attackers are more than capable of getting one.

        • joshstrange 1054 days ago
          But surely you see the difference between "revokable enterprise cert that requires a verification process to obtain" and "anyone can sideload"?

          I don't see this as Apple making incompatible statements.

          • cmeacham98 1054 days ago
            Of course, I'll admit it will stop small-time attackers who aren't willing to pay $299, register a fake company, and do whatever else is needed to trick Apple into letting them into the enterprise dev program. But I'd propose these attackers wouldn't have been able to bypass the sandbox anyway.

            I'm sure the fact this setup forces all apps to go through Apple's app store and pay them 30% of their revenue is just an unfortunate accident.

            My point is that if Apple truly believed sideloading was a risk to user security there wouldn't be certificates that let you sideload on all devices. They would force companies to provide a list of devices or only allow devices enrolled in their MDM or some similar reasonable restriction.

            • acdha 1054 days ago
              > I'm sure the fact this setup forces all apps to go through Apple's app store and pay them 30% of their revenue is just an unfortunate accident.

              Does a normal enterprise charge its users to install internal apps? If not, this seems like an odd complaint.

              > My point is that if Apple truly believed sideloading was a risk to user security there wouldn't be certificates that let you sideload on all devices. They would force companies to provide a list of devices or only allow devices enrolled in their MDM or some similar reasonable restriction.

              That last part is effectively what they're doing: you have to get through the same level of scary prompt and authentication to install an enterprise certificate as you do to enroll in MDM. Yes, users can still be socially engineered into compromising their device, but that's an order of magnitude harder than just convincing someone to run some random binary.

              • Apocryphon 1054 days ago
                Apple can easily create a guarded sideloading experience replete with scary warnings. They can shape the sideloading UX to screen out users who aren't ready for it. Critics of iOS sideloading fail to see that Apple can add as many guards and guided experiences into sideloading as it does with its other features, such as installing enterprise certs.

                Even on Android, with its openness and notoriety for security issues, sideloading APKs means navigating multiple settings menus, which screens out the majority of non-technical users. Though that plays more to the "strength" of that platform (clunky design and unwieldiness), which Apple would not be emulating.

                • acdha 1054 days ago
                  Yes, they certainly could. The harder they make it, the more people would say they were being anti-competitive; but if they make it easy, a ton of people would get rooted after installing FreePorn.app, just like we've seen for ages on Android, or honest developers would simply be ripped off by people bootlegging their apps. I'm not in love with the current balance, but having done support for a long time, I think it's a pretty defensible position.

                  I personally think the best solution here is regulation: let Apple and Google control app distribution, with some legal mandates for prompt responses both to active threats as well as developer complaints, but cap the percentage they can charge.

                  • Apocryphon 1054 days ago
                    More people could claim that, but so long as Apple does it in good faith, as in the opposite of what they did in the Netherlands when they saltily and passive-aggressively flouted Dutch regulation re: the App Store (0), the regulators will recognize that.

                    Most people do not bother to root their Android phones, and it's perfectly possible to pirate iOS apps on non-jailbroken phones, so that point is also moot. The people who are technically savvy enough to pirate apps are already doing so whether sideloading is made possible or not.

                    Finally, you discount Apple's mastery of UX patterns and their design capability. By simply changing non-iMessage texts to green, they created an entire dichotomy between messages from iOS devices and non-iOS ones. Do you really think they wouldn't be capable of doing something similar to distinguish official App Store-originated apps from non-official ones? A subtle color tint, a badge, a slightly off-looking font, and they can get users to treat sideloaded apps with caution. If Apple had to make sideloading possible, you can bet they could make it a secure and curated experience, as they do with the rest of their walled garden.

                    (0) https://twitter.com/marcoarment/status/1489595417117483010

                    https://news.ycombinator.com/item?id=30334683

                    • acdha 1054 days ago
                      > Most people do not bother to root their Android phones, and it's perfectly possible to pirate iOS apps on non-jailbroken phones, so that point is also moot. The people who are technically savvy enough to pirate apps are already doing so whether sideloading is made possible or not.

                      Your second sentence contradicts the first: the difference is not whether it’s theoretically possible but whether it’s easy and thus widespread. That’s the important distinction and it deserves more careful thought when you’re talking about devices large percentages of the population trust heavily.

                      > By simply changing non-iMessage texts to green, they created an entire dichotomy between messages from iOS devices and non-iOS ones.

                      Similarly, this deserves more careful thought: is the difference the color or the fact that one which isn’t controlled by the phone companies is faster, more capable, more secure and cost less for years?

                      Again, I’m not saying that Apple is perfect but it’s not as simple as people like to portray it, either. Several of the topics here are market failures where governments were deterred from stepping in. What Apple has done is not purely benevolent but it’s also satisfying a lot of users - they didn’t go from niche to one of the most important tech companies because they weren’t doing what people liked (and, no, it’s not “just” marketing or design).

                      • Apocryphon 1054 days ago
                        > the difference is not whether it’s theoretically possible but whether it’s easy and thus widespread. That’s the important distinction and it deserves more careful thought when you’re talking about devices large percentages of the population trust heavily.

                        Apple can allow sideloading without making it easy. This whole discussion is about whether or not Apple can craft a user experience that carefully balances permitting users to use a powerful and potentially insecure ability against warning off those who are not savvy. I believe they can.

                        > Similarly, this deserves more careful thought: is the difference the color or the fact that one which isn’t controlled by the phone companies is faster, more capable, more secure and cost less for years?

                        I believe it's self-evident. The appearance of the different UI itself, which clashes with Apple's otherwise uniform design scheme, presents its own inherent psychological friction. Certainly, non-iMessage texts do come with some degraded behavior in Messages: the way group chats get broken up, the inability to respond to messages. But none of that has anything to do with phones being faster, more capable, or more secure. The difference in and of itself is created by users drawing the distinction in their own minds.

                        Similarly, I believe that if Apple were to curate the sideloading experience, the majority of iPhone users could simply be guided away from it, and it would remain safely the province of developers and power users. Apple is simply too good at crafting user experiences not to be able to do that.

          • josephcsible 1054 days ago
            But the verification is obviously ineffective at stopping malware. If it were effective, then this wouldn't be a story.
            • mwint 1054 days ago
              Something can be simultaneously effective, and also fail at least once.
              • smoldesu 1054 days ago
                Sure, but as soon as you have that single failure, your system isn't flawless anymore. Apple's claims of comprehensive security are repeatedly proved wrong, and social engineering attacks like this indicate that their attack surface is larger than one might have initially realized.
                • reaperducer 1054 days ago
                  > Sure, but as soon as you have that single failure, your system isn't flawless anymore.

                  I've never seen Apple claim that its system is flawless. Do you have a link?

            • mh8h 1054 days ago
              The thing is that when the system is effective at stopping the malware, you don't get a signal about it. You only see the ones that somehow made it through. So it's not easy to gauge its effectiveness.
          • zamalek 1054 days ago
            To a layman? No difference at all. Remember that Apple claim to be the saviors of the ignorant.
        • tshaddox 1054 days ago
          > "Sideloading is dangerous because users could install malware" and "We'll let any iPhone sideload your app if you jump through some hoops and pay us" are incompatible statements, but Apple makes both of them.

          I'm missing how these are incompatible statements. You might as well say "if Apple thought certain apps could be dangerous they wouldn't have an App Store or even allow their own first-party apps on iPhones." Of course the position of Apple is that they should be able to approve every app/vendor before it's allowed to be installed on iPhones. The fact that they do approve some apps and vendors isn't incompatible with their position.

        • gumby 1054 days ago
          > The point I'm trying to make is that Apple isn't consistent here. If they actually believed this to be true there would be no way to get a sideloading cert that worked on all devices.

          Indeed. I'm (maybe) willing to sideload an app from my employer but not from some other company. Perhaps only managed devices should have this capability, and only for apps signed by the managing authority.

          I won't allow my employer to manage my personal phone; they have to issue me a phone if they want that kind of control. And in that case it's their device; they are welcome to manage it as they see fit.

          Also: this is different from TestFlight.

      • mcculley 1054 days ago
        The warning looks like this: https://support.apple.com/library/content/dam/edam/applecare...

        EDIT: Formerly I asked: What does the scary warning look like? Is there a screenshot of an example? I would like to show this to my employees and family and tell them never to trust such a cert.

        • joshstrange 1054 days ago
          Oops, just saw your edit. Well I just spent the last 10 minutes or so documenting the flow so I'll post it anyway: https://imgur.com/a/ofvfty8
          • mcculley 1054 days ago
            Thank you for documenting it!
        • dylan604 1054 days ago
          At least it doesn't have an "Accept Anyway" type of button. The only option is to cancel whatever was being attempted.
          • jtbayly 1054 days ago
            So how do you install it?
            • joshstrange 1054 days ago
              Here is the full flow: https://imgur.com/a/ofvfty8
              • dylan604 1054 days ago
                Who installs this kind of app, and what do these kinds of apps promise that makes it sound worth doing?
                • joshstrange 1054 days ago
                  I believe I remember reading about Google or FB having an app for employees to order food from the cafeteria, as well as apps for internal tools/build pipelines/etc. While some POS apps like Square are in the App Store, there are others that are distributed as enterprise apps. It allows companies to push updates faster when/if something goes wrong. I also know (at least in the past) NetJets used enterprise apps for their pilots; the apps had the flight manual, among other things, if I remember correctly.

                  Often it's custom or white labeled software that benefits from enterprise distribution.

                  EDIT: Also, if an organization is large enough, TestFlight might not be suitable to cover everything they need. For example, you might want nightly developer builds, weekly/milestone internal test builds, staging builds, and then TestFlight and release. You can accomplish this with multiple apps that are never released (My App - Dev, My App - QA), using TestFlight for each and only releasing the production "My App", but that has some sharp edges and requires everyone to be added to the Apple developer team.

                • bombcar 1054 days ago
                  If you're an enterprise, and have company-issued devices, you can use this to load company-specific apps onto your iPads/phones. Think ordering kiosk apps, stuff like that.

                  And since they're company devices, either IT can set them up before handing them out, or you can use tools to pre-install the certificates.

                  • dylan604 1054 days ago
                    If it's a company owned/provided device, then sure, I'll accept whatever scary warning the company says to. I will never accept that for a personal device. Ever.
                    • bombcar 1054 days ago
                      The problem is there are way too many "Scary Warnings™" out there, so anyone not completely versed in the technological world will just ignore them, especially if they believe someone from "Tech Support" is telling them to do so.

                      It's a moderately hard problem to resolve.

                      • dylan604 1054 days ago
                        If it's a company device, I don't care what happens to it. So if it's bad, then it's IT's problem ;P

                        If it's my own device, I don't have to worry about it because it'll never happen.

                        This kiosk-type stuff mentioned upthread could just as easily be run off a corporate LAN site that I access via my company-provided computing device. The fact that the younger generations have been forced to accept BYOD as a work device is just sad to me.

                    • saagarjha 1054 days ago
                      Enterprise apps are “fine” for BYOD; they’re far better than invasive MDM.
                • Clent 1054 days ago
                  There are video streaming apps being distributed via this "exploit".
      • Dagonfly 1054 days ago
        > but the app store review process makes it harder to sneak one by.

        Imo the key question is: If you can find an exploit in the iOS sandbox, will the app store review really stop you? Compared to the expertise required to find such an exploit, it should be pretty trivial to obfuscate it or load the payload remotely after install.
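
        A toy illustration of why a static string scan is weak (hypothetical, not from the sample in the post): the selector name is assembled at runtime, so the literal never appears in the binary for a scanner to match.

            import Foundation

            // The "suspicious" selector never shows up verbatim in the binary's
            // string table. Here the stand-in is the harmless "description";
            // swap in any private selector a scanner would flag and the
            // principle is identical.
            let selectorName = ["descr", "iption"].joined()
            let sel = NSSelectorFromString(selectorName)

            let obj = NSObject()
            if obj.responds(to: sel), let result = obj.perform(sel)?.takeUnretainedValue() {
                print(result)   // the object's description, resolved at runtime
            }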

        • duskwuff 1054 days ago
          > Imo the key question is: If you can find an exploit in the iOS sandbox, will the app store review really stop you?

          The App Store review is two-pronged: it includes some human QA testing, and it also includes automated screening of your executable, which checks for suspicious library imports or strings. (This sometimes trips up applications whose method names happen to match Apple's internal APIs.) While it isn't entirely impossible for a malicious application to slip past this examination, it's significantly harder than for an Enterprise application, which doesn't go through this process at all.

          • saagarjha 1054 days ago
            App Store Review’s automated scanning is quite trivial to get around; most iOS developers can either attest to doing so themselves or know someone who has.
      • jrochkind1 1054 days ago
        > And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.

        I believe you, but... so how does a "commercial spyware" company get one, and has it been cancelled by Apple after they see what they are up to?

        Or do their not-so-easy requirements... allow "commercial spyware" companies to fraudulently impersonate other apps with the cert?

        • LocalH 1052 days ago
          I know there have been several independent "signing services" driven by this type of enterprise cert, and usually when Apple finds out that they've been used like this, they'll revoke them. The only time I've personally run into this is when I was away from my computer for an extended period of time and needed to reinstall the on-device jailbreak app (which would have been unc0ver).
      • easrng 1054 days ago
        You can buy enterprise certs from other companies, that's how the stores like jailbreaks.fun (safe) or AppValley or AppCake (very questionable) get them.
      • usrn 1054 days ago
        There's malware on the App Store. The review process is completely incapable of catching it.
      • aaomidi 1054 days ago
        Well, people are being trained to install this sort of thing due to MDM.
      • tshaddox 1054 days ago
        I couldn't agree with you more. I certainly see some very valid arguments for why Apple should allow sideloading on iPhones, but I am baffled by this extremely common argument that goes "here's an example of Apple not being restrictive enough to protect people, therefore Apple shouldn't be restrictive at all."
        • nickff 1054 days ago
          You're observing 'motivated reasoning', where someone starts with the conclusion (Apple is wrong to disallow side-loading) and comes up with a way to support it.
    • rubynerd 1054 days ago
      It's next to impossible to get clearance for an Apple Developer Enterprise account unless you know someone at Apple. You need an Enterprise account to sign MDM certificates, so I applied; I've had that application open for over six months without hearing from them, and my first application was rejected after 10 months without any dialogue.

      With this article shining yet more negative light on the program after the Facebook/Google spying-on-the-internet-access-of-kids debacle effectively shut the Enterprise program down, the MDM space will be even harder to innovate in, considering no startup will ever meet the required bar for signing up for an Enterprise account.

    • blakesterz 1054 days ago
      > Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.

      While I wouldn't say that's exactly wrong, if this type of thing happened often, I would think we wouldn't see Google writing it up as interesting. Doesn't seeing this mean it's a rare and noteworthy event, and more evidence that iOS's limitations on sideloading do improve security? I'm not sure how often this happens, so I could be way off.

    • GeekyBear 1054 days ago
      > Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.

      This IS a case of using an enterprise certificate to sideload an app.

      If anything, this proves that sideloading, even with scary warnings to the users, enables malware.

    • bfgoodrich 1054 days ago
      > In reality it costs the bad guys $299 to bypass this limitation

      They also had to get verified as a mid-sized or larger corporation (Apple does use real verification partners), and further, Apple can pull the rug on the certificate in an instant, immediately invalidating that $299 certificate and the considerable effort that went into getting it.

      In reality this group likely had to hack an existing Apple Enterprise-approved business first, then use that as a springboard to the next step.

      Casually dismissing that enormous gate is pretty tenuous.

      • mike_d 1054 days ago
        I don't have any inside details on this case, but I highly suspect the signing certificate was stolen from a legitimate user.

        Organizations sophisticated enough to build something like this already target companies like device manufacturers to get kernel-driver signing certificates on Windows.

    • sandworm101 1054 days ago
      Considering how much Apple hardware costs, I find it difficult to believe that the average Apple user wouldn't be able to scrape together $299. That's about half the price of a budget iPhone in my area.
      • cmeacham98 1054 days ago
        The current-gen iPhone SE (i.e. what you'd likely buy if you wanted a budget iPhone) costs $429 here in the US. Additionally, you often can get an even better deal if you're willing to sign a contract with a carrier.

        Even if I take your prices at face value, I'm not sure "Apple charges 1/2 the price of the phone to unlock sideloading" is a killer argument.

      • Closi 1054 days ago
        We aren't talking about the average user - this is the licence for a medium-sized enterprise to write/develop its own apps.

        You need to be a verified business to get this.

  • miohtama 1054 days ago
    The exploit is quite complicated to pull together. Is there any chance that someone created it based on iOS sources? I assume NSO and similar actors would already have bought stolen source code.
    • rs_rs_rs_rs_rs 1054 days ago
      Be a competent reverse engineer and you have the sources to everything without the need to steal them.

      The people who build these things are just that good.

    • mike_d 1054 days ago
      You don't need stolen source, just knowledge.

      Grayshift, a company that makes devices that unlock iPhones for law enforcement, was started by two ex-Apple security engineers.

  • ajross 1054 days ago
    What's interesting to me is that, on its face, Apple's architecture was the right thing from the perspective of modern security thought: split out "driver" layers that can be driven (in a reasonably direct way) by untrusted code and put them on dedicated hardware. That way you're insulated from the Spectre/Meltdown et al. family of information leaks due to all the cached state on modern devices.

    Except the software architecture needed to make this happen turns out to be so complicated that it effectively opens up new holes anyway.

    (Also: worth noting that this is a rare example of an "Inverse Conway's Law" effect. A single design from a single organization had complicated internal interconnectivity, owing to the fact that it grew out of an environment with free internal communication. So an attempt to split it up turned into a mess. Someone should have come in and split the teams properly and written an interface spec.)

  • jrochkind1 1054 days ago
    > This sideloading works because the app is signed with an enterprise certificate, which can be purchased for $299 via the Apple Enterprise developer program.

    From the linked post, the actor is identified as a "commercial spyware" company.

    So... I'd like to assume that Apple has cancelled their enterprise cert and will refuse to sell them another after this abuse, right? Surely there are terms of service that forbid using an enterprise cert maliciously, to fraudulently pretend to be another app and trick users who are not part of your "enterprise"?

    Right? (cue Anakin and Padme meme).

    But seriously... will they?

    • rubatuga 1054 days ago
      They probably will cancel it, seeing as they cancelled Google's certificate at one point.
  • Szpadel 1054 days ago
    if that's so "easy" for enterprise to get side loading to work, why eg. epic games won't go that route to provide apps outside app store? am i missing something?
    • tech234a 1054 days ago
      Apple has a number of requirements that enterprises must meet [1] in order to be eligible for an enterprise distribution certificate. Apple can revoke the certificate at any time if they discover misuse, which would quickly become obvious if a large company such as Epic Games started publicly distributing their games this way.

      [1]: https://developer.apple.com/programs/enterprise/

    • saltspork 1054 days ago
      Apple revokes enterprise certs that it discovers being used to distribute apps to users outside of said enterprise.