        _______               __                   _______
       |   |   |.---.-..----.|  |--..-----..----. |    |  |.-----..--.--.--..-----.
       |       ||  _  ||  __||    < |  -__||   _| |       ||  -__||  |  |  ||__ --|
       |___|___||___._||____||__|__||_____||__|   |__|____||_____||________||_____|
                                                               on Gopher (unofficial)
 (HTM) Visit Hacker News on the Web
       
       
       COMMENT PAGE FOR:
 (HTM)   Oneplus phone update introduces hardware anti-rollback
       
       
        Aissen wrote 19 min ago:
         This does not surprise me from the company that accidentally
         deleted the Widevine L1 certificate on my phone (which never had
         any third-party OS) during an update, could not restore it, and
         would not replace the motherboard (which it claimed was the only
         possible fix).
       
        veunes wrote 3 hours 38 min ago:
        If this becomes the norm, it effectively ends the idea that you own the
        hardware you paid for
       
        Oxodao wrote 6 hours 22 min ago:
         Oneplus went to shit after the 6. Pretty sad; they used to be a
         great brand...
       
        neals wrote 7 hours 18 min ago:
        How does an eFuse even work?
       
        cmxch wrote 11 hours 57 min ago:
        So OnePlus is no better than the rest of the pack.
       
        direwolf20 wrote 12 hours 10 min ago:
        I thought they were the one okay manufacturer. Guess not.
       
        jnwatson wrote 13 hours 26 min ago:
        So much ignorance in this thread. There's nothing new here. All
        manufacturers worth their salt have this feature.
        
        This is ultimately about making the device resistant to downgrade
        attacks.  This is what discourages thieves from stealing your phone.
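        
         The scheme jnwatson describes can be sketched in a few lines. This
         is hypothetical illustration code, not OnePlus's or Qualcomm's
         implementation: one-time-programmable fuse bits can only go from 0
         to 1, so the minimum allowed firmware version can be stored as a
         unary counter that is raised on update but can never be lowered.

```python
# Hypothetical model of OTP-fuse-based anti-rollback (illustration only).
class Qfprom:
    """Simulated one-time-programmable fuse bank: bits set, never cleared."""
    def __init__(self):
        self.bits = 0

    def blow(self, index: int):
        self.bits |= 1 << index  # irreversible on real hardware

    def rollback_floor(self) -> int:
        # The floor is stored in unary: the number of blown fuses.
        return bin(self.bits).count("1")

def boot(fuses: Qfprom, image_rollback_index: int):
    """Refuse any image older than the fused floor, then raise the floor."""
    if image_rollback_index < fuses.rollback_floor():
        raise RuntimeError("rollback detected: refusing to boot")
    for i in range(fuses.rollback_floor(), image_rollback_index):
        fuses.blow(i)
```

         In this toy model a rollback attempt merely refuses to boot;
         commenters elsewhere in the thread report the OnePlus update goes
         further, with the downgrade attempt bricking the device.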
       
          concinds wrote 13 hours 0 min ago:
           I've been dismayed by how quickly the "we should own our
           hardware" crowd has radicalized into "all security features are
           evil" and "no security features should exist for anyone".
          
          Not just "there should be some phone brands that cater to me", but
          "all phone brands, including the most mainstream, should cater to me,
          because everyone on earth cares more about 'owning their hardware'
          than evil maid attack prevention, Cellebrite government surveillance,
          theft deterrence, accessing their family photos if they forget their
          password, revocable code-signing with malware checks so they don't
          get RATs spying on their webcam, etc, and if they don't care about
          'owning their hardware' more than that, they are wrong".
          
          It is objectively extremist and fanatical.
       
            userbinator wrote 11 hours 17 min ago:
            Given how the opposition has radicalized into "you should own
            nothing and be happy", it's not surprising.
            
            None of the situations you mentioned are realistic or even worth
            thinking about for the vast majority of the population. They're
            just an excuse to put even more control into the manufacturer's
            hands.
       
            bri3d wrote 11 hours 43 min ago:
            I’ve posted about this on HN before; I think that there’s a
            dangerous second-order enshittification going on where people are
            so jaded by a few bad corporate actions that they believe that
            everyone is out to get them and hardware is evil. The most
            disappointing thing to me is that this has led to a complete
            demolition of curiosity; rather than learning that OTP is an
            ancient and essential concept in hardware, the
            brain-enshittification has led to “I see hardware anti-*, I click
            It’s Evil” with absolutely no thought or research applied.
       
            ShroudedNight wrote 11 hours 46 min ago:
            "No security features should exist for anyone" is itself
            fanatically hyperbolic narrative. The primary reason this event has
            elicited such a reaction is because OnePlus has historically been
            perceived as one of the brands specifically catering to people that
            wanted ultimate sovereignty over their devices.
            
            As time goes on, the options available for those that require such
            sovereignty seem to be thinning to such an extent that [at least
            absent significant disposable wealth] the remaining options will
            appear to necessitate adopting lifestyle changes comparable to
            high-cost religious practices and social withdrawal, and likely
            without the legal protections afforded those protected classes.
            Given the "big tech's" general hostility to user agency and
            contempt for values that don't consent to being subservient to its
             influence peddling, an intense emotional reaction to the loss
             of already diminished traditional allies seems like something
             that would reasonably be viewed with compassion rather than
             hostility.
       
          foxes wrote 13 hours 20 min ago:
           How is GrapheneOS considered the most secure phone OS when you
           can still flash new firmware onto it?
          
          I don't care if they can downgrade the device, just that I boot into
          a secure verified environment, and my data is protected.
          
           I also think thieves will just grab your phone regardless; they
           can still sell it for parts, or just sell it anyway as a scam,
           etc.
       
            jnwatson wrote 11 hours 29 min ago:
            The attack is simple:  the attacker downgrades the phone to a
            version of firmware that has a vulnerability.  The attacker then
            uses the vulnerability to get at your data.  Your data is
            PIN-protected? The attacker uses the vulnerability to disable the
            PIN lockout and tries all of them.
            
            There's over a 10x difference in fence price between a locked and
            unlocked phone. That's a significant incentive/deterrent.
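            
             The lockout-removal step is the crux. A toy model (hypothetical
             code, not any real phone's implementation) of why: a 4-digit
             PIN has only 10,000 combinations, so once the rate limiting is
             gone an attacker can simply enumerate them all.

```python
import hashlib

def make_verifier(pin: str, salt: bytes = b"demo-salt"):
    """Toy PIN check. On a real phone the check and its retry limit live
    in a secure element; this model has no retry limit at all."""
    digest = hashlib.sha256(salt + pin.encode()).hexdigest()
    return lambda guess: hashlib.sha256(salt + guess.encode()).hexdigest() == digest

def brute_force(verify):
    """With no lockout, just try all 10,000 four-digit PINs."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if verify(guess):
            return guess
    return None
```

             Real devices enforce the retry limit (and escalating delays)
             below the OS, which is exactly what the downgrade attack
             described above aims to bypass.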
       
              foxes wrote 10 hours 15 min ago:
              Don't pixels have a security chip that is supposed to make that
              infeasible?
              
               It has some increasing timer for auth, and if you try to
               factory reset it - it destroys all the data?
              
               As I said, it's less important that the thief can boot a new
               OS; the security of my data is more important. How is that
               compromised?
              
              It feels like a thief is just going to opportunistically grab a
              phone from you rather than analyse what device it is.
       
        abhaynayar wrote 13 hours 38 min ago:
         Damn, I just saw that update yesterday on my phone and, for no
         particular reason, did not install it. I've turned off auto-update
         for now until I figure out what to do.
       
        zb3 wrote 14 hours 13 min ago:
        It's Google's fault. I want to buy a smartphone without AVB at all.
        With no "secure boot" fuse blown (yes I DO know that this is not the
        same fuse) and ideally I'd want to provision my own keys.
        
        But vendors wouldn't be able to say the device runs "Android" as it's
        trademarked. AVB is therefore mandatory and in order for AVB to be
        enforced, you can't really control the device - unlocking the
        bootloader gives you only partial control, you can't flash your own
        "abl" to remove AVB entirely.
        
         But I don't want AVB, and I can't buy such a device for any money.
         This isn't a free market, this is a Google monopoly.
       
          digiown wrote 12 hours 10 min ago:
          The closest thing you can get is probably the Pixel, ironically. You
          can provision your own keys, enroll it into AVB, and re-lock the
          bootloader. From the phone hardware's perspective there is no
          difference between your key and Google's. No fuse is ever blown.
       
            zb3 wrote 11 hours 57 min ago:
            That's not really true, there will be a warning shown that "the
            phone is loading a different operating system" - I've seen that
            when installing GrapheneOS on my pixel.
            
            But it's not just about that, it's about the fact that I can't
            flash my own "abl" or the software running in the TrustZone there
            at all as I don't control the actual signing keys (not
             custom_avb_key) and I'm not "trusted" by my own device. There
             were fuses blown, as is evident from examining abl's fastboot
             commands - many refuse to work, saying they can't be used on a
             "production device". Plus many of those low-level partitions
             are closed-source proprietary blobs.
            
            Yes yes - I DO understand that for most people this warning is
            something positive, otherwise you could buy a phone with modified
            software without realizing it and these modifications could make it
            impossible to restore the original firmware.
       
              digiown wrote 11 hours 48 min ago:
               Ah, I forgot about the warning. Are the blown fuses you're
               talking about related to your unlocking, though? Or did they
               just remove the debug functions? I guess it reduces the
               attack surface somewhat.
              
               I do agree it's far from ideal though. But there are so many
               much worse offenders that use these fuses to actually remove
              features, and others that do not allow installing a different OS
              at all. The limited effort should probably be spent on getting
              rid of those first.
       
                zb3 wrote 11 hours 31 min ago:
                 I'm not sure I'd agree with your last conclusion. We as
                 consumers can choose what to buy, so for me the situation where
                there's one brand that produces open devices (with competing
                specs, not like pinephone..) where I could install
                postmarketos/ubuntu touch without any parts of android would be
                better than there being many brands producing smartphones
                allowing only basic unlocking and without open firmware.
                
                 Of course there are bigger problems in the ecosystem, like Play
                 Integrity, which actively attempts to punish me for buying open
                hardware. Unfortunately that's the consequence of putting
                "trusted" applications where they IMO don't belong - there are
                smartcards with e-ink displays and these could be used for
                things like banking confirmations, providing the same security
                but without invading my personal computing devices. But thanks
                to Android and iOS, banks/governments went for the anti-user
                option.
       
        peterhon wrote 14 hours 49 min ago:
         Unfortunately, similar things will be mandated by EU law through
         the Cyber Resilience Act (CRA) in order to ensure tamper-free boot
         of any kind of device sold in the EU from Dec 2027.
        
         Basically breaking any kind of FOSS or repairability, and creating
         dead HW bricks if the vendor ceases to maintain them or to exist.
       
          veunes wrote 3 hours 31 min ago:
          What's worrying isn't the CRA itself, but that companies may use it
          as cover to lock things down more than necessary
       
          utopiah wrote 7 hours 54 min ago:
          Shouldn't the EU then escrow keys?
       
        MarkusWandel wrote 14 hours 57 min ago:
         That's insane. If the CPU has enough fuses (which according to the
         wiki it does), why the h*ck can't they simply refuse to flash
         anything older than the minimum previously installed version of
         the OS? Why the hard brick?
       
        geor9e wrote 15 hours 15 min ago:
         This has been a commonplace feature on SoCs for a decade or two
         now. The comments seem to be taking this headline as
         out-of-the-ordinary news, phrased as if Oneplus invented it. Even
        cheapo devices often use an eFuse as anti-rollback. We do it at my work
        whenever root exploits are found that let you run unsigned code. If we
        don't blow an eFuse, then those security updates can just be undone,
        since any random enemy with hardware access could plug in a USB cable,
        flash the older exploitable signed firmware, steal your personal data,
        install a trojan, etc. I get the appeal of ROMs/jailbreaking/piracy but
        it relies on running obsolete exploitable firmware. It's not like
        they're forcing anyone to install the security patch who doesn't want
        it. This is normal.
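        
         The key point in the comment above is that a signature check proves
         an image is genuine, not that it is current, so an old signed build
         with a known exploit still verifies. A toy illustration (HMAC
         standing in for the vendor's real signature scheme):

```python
import hashlib
import hmac

VENDOR_KEY = b"not-a-real-key"  # stands in for the vendor's signing key

def sign(payload: bytes) -> bytes:
    return hmac.new(VENDOR_KEY, payload, hashlib.sha256).digest()

def verify_signature(payload: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(payload), sig)

old_fw = b"firmware v1 (exploitable)"
new_fw = b"firmware v2 (patched)"
old_sig, new_sig = sign(old_fw), sign(new_fw)

# Both images verify: a signature proves authenticity, not freshness.
# Without a fused rollback counter, nothing stops reflashing old_fw.
assert verify_signature(old_fw, old_sig)
assert verify_signature(new_fw, new_sig)
```

         Blowing an eFuse on update is what adds the missing freshness
         check: the old image still carries a valid signature, but its
         rollback index now falls below the fused minimum.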
       
          veunes wrote 3 hours 34 min ago:
          On most devices, anti-rollback means "older firmware won't boot" or
          "you lose secure features." Here it seems to mean "try it and you
          permanently brick the device," with no warning in the updater and no
          public statement explaining the change
       
          nirui wrote 7 hours 29 min ago:
          > any random enemy with hardware access could plug in a USB cable,
          flash the older exploitable signed firmware, steal your personal
          data, install a trojan, etc
          
           A lot of my phones stopped receiving firmware updates long ago;
           the manufacturer simply stopped providing them. The only way to
           safely use them is to install custom firmware that still
           addresses the problems, and this eFuse thing can be used to
           prevent custom firmware.
          
           This eFuse is part of the plot to prevent users from accessing
           open source firmware, it's just that. Your "user safety" jargon
           cannot confuse people anymore, after all that people (at least
           the smart few) have learned over the years.
       
            zozbot234 wrote 6 hours 40 min ago:
            > and this eFuse thing can be used to prevent custom firmware.
            
            This is not what's happening here, though.
       
          troyvit wrote 8 hours 44 min ago:
          > since any random enemy with hardware access
          
           Once they have hardware access, who cares? They either access my
           data or throw it in a lake. Either way the phone is gone, and
           I'd better have had a good data backup and a level of encryption
           I'm comfortable with.
          
           This not only makes it impossible to install your own ROMs, but
           permanently bricks the phone if you try. That is not a choice my
           hardware provider should ever get to make.
          
          It's just another nail in the coffin of general computing, one more
          defeat of what phones could have been, and one more piece of personal
          control that consumers will be all too happy to give up because of
          convenience.
       
          g947o wrote 12 hours 54 min ago:
          Sounds like that should be an option in "Developer Options" that
          defaults to true, and can only be disabled after re-authentication /
          enterprise IT authorization. I don't see anything lost for the user
          if it were done this way.
       
          palijer wrote 13 hours 51 min ago:
          It ain't normal to me. If I bought a phone, I should be able to
          decide that I want to run different software on it.
          
           Let's say OP takes a very different turn with their software that
           I am not comfortable with - say, reporting my usage data to a
           different country. I should be able to say "fuck that upgrade,
           I'm going to run the software that was on my phone when I
           originally bought it".
          
          This change blocks that action, and from my understanding if I try to
          do it, it bricks my phone.
       
            jnwatson wrote 13 hours 29 min ago:
             The whole point of this is that when someone steals your phone,
             they can't install an older vulnerable version of the firmware
             that can be used to set it back to factory settings, which
             makes it far more valuable for resale.
       
              echelon wrote 10 hours 45 min ago:
              I'm fine with a total loss of hardware. I'd rather the hardware
              do what I want. I own it.
       
              palijer wrote 11 hours 31 min ago:
               Phone thieves aren't checking which phone brand I have before
               they nick my phone. Your scenario is not improved by making
               Oneplus phones impossible to use once they're stolen.
       
                creato wrote 11 hours 12 min ago:
                It reduces the expected value of stealing a phone, which
                reduces the demand for stolen phones.
       
                  AnthonyMouse wrote 7 hours 43 min ago:
                  > It reduces the expected value of stealing a phone, which
                  reduces the demand for stolen phones.
                  
                  It's not at all obvious that this is what happens. To begin
                  with, do you regard the average phone thief as someone who
                  even knows what expected value is?
                  
                  They want drugs so they steal phones until they get enough
                  money to buy drugs. If half the phones can't be resold then
                  they need to steal twice as many phones to get enough money
                  to buy drugs; does that make phone thefts go down or up?
                  
                  On top of that, the premise is ridiculous. You don't need to
                  lock the boot loader or prevent people from installing third
                  party software to prevent stolen phones from being used. Just
                  establish a registry for the IMEI of stolen phones so that
                  carriers can consult the registry and refuse to provide
                  service to stolen phones.
                  
                  It's entirely unrelated to whether or not you can install a
                  custom ROM and is merely being used as an excuse because
                  "prevent theft somehow" sounds vaguely like a legitimate
                  reason when the actual reason of "prevent competition" does
                  not.
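                  
                   The registry idea sketched as toy code (real-world
                   analogues, such as the GSMA's shared IMEI blocklist,
                   already operate in many countries):

```python
# Toy model of a shared stolen-device registry consulted by carriers.
# This is independent of bootloader locking: the phone still boots,
# it just gets no network service.
stolen_imeis = set()

def report_stolen(imei: str) -> None:
    stolen_imeis.add(imei)

def allow_service(imei: str) -> bool:
    return imei not in stolen_imeis
```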
       
                  palijer wrote 10 hours 43 min ago:
                   I find it hard to believe that Oneplus is spending
                   engineering and business resources, upsetting a portion
                   of their own userbase, and creating more e-waste because
                  want to reduce the global demand for stolen phones. They only
                  have like 3% of the total market, they can't realistically
                  move that needle.
                  
                  I don't understand what business incentives they would have
                  to make "reduce global demand for stolen phones" a goal they
                  want to invest in.
       
                    charcircuit wrote 10 hours 3 min ago:
                    This is a security feature from Qualcomm. So there is
                    little of their own time spent on this.
       
                      ValdikSS wrote 7 hours 57 min ago:
                      And it is a SoC requirement for Android certification.
       
              QuiEgo wrote 13 hours 8 min ago:
              It'd be ideal if the phone manufacturer had a way to delegate
              trust and say "you take the risk, you deal with the consequences"
              - unlocking the bootloader used to be this. Now we're moving to
              platforms treating any unlocked device as uniformly untrusted,
              because of all of the security problems your untrusted device can
              cause if they allow it inside their trust boundary.
              
               We can't have nice things because bad people abused it :(.
              
              Realistically, we're moving to a model where you'll have to have
              a locked down iPhone or Android device to act as a trusted device
              to access anything that needs security (like banking), and then a
              second device if you want to play.
              
              The really evil part is things that don't need security (like
              say, reading a website without a log in - just establishing a TLS
              session) might go away for untrusted devices as well.
       
                fc417fc802 wrote 9 hours 34 min ago:
                 > We can't have nice things because bad people abused it :(.
                
                You've fallen for their propaganda. It's a bit off topic from
                the Oneplus headline but as far as bootloaders go we can't have
                nice things because the vendors and app developers want control
                 over end users. The Android security model is explicit that the
                user, vendor, and app developer are each party to the process
                and can veto anything. That's fundamentally incompatible with
                my worldview and I explicitly think it should be legislated out
                of existence.
                
                The user is the only legitimate party to what happens on a
                privately owned device. App developers are to be viewed as
                potential adversaries that might attempt to take advantage of
                you. To the extent that you are forced to trust the vendor they
                have the equivalent of a fiduciary duty to you - they are
                ethically bound to see your best interests carried out to the
                best of their ability.
       
                  QuiEgo wrote 9 hours 20 min ago:
                  > That's fundamentally incompatible with my worldview and I
                  explicitly think it should be legislated out of existence.
                  
                  The model that makes sense to me personally is that private
                  companies should be legislated to be absolutely clear about
                  what they are selling you. If a company wants to make a
                  locked down device, that should be their right. If you don't
                  want to buy it, that's your absolute right too.
                  
                  As a consumer, you should be given the information you need
                  to make the choices that are aligned with your values.
                  
                  If a company says "I'm selling you a device you can root",
                  and people buy the device because it has that advertised,
                  they should be on the hook to uphold that promise. The nasty
                  thing on this thread is the potential rug pull by Oneplus,
                  especially as they have kind of marketed themselves as the
                  alternative to companies that lock their devices down.
       
                    fc417fc802 wrote 8 hours 55 min ago:
                    I don't entirely agree but neither would I be dead set
                     against such an arrangement. Consider that, while
                     private banks are free not to do business with you, in
                     civilized countries there is at least a
                     government-associated bank that will always do
                     business with anyone.
                    Mobile devices occupy a similar space; there would always
                    need to be a vendor offering user controllable devices. And
                    we would also need legal protections against app authors
                    given that (for example) banking apps are currently picking
                    and choosing which device configurations they will run on.
                    
                     I think it would be far simpler and more effective to
                     outlaw vendor controlled devices. Note that wouldn't
                    prevent the existence of some sort of opt-in key escrow
                    service where users voluntarily turn over control of the
                    root of trust to a third party (possibly the vendor
                    themselves).
                    
                    You can already basically do this on Google Pixel devices
                    today. Flash a custom ROM, relock the bootloader, and
                    disable bootloader unlocking in settings. Control of the
                    device is then held by whoever controls the keys at the
                    root of the flashed ROM with the caveat that if you can log
                    in to the phone you can re-enable bootloader unlocking.
       
                charcircuit wrote 10 hours 1 min ago:
                >and then a second device if you want to play.
                
                With virtualization this could be done with the same device.
                The play VM can be properly isolated from the secure one.
       
                  fc417fc802 wrote 9 hours 40 min ago:
                  How is that supposed to fix anything if I don't trust the
                  hypervisor?
                  
                  It's funny, GP framed it as "work" vs "play" but for me it's
                  "untrusted software that spies on me that I'm forced to use"
                  vs "software stack that I mostly trust (except the firmware)
                  but BigCorp doesn't approve of".
       
                    charcircuit wrote 9 hours 12 min ago:
                     Then yes, you will need another device. Same if you
                     don't trust the processor.
       
                      fc417fc802 wrote 8 hours 36 min ago:
                      > Same if you don't trust the processor.
                      
                      Well I don't entirely, but in that case there's even less
                      of a choice and also (it seems to me) less risk. The OEM
                      software stack on the phone is expected to phone home. On
                      the other hand there is a strong expectation that a CPU
                      or southbridge or whatever other chip will not do that on
                      its own. Not only would it be much more technically
                      complex to pull off, it should also be easy to confirm
                      once suspected by going around and auditing other
                      identical hardware.
                      
                      As you progress down the stack from userspace to OS to
                      firmware to hardware there is progressively less
                      opportunity to interact directly with the network in a
                      non-surreptitious manner, more expectation of isolation,
                      and it becomes increasingly difficult to hide something
                      after the fact. On the extreme end a hardware backdoor is
                      permanently built into the chip as a sort of physical
                      artifact. It's literally impossible to cover it up after
                      the fact. That's incredibly high risk for the
                      manufacturer.
                      
                      The above is why the Intel ME and AMD PSP solutions are
                      so nefarious. They normalize the expectation that the
                      hardware vendor maintains unauditable, network capable,
                      remotely patchable black box software that sits at the
                      bottom of the stack at the root of trust. It's literally
                      something out of a dystopian sci-fi flick.
       
        plutokras wrote 15 hours 28 min ago:
        Nintendo has been doing this for ages.
        
 (HTM)  [1]: https://news.ycombinator.com/item?id=30773214
       
        InsomniacL wrote 15 hours 30 min ago:
        Does intentionally physically damaging a device fall foul of any laws
        that a software restriction otherwise wouldn't?
       
        poizan42 wrote 15 hours 31 min ago:
        Does anyone know if it has been confirmed that this only applies to the
        "ColorOS" branded firmware versions? Because I currently have an update
        to OxygenOS  16.0.3.501 pending on my OnePlus 15, which is presumably
        built from the same codebase.
        
        Edit: It seems that this does apply to OxygenOS too:
        
 (HTM)  [1]: https://xdaforums.com/t/critical-warning-coloros-16-0-3-501-up...
       
        piskov wrote 15 hours 45 min ago:
         So that's how, in the event of war, US adversaries will be
         relieved of their devices.
        
        > The anti-rollback mechanism uses Qfprom (Qualcomm Fuse Programmable
        Read-Only Memory), a region on Qualcomm processors containing one-time
        programmable electronic fuses.
        
         What nice, thoughtful people to build such a feature.
        
         That's why you sanction the hell out of Chinese Loongson or
         Russian Baikal CPUs, pitiful as they are - they're harder to
         disable than by programmatically "blowing a fuse".
       
          ValdikSS wrote 8 hours 14 min ago:
          Baikal definitely has anti-rollback, and Loongson should have it too.
          That's a common feature.
          
           As for efuses, they are present essentially everywhere - in any
           SoC and microcontroller. They are usually used to store secrets
           (keys) and for chip configuration.
          
           The linked wiki article is written in a way that the reader
           might assume that OnePlus did something wrong, unique, or
           anti-consumer. Quite the contrary: OnePlus issued updated
           official firmware which burned the anti-rollback bit to prevent
           older vulnerable official firmware from being installed. Either
           a new bootloader-level vulnerability has been found, or some
           kind of bootloader-level secret has leaked from OnePlus, with
           which an attacker could gain access to the smartphone's data.
           By this update, OnePlus secured the data of smartphone owners
           again.
          
           You can still unlock the bootloader and install custom firmware
           (with a bumped anti-rollback version in the firmware metadata, I
           guess; that would require newer custom firmware, or a
           recompilation/header modification for older ones). A device with
           custom firmware installed won't receive the official firmware
           update to begin with, so it cannot be bricked by it.
       
          UltraSane wrote 10 hours 32 min ago:
          This is absurdly paranoid with absolutely zero evidence. For embedded
          and mobile threat models where physical access or bootloader unlock
          is possible, eFuses are effectively mandatory for robust downgrade
             prevention.
       
            fc417fc802 wrote 9 hours 5 min ago:
            Agreed that robust downgrade prevention is necessary. However it's
            not paranoid at all and the problem isn't limited to eFuses. A
            network connected device that the vendor ultimately controls is a
            device that can be remotely disabled at the vendor's whim. It's
            like a hardware backdoor except it's out in the open and much more
            capable.
       
          KennyBlanken wrote 12 hours 34 min ago:
          This has been going on for a long, long time. Motorola used to make
          Android phones that would burn an efuse in the SoC if it thought it
          was being rooted or jailbroken, bricking the phone.
       
          QuiEgo wrote 13 hours 26 min ago:
          OTP memory is a key building block of any secure system and likely on
          any device you already have.
          
          Any kind of device-unique key is likely rooted in OTP (via a seed or
          PUF activation).
          
          The root of all certificate chains is likely hashed in fuses to
          prevent swapping out cert chains with a flash programmer.
          
          It's commonly used for anti-rollback as well - the biggest news
          here is that they didn't have this already.
          
          If there's some horrible security bug found in an old version of
          their software, they have no way to stop an attacker from loading up
          the broken firmware to exploit your device? That is not aligned with
          modern best practices for security.
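          
          The "root of the cert chain hashed in fuses" idea can be sketched
          as follows (hypothetical key material; real SoCs compare a hash
          burned at manufacturing against the key read from flash at boot):

```python
import hashlib

# Hypothetical: the SoC stores only a SHA-256 of the vendor's root public
# key in fuses; the key itself lives in replaceable flash. Swapping the
# key with a flash programmer then fails this check at boot.
FUSED_ROOT_KEY_HASH = hashlib.sha256(b"vendor-root-public-key").digest()


def root_key_trusted(root_key_from_flash: bytes) -> bool:
    return hashlib.sha256(root_key_from_flash).digest() == FUSED_ROOT_KEY_HASH


assert root_key_trusted(b"vendor-root-public-key")
assert not root_key_trusted(b"attacker-root-public-key")
```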
       
            mrsssnake wrote 12 hours 49 min ago:
            > they have no way to stop an attacker from loading up the broken
            firmware to exploit your device
            
            You mean an attacker with physical access to the device plugging
            in some USB or UART, or a hacker who downgrades the firmware to
            a version with a known exploit so they can use it?
       
              mschuster91 wrote 12 hours 8 min ago:
              > You mean the attacker having a physical access to the device
              plugging in some USB or UART
              
              ... which describes US border controls, or police in general.
              Once "law enforcement" becomes part of one's threat model, the
              balance of a lot of trade-offs suddenly changes.
       
              QuiEgo wrote 12 hours 30 min ago:
              Sure. Or the supply chain attacker (who is perhaps a state-level
              actor if you want to think really spicy thoughts) selling you a
              device on Amazon you think is secure, that they messed with when
              it passed through their hands on its way to you.
       
                c22 wrote 10 hours 41 min ago:
                The state level supply chain attacker can just replace the
                entire chip, or any other part of the product. No amount of
                technical wizardry can prevent this.
       
                  QuiEgo wrote 10 hours 25 min ago:
                  Modern devices try to prevent this by cryptographically
                  entangling the firmware on the flash to the chip - e.g.
                  encrypting it with a device-unique key from a PUF. So if you
                  replace the chip, it won't be able to decrypt the firmware on
                  flash or boot.
                  
                  The evil of the type of attack here is that the firmware with
                  an exploit would be properly signed, so the firmware update
                  systems on the chip would install it (and encrypt it with the
                  PUF-based key) unless you have anti-rollback.
                  
                  Of course, with a skilled enough attacker, anything is
                  possible.
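                  
                  A sketch of that entanglement (stand-in crypto: a SHA-256
                  keystream models the cipher and the "PUF" is just a
                  derived key here; real PUFs derive keys from physical
                  silicon variation):

```python
import hashlib


def puf_key(chip_id: bytes) -> bytes:
    # Stand-in for a device-unique PUF-derived key (illustration only).
    return hashlib.sha256(b"puf:" + chip_id).digest()


def xor_keystream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration only -- not real cryptography.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))


firmware = b"signed firmware image"
on_flash = xor_keystream(puf_key(b"chip-A"), firmware)           # sealed at install
assert xor_keystream(puf_key(b"chip-A"), on_flash) == firmware   # same chip boots
assert xor_keystream(puf_key(b"chip-B"), on_flash) != firmware   # swapped chip fails
```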
       
          RobotToaster wrote 13 hours 55 min ago:
          >That’s why you sanction the hell out of Chinese Loongson or
          Russian Baikal
          
          I assume that's also why China is investing so heavily into open
          source risc-v
       
          nippoo wrote 14 hours 23 min ago:
          eFuses have been a thing forever on almost all MCUs/processors, and
          aren't some inherently "evil" technology - mostly they're used in
          manufacturing when you might have the same microcontroller/firmware
          on separate types of boards. I'm working on a board right now which
          is either an audio input or an output (depending on which components
          are fitted) and one or the other eFuse is burned to set which one it
          is, so subsequent firmware releases won't accidentally set a GPIO as
          an output rather than an input and potentially damage the device.
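          
          That pattern can be sketched like this (hypothetical fuse values
          and names; the point is simply that firmware consults the burned
          variant before driving a shared pin):

```python
# Hypothetical variant fuses for the two board builds described above.
VARIANT_UNPROGRAMMED = 0b00
VARIANT_AUDIO_INPUT = 0b01
VARIANT_AUDIO_OUTPUT = 0b10


def audio_pin_direction(variant_fuse: int) -> str:
    """Pick the GPIO direction from the variant fuse burned at the factory,
    so an output build can never drive a pin that is an input on the other
    board variant."""
    if variant_fuse == VARIANT_AUDIO_INPUT:
        return "input"
    if variant_fuse == VARIANT_AUDIO_OUTPUT:
        return "output"
    # Unprogrammed or corrupt fuse: refuse to drive the pin at all.
    raise RuntimeError("unknown board variant; leaving pin high-impedance")


assert audio_pin_direction(VARIANT_AUDIO_INPUT) == "input"
assert audio_pin_direction(VARIANT_AUDIO_OUTPUT) == "output"
```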
       
            direwolf20 wrote 12 hours 42 min ago:
            Isn't this normally done with a GPIO bootstrap?
       
              QuiEgo wrote 11 hours 18 min ago:
              It depends. Usually there are enough "knobs" that adding that
              many balls to the package would be crazy expensive at volume.
              
              Most SoCs of even moderate complexity have lots of redundancy
              built in for yield management (e.g. anything with RAM expects
              some % of the RAM cells to be dead on any given chip), and uses
              fuses to keep track of that. If you had to have a strap per RAM
              block, it would not scale.
       
          Muromec wrote 15 hours 38 min ago:
          This kind of thing is generally used to disallow downgrading the
          bootloader once there is a bug in chain of trust handling of the
          bootloader. Otherwise once broken is forever broken. It makes sense
          from the trusted computing perspective to have this. It's not even
          new; it was already there on P2K Motorolas 25 years ago.
          
          You may not want trusted computing and root/jailbreak everything as a
          consumer, but building one is not inherently evil.
       
            wolvoleo wrote 14 hours 24 min ago:
            Trusted computing means trusted by the vendor and content
            providers, not trusted by the user. In that sense I consider it
            very evil.
       
              UltraSane wrote 10 hours 28 min ago:
              Pre-TC mobile/embedded security was catastrophic:
              
                Persistent bootkits trivial to install
                No verified boot chain
                Firmware implants survived OS reinstalls
                No hardware-backed key storage
                Encryption keys extractable via JTAG/flash dump
              
              Modern Secure Boot + hardware-backed keystore + eFuse
              anti-rollback eliminated entire attack classes. The median user's
              security posture improved by orders of magnitude.
       
                michaelmrose wrote 9 hours 56 min ago:
                Did this ever affect real users?
       
                  QuiEgo wrote 9 hours 14 min ago:
                  Yes. See attacks like Pegasus.
       
                  fc417fc802 wrote 9 hours 16 min ago:
                  Arguably yes. By preventing entire classes of attack real
                  users are never exposed to certain risks in the first place.
                  If it were possible it would be abused at some rate (even if
                  that rate were low).
                  
                  It's not that trusted computing is inherently bad. I actually
                  think it's a very good thing. The problem is that the
                  manufacturer maintains control of the keys when they sell you
                  a device.
                  
                  Imagine selling someone a house that had smart locks but not
                  turning over control of the locks to the new "owner". And
                  every time the "owner" wants to add a new guest to the lock
                  you insist on "reviewing" the guest before agreeing to add
                  him. You insist that this is important for "security" because
                  otherwise the "owner" might throw a party or invite a drug
                  dealer over or something else you don't approve of. But don't
                  worry, you are protecting the "owner" from malicious third
                  parties hiding in plain sight. You run thorough background
                  checks on all applicants after all!
       
              charcircuit wrote 14 hours 7 min ago:
              If the user doesn't trust an operating system, why would they use
              it? The operating system can steal sensitive information. Trusted
              computing is trusted by the user to the extent that they use the
              device. For example if they don't trust it, they may avoid
              logging in to their bank on it.
       
                LoganDark wrote 11 hours 10 min ago:
                To trust an Android device, I need to have ultimate authority
                over it. That means freedom to remove functionality I don't
                like and make changes apps don't like. Otherwise, there are
                parts of practically every Android that I don't approve of,
                like the carrier app installer, any tracking/telemetry, most
                preinstalled apps, etc.
                
                I recently moved to Apple devices because they use trusted
                computing differently; namely, to protect against platform
                abuse, but mostly not to protect corporate interests. They also
                publish detailed first-party documentation on how their
                platforms work and how certain features are implemented.
                
                Apple jailbreaking has historically also had a better UX than
                Android rooting, because Apple platforms are more trusted than
                Android platforms, meaning that DRM protection, banking apps
                and such will often still work with a jailbroken iOS device,
                unlike most rooted Android devices. With that said though, I
                don't particularly expect to ever have a jailbroken iOS device
                again, unfortunately.
                
                Apple implements many more protections than Android at the OS
                level to prevent abuse of trusted computing by third-party
                apps, and give the user control. (Though some Androids like,
                say, GrapheneOS, implement lots that Apple does not.)
                
                But of course all this only matters if you trust Apple. I trust
                them less than I did, but to me they are still the most
                trustworthy.
       
                  charcircuit wrote 10 hours 33 min ago:
                  >to protect against platform abuse, but mostly not to protect
                  corporate interests
                  
                  What do you mean by this? On both Android and iOS app
                  developers can have a backend that checks the status of app
                  attestation.
       
                    LoganDark wrote 2 hours 39 min ago:
                    "App attestation" means different things for Android than
                    for iOS. On iOS, it verifies the app was installed from the
                    right place. On Android, it tries to check if the device is
                    tampered with, or hasn't been fully certified by Google, or
                    etc... Android's far more finicky because Google uses this
                    process to crack down on OEMs and hobbyists, while Apple
                    implicitly trusts itself.
                    
                    Also, "checking the status of app attestation" is the wrong
                    approach. If you want to use app attestation that way, then
                    you should sign/encrypt communications (requests and
                    responses) with hardware-backed keys; that way, you can't
                    replay or proxy an attestation result to authorize modified
                    requests.
                    
                    (I believe Apple attestation doesn't directly support
                    encryption itself, only signing, but that is enough to use
                    it as part of a key exchange process with hardware-backed
                    keys - you can sign a public key you're sending to the
                    server, which can verify your signature and then use your
                    public key to encrypt a server-side public key, that then
                    you can decrypt and use to encrypt your future
                    communications to the server, and the server can encrypt
                    its responses with your public key, etc.)
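                    
                    The "sign the requests themselves" idea can be sketched
                    like this (stand-in crypto: an HMAC with a shared secret
                    models a hardware-backed signing key; a real design
                    would use an asymmetric key that never leaves the secure
                    element):

```python
import hashlib
import hmac

# Stand-in for a key held in hardware and never exported (illustration).
HW_BACKED_KEY = b"device-secret-that-never-leaves-hardware"


def sign_request(payload: bytes) -> bytes:
    return hmac.new(HW_BACKED_KEY, payload, hashlib.sha256).digest()


def server_accepts(payload: bytes, signature: bytes) -> bool:
    expected = hmac.new(HW_BACKED_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


original = b'{"transfer": 10}'
sig = sign_request(original)
assert server_accepts(original, sig)
# Replaying or proxying the signature cannot authorize a modified request:
assert not server_accepts(b'{"transfer": 9999}', sig)
```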
       
                bigyabai wrote 11 hours 27 min ago:
                Do you actually, bottom-of-your-heart believe that ordinary
                consumers think like this? They use TikTok and WhatsApp and
                Facebook and the Wal-Mart coupon app as a product of deep
                consideration on the web of trust they're building?
                
                Users don't have a choice, and they don't care. Bitlocker is
                cracked by the feds, iOS and Android devices can get unlocked
                or hacked with commercially-available grey-market exploits.
                Push Notifications are bugged, apparently. Your logic hinges on
                an idyllic philosophy that doesn't even exist in security
                focused communities.
       
                  charcircuit wrote 10 hours 39 min ago:
                  Yes, I do believe from the bottom of my heart the users trust
                  the operating systems they use. Apple and Google have done a
                  great job at security and privacy which is why it seems like
                  users don't care. It's like asking why you have a system
                  administrator if the servers are never down. When things are
                  run well the average person seems ignorant of the problems.
       
                    michaelmrose wrote 10 hours 0 min ago:
                    They used Windows XP when it was a security nightmare and
                    many used it long after EOL. I just talked to someone who's
                    had 4 bank cards compromised in as many months who is
                    almost certainly doing something wrong.
       
                      charcircuit wrote 9 hours 56 min ago:
                      I'm talking about people's feelings. People can feel like
                      a Masterlock padlock is secure even if it may be trivial
                      to get past.
       
                    wolvoleo wrote 10 hours 20 min ago:
                    Google certainly hasn't done a great job on privacy.
                    Android devices leak so much information. [1] [2] About
                    Apple I just don't know enough because I haven't seriously
                    used them for years
                    
 (HTM)              [1]: https://arstechnica.com/information-technology/202...
 (HTM)              [2]: https://peabee.substack.com/p/everyone-knows-what-...
       
                      charcircuit wrote 10 hours 14 min ago:
                      Yet, in the big picture Google is doing a good enough job
                      that those information leaks have not caused them harm.
                      When you really zoom in you can find some issues, but the
                      real world impact of them is not big enough to influence
                      most consumers.
       
                        fc417fc802 wrote 9 hours 26 min ago:
                        What sort of hypothetical harm are you imagining here?
                        Suppose the information leaks were a serious issue to
                        me - what are my options? Switch to Apple? I doubt most
                        consumers are going to consider something like
                        postmarketos.
                        
                        The carriers in the US were caught selling e911
                        location data to pretty much whoever was willing to
                        pay. Did that hurt them? Not as far as I can tell,
                        largely because there is no alternative and (bizarrely)
                        such behavior isn't considered by our current
                        legislation to be a criminal act. Consumers are forced
                        to accept that they are simply along for the ride.
       
                          charcircuit wrote 9 hours 9 min ago:
                          Let's say that Google let anyone visit
                          google.com/photos?u=username to see all of the images
                          from their camera roll and left this online not
                          caring about the privacy implications.
                          
                          People would stop taking photos with their camera
                          that they didn't want to be public.
       
                            fc417fc802 wrote 9 hours 1 min ago:
                            People would presumably switch away from gcam and
                            the associated gallery app. Or they would simply
                            remove their google account from the phone. They
                            have realistic options in that case (albeit
                            somewhat downgraded in most cases).
                            
                            If Google did something egregious enough
                            legislation might actually get passed because
                            realistically, if public outcry doesn't convince
                            them to change direction, what other option is
                            available? At present it's that or switch to the
                            only other major player in town.
       
                    bigyabai wrote 10 hours 29 min ago:
                    > which is why it seems like users don't care.
                    
                    ...and not because, in truth, they don't care?
                    
                    How would we even know if people distrusted a company like
                    Microsoft or Meta? Both companies are so deeply-entrenched
                    that you can't avoid them no matter how you feel about
                    their privacy stance. The same goes for Apple and Google,
                    there is no "greener grass" alternative to protest the
                    surveillance of Push Notifications or vulnerability to
                    Pegasus malware.
       
                      charcircuit wrote 10 hours 24 min ago:
                      They would stop using them, or reduce what kinds of
                      things they do on them, if they didn't trust them. No one
                      is forcing you to document your life on these platforms.
       
                        bigyabai wrote 8 hours 39 min ago:
                        > They would stop using them
                        
                        Would they? Nobody that I know would.
       
                mzajc wrote 13 hours 5 min ago:
                > If the user doesn't trust an operating system, why would they
                use it?
                
                Because in the case of smartphones, there is realistically no
                other option.
                
                > For example if they don't trust it, they may avoid logging in
                to their bank on it.
                
                Except when the bank trusts the system that I don't (smartphone
                with Google Services or equivalent Apple junk installed), and
                doesn't trust the system that I do (desktop computer or
                degoogled smartphone), which is a very common scenario.
       
            wasmainiac wrote 15 hours 1 min ago:
            I’d like to think I’m buying the device, not a seat to use the
            device, at least if I do not want to use their software.
       
              Muromec wrote 14 hours 52 min ago:
              You can't have that with phones. You are always at the mercy of
              the hardware supplier and their trusted boot chain, which starts
              with the actual phone processor (the one running the GSM stuff,
              not the user interface stuff). That one is always locked down
              and decides whether to boot your fancy Android stuff.
              
              The fact that it's locked down and remotely killable is a feature
              that people pay for and regulators enforce from their side too.
              
              At the very best, the supplier plays nice and allows you to run
              your own applications, remove whatever crap they preinstalled,
              and change the font face. If you are really lucky, you can
              choose to run a practically useless Linux distribution instead
              of a practically useful Linux distribution, with their
              blessing. A blessing is a transient thing that can be revoked
              at any time.
       
                direwolf20 wrote 12 hours 43 min ago:
                The GSM processor is often a separate chip. You may have read
                an article about the super spooky NSA backdoor processor that
                really controls your phone, but it's just a GSM processor.
                Connecting via PCIe may allow it to compromise the application
                processor if compromised itself, but so can a broadcom WiFi
                chip.
       
                rvba wrote 13 hours 44 min ago:
                Of course you can have that.
                
                The governments can ban this feature and ban companies from
                selling devices with that.
       
                RobotToaster wrote 13 hours 51 min ago:
                > You can't have that with phones.
                
                Why not?
                
                Obviously we don't have that.  But what stops an open firmware
                (or even open hardware) GSM modem being built?
       
                  fragmede wrote 13 hours 42 min ago:
                  There are some open firmware, or partially open firmware
                  projects, but they're more proof-of-concepts and not
                  popular/widely-used. The problem is the FCC or corresponding
                  local organization requires cell phones get regulatory
                  approval, and open firmware (where anybody could just
                  download the source and modify a couple of numbers to
                  violate regulations) doesn't jibe with that.
                  
 (HTM)            [1]: https://hackaday.com/2022/07/12/open-firmware-for-pi...
       
                the8472 wrote 14 hours 10 min ago:
                Not true on the pinephone, the modem is a peripheral module, so
                the boot chain does not start with it.
       
                  userbinator wrote 11 hours 22 min ago:
                  Nor the Mediatek platforms as far as I know (very familiar
                  with the MT65xx and MT67xx series; not sure about anything
                  newer or older, except MT62xx which also boots --- from NOR
                  flash --- the AP first.)
       
            pdpi wrote 15 hours 2 min ago:
            A discussion you don't see nearly enough of is that there is a
            fundamental tradeoff with hardware security features — every
            feature that you can use to secure your device can also be used by
            an adversary to keep control once they compromise you.
       
              digiown wrote 12 hours 25 min ago:
              In this case, the "adversary" evaluates to the manufacturer, and
              "once they compromise you" evaluates to "already". This is the
              case with most smartphones and similar devices that treat the
              user as a guest rather than the owner.
              
              See also:
              
 (HTM)        [1]: https://github.com/zenfyrdev/bootloader-unlock-wall-of-s...
       
              izacus wrote 14 hours 40 min ago:
              Not only can, but inevitably is. Security folks - especially in
              mobile - are commonly useful idiots for introducing measures
              which are practically immediately co-opted to take away users'
              ability to control their device and modify it to serve them
              better. Every single time.
              
              We just had the Google side loading article here.
       
              Muromec wrote 14 hours 48 min ago:
              Fair enough, but so does your front door. Neither thing is
              smart enough to judge the legitimacy of ownership transitions.
       
                pdpi wrote 14 hours 39 min ago:
                Yeah, not disagreeing with you. It's just that, every time we
                have this discussion, we see comments like GP's rebutted by
                comments like yours, and vice versa.
                
                All I'm saying is that we have to acknowledge that both are
                true. And, if both are true, we need to have a serious
                conversation about who gets to choose the core used in our
                front door locks.
       
            piskov wrote 15 hours 35 min ago:
            > It's not even new, it was still there on p2k motorollas 25 years
            ago.
            
            I’m sure CIA was not founded after covid :-)
       
              obnauticus wrote 15 hours 27 min ago:
              Uhh…Wut?
       
                piskov wrote 4 hours 8 min ago:
                Let me remind you of the gist of the parent comment:
                
                > So that’s how in an event of war US adversaries will be
                relieved of their devices
       
          rwmj wrote 15 hours 38 min ago:
          There's so many ways to do this, but a simpler method is to hide a
          small logic block (somewhere in the 10 billion transistors of your
          CPU) that detects a specific, long sequence of bits and invokes the
          kill switch.
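          
          Such a detector is essentially a shift register compared against a
          magic value on every incoming bit; a toy software model (the
          pattern is invented here):

```python
# Toy model of a hidden kill-switch trigger: a 64-bit shift register is
# compared against a magic pattern as each bit arrives.
KILL_PATTERN = 0xDEADBEEFCAFEF00D  # invented for illustration
MASK = (1 << 64) - 1


def watch(bits):
    """Return the bit index at which the trigger fires, or None."""
    shift = 0
    for i, b in enumerate(bits):
        shift = ((shift << 1) | b) & MASK
        if shift == KILL_PATTERN:
            return i
    return None


trigger = [int(c) for c in format(KILL_PATTERN, "064b")]
assert watch([0, 0, 0] + trigger) == 66   # fires on the pattern's last bit
assert watch([0] * 1000) is None          # never fires otherwise
```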
       
        mycall wrote 15 hours 51 min ago:
        How hard is it to fix a fuse with a microscope and a steady hand?
       
          QuiEgo wrote 10 hours 49 min ago:
          Very hard. FIB is the only known way to do this but even then, that's
          the type of thing where you start with a pile of SoCs and expect to
          maybe get lucky with one in a hundred. A FIB machine is also millions
          of dollars.
       
          userbinator wrote 11 hours 12 min ago:
          You'll need at least an electron microscope... but defeating MCU
          readout protection using a FIB is actually a thing: [1] Costs are
          what you'd expect for something of this nature.
          
 (HTM)    [1]: https://www.eag.com/services/engineering/fib-circuit-edit-de...
       
        1a527dd5 wrote 15 hours 55 min ago:
        I look forward to the 1hr+ rant from Louis Rossmann.
       
          poizan42 wrote 15 hours 36 min ago:
          He has already made the video on this, but it is only 3:23:
          
 (HTM)    [1]: https://youtu.be/3AiRB5mvEsk?si=XapAHhHRJtssDI4F
       
        userbinator wrote 15 hours 58 min ago:
        I'm not sure if this is the case anymore, but many unbranded/generic
        Androids used to be completely unlocked by default (especially Mediatek
        SoCs) and nearly unbrickable, and that's what let the modding scene
        flourish. I believe they had efuses too, but software never used them.
       
        pengaru wrote 16 hours 0 min ago:
        Glad I didn't give these people any of my hard earned dollars.
       
        jijji wrote 16 hours 4 min ago:
        I'm sure that's not going to improve their sales numbers.
       
        RugnirViking wrote 16 hours 12 min ago:
        Isn't this just like... vandalism? Nothing could give them the right
        to do this; they're damaging others' property indiscriminately.
       
        skeledrew wrote 16 hours 14 min ago:
        This is absolutely cracked. I've been with OnePlus since the One, also
        getting the 2, 6 and now I have the 12. Stuck with them all these years
        because I really respected their - original - take on device freedom. I
        really should've seen the writing on the wall given how much pain it is
        to update it in the first place, as I have the NA version which only
        officially allows carrier updates, and I don't live in NA (and even if
        I did I'd still not be tied to a carrier).
        
        Now I have to consider my device dead re updates, because if I haven't
        already gotten the killing update I'd rather avoid it. First thing I
        did was unlock the bootloader, and I intend to root/flash it at some
        point. Will be finding another brand whenever I'm ready to upgrade
        again.
       
          dataflow wrote 15 hours 59 min ago:
          This wasn't their only pain point. [1] Just get off OnePlus, you'll
          be happier.
          
 (HTM)    [1]: https://dontkillmyapp.com/oneplus
       
            literallywho wrote 9 hours 6 min ago:
            Fascinating. I've had a OnePlus 6 from 2018 until 2023 (all on
            stock software) and I've not had or noticed any issues like that.
       
            BeetleB wrote 15 hours 41 min ago:
            What are good alternatives that aren't Pixel?
       
              palata wrote 15 hours 38 min ago:
              For now, Pixels. I'm waiting to see what non-Pixel phone will be
              supported by GrapheneOS next, but this may take a while.
       
                wolvoleo wrote 14 hours 7 min ago:
                Yeah I'm surprised that they announced it but not the vendor
                name. I'm sure Google with their infinite resources already
                know which vendor it is. So who are they hiding it from?
       
        mystraline wrote 16 hours 15 min ago:
        It's high time we start challenging these sorts of actions as the
        "vandalization and sabotage at scale" that these attacks really are.
        I don't see how this isn't a direct violation of the CFAA, across
        millions of units of customer-owned hardware.
        
        They are no different than some shit ransomware, except there is no
        demand for money. However, there is a demonstrable proof of degradation
        and destruction of property in all these choices.
        
        Frankly, criminal AND civil penalties should be levied. Criminally,
        the C-levels and boards of directors should all be in scope for
        encouraging/allowing/requiring this behavior. The RICO Act as well,
        since this smells like a criminal conspiracy. Let them spend time in
        prison for mass destruction of property.
        
        Civilly, start dissolving assets until the people are made whole
        with unbroken (and un-destroyed) hardware.
        
        The next shitty silly-con valley company that thinks about running
        this scam of 'customer-bought but forever company-owned' will think
        long and hard about the choices of their network and cloud.
       
          skeledrew wrote 16 hours 1 min ago:
          > no demand for money
          
          There is when the device becomes hard bricked and triggers an
          unnecessary need for a new one.
       
        charcircuit wrote 16 hours 25 min ago:
        This is industry standard. Flashing old, insecure updates to bypass
        security is a legitimate attack vector that needs to be defended
        against. Ideally it would still be possible to recover from such a
        scenario by flashing the latest update.
       
          digiown wrote 12 hours 21 min ago:
          Standard?? The standard is for the upgrade to be refused or not boot
          until you flash a newer one, not to brick the phone permanently. Not
          intentionally bricking a device you bought and paid for isn't an
          "ideally" thing for the manufacturer; it's a baseline expectation.
       
            charcircuit wrote 10 hours 18 min ago:
            > and you may damage your device permanently [1]
            
            They make it clear that this feature is unsupported and that it's
            possible to mess things up. The reason it's an ideal and not an
            expectation is that flashing alternate operating systems is done
            at one's own risk and is unsupported. They have already told users
            that they bear no responsibility for what may go wrong if the
            wrong thing is flashed onto the device. Flashing an incompatible
            operating system requires care, and proper compatibility checks
            were not done before flashing.
            
 (HTM)      [1]: https://service.oneplus.com/us/search/search-detail?id=op5...
       
          orbital-decay wrote 13 hours 24 min ago:
          What's being attacked in this particular case?
       
            charcircuit wrote 12 hours 42 min ago:
            The phone. It's the same attacks that secure boot tries to protect
            against. The issue is that these old, vulnerable versions have a
            valid signature allowing them to be installed.
       
        zozbot234 wrote 16 hours 33 min ago:
        According to OP this does not disable bootloader unlocking in itself. 
        It makes the up-versioned devices incompatible with all previous custom
        ROMs, but it should be possible to develop new ROM releases that are
        fully compatible with current eFuse states and don't blow the eFuse
        themselves.
       
          pseudohadamard wrote 9 hours 20 min ago:
          I wonder, is there currently unpublished 0day on the SoC and they're
          forcing use of the latest firmware to ensure they're not vulnerable
          once the details become public?  That would be a reason for suddenly
          introducing this without explanation.
       
          palata wrote 15 hours 40 min ago:
          I understand that there is a nuance somewhere, but that's about it.
          
          Can you explain it in simpler terms such that an idiot like me can
          understand? Like what would an alternative OS have to do to be
          compatible with the "current eFuse states"?
       
            Muromec wrote 15 hours 28 min ago:
            People need to re-sign their releases and include the newer version
            of bootloader, more or less.
       
              zozbot234 wrote 14 hours 51 min ago:
              Yes, though noting that since the antirollback is apparently
              implemented by the bootloader itself on this Qualcomm SoC, this
              will blow the fuse on devices where the new version is installed,
              so the unofficial EDL-mode tools that the community seems to be
              most concerned about will still be unavailable, and users will
              still be unable to downgrade from the newer to older custom ROM
              builds.
       
                fc417fc802 wrote 7 hours 1 min ago:
                > unofficial EDL-mode tools
                
                The linked page seems to indicate that the EDL image is also
                vendor signed. Wouldn't that mean they're official?
                
                 Unless I've misunderstood, the EDL image is tied to the same
                 set of fuses as the XBL image, so it's only useful for
                 recovery if the fuses don't get updated. That seems like an
                 outlandish design choice to me, because it means flashing a
                 new XBL leaves you without the fallback tooling (hence the
                 reports of people forced to replace the motherboard). It also
                 means that if anything is wrong with the new XBL that doesn't
                 manifest until after the stage where it blows the fuses, the
                 vendor will have irreversibly bricked their own devices via
                 an only slightly broken update.
       
                  zozbot234 wrote 6 hours 49 min ago:
                  EDL itself is a huge hack anyway, so who knows.  The
                  underlying issue is that the OS suppliers are forced to
                  bundle what is effectively the equivalent of a BIOS
                  (low-level firmware) with their image (because of the
                  underlying assumption that this is an embedded system where
                  there are no third-party OS suppliers), and the "BIOS" update
                  has to be made a one-way street when the older firmware has
                  vulnerabilities.  Newer EDL tools ought to become available
                  but they might not have the exact same capabilities as the
                  older ones, though they'll most likely be usable for basic
                  recovery.
       
                Muromec wrote 14 hours 44 min ago:
                 Not being able to downgrade or use the debug tools was the
                 exact point of doing this thing, as far as I understand.
       
        hypeatei wrote 16 hours 38 min ago:
        It's my first time hearing about this "eFuse" functionality in Qualcomm
        CPUs. Are there non-dystopian uses for this as a manufacturer?
       
          QuiEgo wrote 10 hours 43 min ago:
          Almost every modern SoC has efuse memory. For example, this is used
          for yield management - the SoC will have extra blocks of RAM and
          expect some % to be dead. At manufacturing time they will blow fuses
          to say which RAM cells tested bad.
       
          josephcsible wrote 14 hours 42 min ago:
           There are not. The entire premise of eFuses is that after you buy
           something, the manufacturer can still make changes that you can't
           ever undo.
       
          thesh4d0w wrote 16 hours 29 min ago:
           I use them in an ESP32 to write a random password to each of my
           products, so when I sell them they can each have their own secure
           default Wi-Fi password while all using the same firmware.
       
            josephcsible wrote 14 hours 41 min ago:
            What advantage do you see from using eFuses and not some other way
            to store the password?
       
              thesh4d0w wrote 14 hours 37 min ago:
              This is the only way I could come up with that would allow an end
              user to do a full factory reset, and end up back in a known good
              secure state afterwards.
              
               Storing it in the firmware would mean every user has the same
               key. Storing it in EEPROM means a factory reset will clear it.
               This allows me to ship hardware with the default key on a
               sticker on the side, and lets a non-technical user reset it
               back to that if they need to.
               
               It gives you a 256-bit block to work with -
              
 (HTM)        [1]: https://docs.espressif.com/projects/esp-idf/en/stable/es...
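
               The write-once property is what makes this work. Below is a toy
               model of the scheme described above (illustrative names only,
               not the esp-idf eFuse API): a key burned into one-time
               -programmable storage survives the factory reset that wipes
               user-writable storage.

```python
# Toy model of the scheme described above: a per-device key burned
# into write-once eFuse storage survives the factory reset that wipes
# EEPROM/NVS. Names are illustrative, not the real esp-idf API.
import secrets

class Device:
    def __init__(self):
        self.efuse_key = None   # one-time-programmable, burned at the factory
        self.nvs = {}           # user-writable storage, wiped on reset

    def factory_provision(self):
        if self.efuse_key is not None:
            raise RuntimeError("eFuse block already burned")  # write-once
        self.efuse_key = secrets.token_hex(32)  # 256-bit per-device key
        return self.efuse_key   # this is what goes on the sticker

    def set_user_password(self, pw):
        self.nvs["wifi_pw"] = pw

    def effective_password(self):
        # user-configured password if set, else the fused factory default
        return self.nvs.get("wifi_pw", self.efuse_key)

    def factory_reset(self):
        self.nvs.clear()        # eFuse contents are untouched
```

               After factory_reset(), effective_password() falls back to the
               fused key, so the sticker on the side is always a way back to a
               known-good state.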
       
                josephcsible wrote 13 hours 46 min ago:
                But couldn't you also just set aside a bit of the EEPROM your
                factory reset skips, and accomplish the same thing?
       
          hexagonwin wrote 16 hours 33 min ago:
           Samsung uses this for their Knox security feature. The fuse gets
           blown on initial bootloader unlock, and all features related to Knox
           (Samsung Pay, Secure Folder, etc.) get disabled permanently, even
           after reverting to stock firmware.
       
          Retr0id wrote 16 hours 35 min ago:
          eFuses are in most CPUs, often used for things like disabling
          hardware debug interfaces in production devices - and rollback
          prevention.
       
        jacquesm wrote 16 hours 39 min ago:
        This goes beyond the 'right to repair' to simply the right of
        ownership. These remote updates prove again and again that even though
        you paid for something you don't actually own it.
       
          veunes wrote 3 hours 29 min ago:
          When a remote update can irreversibly change hardware state,
          ownership becomes conditional
       
          bloomingeek wrote 15 hours 59 min ago:
           It's basically the same for our automobiles; just try to disable the
           "phone home" parts connected to the fin on the roof. Do we really own
           our cars if we can't stop the manufacturer from telling us we need to
           change our oil through email?
       
            reaperducer wrote 15 hours 50 min ago:
            Buy a Volvo. Then you can pop out the SIM card to disable the car's
            cellular communication. (On mine, located behind the mirror.)
            
            When you really need it, like to download maps into the satnav, you
            can connect it to your home WiFi, or tether via Bluetooth.
       
              fragmede wrote 14 hours 20 min ago:
               A phone without a SIM can still be used to call emergency
               services (911/999/0118999 8819991197253). The situation we're
               discussing, though, is an attack by an extreme APT. You really
               think not having the SIM card is going to do anything? If the
               cell phone hardware is powered up, it's available. All the APT
               has to do is put their code into the baseband at some point,
               maybe at the Volvo factory when the car was programmed, and get
               the cooperation of a cell tower, or use a Stingray to report
               where the car is when in range.
       
                jacquesm wrote 6 hours 15 min ago:
                > 8819991197253
                
                Which sadist decided that that is a good number for an
                emergency call?
       
                  Markoff wrote 3 hours 44 min ago:
                  it's actually 0118 999 881 999 119 7253, you missed the
                  beginning
                  
                  it's from IT Crowd
                  
 (HTM)            [1]: https://www.youtube.com/watch?v=HWc3WY3fuZU
       
                    jacquesm wrote 3 hours 1 min ago:
                    Hehe, that's hilarious, especially the way they add the 3.
       
              wasmainiac wrote 14 hours 53 min ago:
               Hahah, I just traded in a 2023 model (unrelated brand) for a
               2012 model since it was less of a computer. The computer systems
               in the newer car kept having faults that caused sporadic
               electrical issues workshops couldn't fix. I just want my car to
               be a car and nothing else.
       
                AtheistOfFail wrote 13 hours 5 min ago:
                2005 Toyota Corolla.
       
                  jacquesm wrote 12 hours 10 min ago:
                  1997... and that's my last car. No way I'm going to be
                  driving around in a piece of spyware.
       
              g-b-r wrote 15 hours 25 min ago:
              Chinese-owned Volvo?
              
              OnePlus and other Chinese brands were modders-friendly until they
              suddenly weren't, I wouldn't rely on your car not getting more
              hostile at a certain point
       
                reaperducer wrote 11 hours 0 min ago:
                Chinese-owned Volvo?
                
                Shhh. Nobody tell him where his phone, computer, and vast
                majority of everything else in his house was made.
       
                daemin wrote 12 hours 7 min ago:
                 There was a video by MKBHD where he said that every new phone
                 manufacturer starts off being the hero, doing something
                 different and consumer/user-friendly, before growth and
                 competition turn them into just another mass-market phone
                 manufacturer. Realistically this is because they wouldn't be
                 able to survive without making and selling mass-market phones.
                 This already happened to OnePlus half a decade ago when they
                 merged with Oppo, and it arguably happened to ASUS as well
                 when they cancelled their small-form-factor phone a couple of
                 years ago.
       
              Tarball10 wrote 15 hours 37 min ago:
              Until they switch to eSIM...
       
                blibble wrote 15 hours 29 min ago:
                cut the antenna
       
                  jeroenhd wrote 15 hours 24 min ago:
                   ... and get a check-engine light and fault code for the
                   built-in emergency SOS feature, thereby making it unable to
                   pass vehicle inspection until you fix the antenna
       
                    cmxch wrote 11 hours 40 min ago:
                    Live in an inspection free state.
       
                    0xbadcafebee wrote 15 hours 17 min ago:
                     so either 1) disconnect it most of the time and reconnect
                     it for inspections, or 2) buy a dummy-load RF terminator
                     matching the impedance of your antenna
       
          mystraline wrote 16 hours 11 min ago:
          Indeed.
          
          My ownership is proved by my receipt from the store I bought it from.
          
           This vandalization at scale is a CFAA violation. I'd also argue it
           is a fraudulent sale, since not all rights were transferred at sale,
           and an indefinite rental was misrepresented as a sale.
           
           And it's likely a RICO Act violation, since the C-levels and BOD
           likely knew and/or ordered it.
          
          And damn near everything's wire fraud.
          
          But if anybody does manage to take them to court and win, what would
          we see? A $10 voucher for the next Oneplus phone? Like we'd buy
          another.
       
            dataflow wrote 15 hours 55 min ago:
            As far as legal arguments go, I imagine their first counter would
            be that you agreed to the update, so it's on you.
       
              mystraline wrote 15 hours 42 min ago:
               A forced update, or a continual loop of "yes" or "later", is
               not consent. The fact that there is no "No" option shows that.
               
               Fabricated or fake consent, or worse, forced automatic updates,
               indicates that the company is the owner and is exerting
               ownership-level control. Thus what was conducted as a sale is
               really an indefinite rental.
       
                ndriscoll wrote 12 hours 29 min ago:
                 It is not an indefinite rental. A sale can't be
                 "misrepresented". It is a blatant CFAA violation. They are
                 accessing your computer, modifying its configuration, and
                 exfiltrating your private data without your authorization.
                
                If I buy a used vehicle for example, I have exactly zero
                relationship with the manufacturer. I never agree to anything
                at all with them. I turn the car on and it goes. They do not
                have any authorization to touch anything.
                
                We shouldn't confuse what's happening here. The engineers
                working on these systems that access people's computers without
                authorization should absolutely be in prison right alongside
                the executives that allowed or pushed for it. They know exactly
                what they're doing.
       
                  inkyoto wrote 11 hours 12 min ago:
                  > If I buy a used vehicle for example, I have exactly zero
                  relationship with the manufacturer. I never agree to anything
                  at all with them. I turn the car on and it goes. They do not
                  have any authorization to touch anything.
                  
                  Generally speaking and most of the time, yes; however, there
                  are a few caveats. The following uses common law – to
                  narrow the scope of the discussion down.
                  
                  As a matter of property, the second-hand purchaser owns the
                  chattel. The manufacturer has no general residual right(s) to
                  «touch» the car merely because it made it. Common law sets
                  a high bar against unauthorised interference.
                  
                  The manufacturer still owes duties to foreseeable users – a
                  law-imposed duty relationship in tort (and often statute)
                  concerning safety, defects, warnings, and misrepresentations.
                   This is a unidirectional relationship – from the
                   manufacturer to the car owner – and covers product safety,
                   recalls, negligence (on the manufacturer's part) and the
                   like, irrespective of whether it was a first- or second-hand
                   purchase.
                  
                  One caveat is that if the purchased second-hand car has the
                  residual warranty period left, and the second-hand buyer
                  desires that the warranty be transferred to them, a
                  time-limited, owner-to-manufacturer relationship will exist.
                  The buyer, of course, has no obligation to accept the
                  warranty transfer, and they may choose to forgo the remaining
                  warranty.
                  
                  The second caveat is that manufacturers have tried
                  (successfully or not – depends on the jurisdiction) to
                  assert that the buyer (first- or second-hand) owns the
                  hardware (the rust bucket), and users (the owners) receive a
                  licence to use the software – and not infrequently with
                  strings attached (conditions, restrictions, updates and
                  account terms).
                  
                   Under common law, however, even if a software licence
                   exists, the manufacturer does not automatically get a
                   free-standing right to remotely alter the vehicle whenever
                   they wish. Any such right has to come from a valid
                   contractual arrangement, a statutory power, or consent;
                   privity still applies and requires consent, all of which
                   weakens the manufacturer's legal standing.
                  
                   Lastly, depending on the jurisdiction, the manufacturer can
                   even be sued for installing an OTA update, on the basis of
                   the car being a computer on wheels and the OTA update being
                   an event of unauthorised access to the computer and its
                   data, which is oftentimes a criminal offence. This hinges on
                   the fact that the second-hand buyer has not entered into a
                   consensual relationship with the manufacturer after the
                   purchase.
                   
                   A bit of a lengthy write-up, but legal stuff is always a
                   fuster cluck and a rabbit hole of nitpicking and nuances.
       
                    dataflow wrote 6 hours 31 min ago:
                    I don't really understand the legal arguments here:
                    
                     > the manufacturer can even be sued [...] This hinges on
                     the fact that the second-hand buyer has not entered into a
                     consensual relationship with the manufacturer after the
                     purchase.
                    
                    Wait, but the first owner (presumably, for the sake of
                    argument) agreed to this. Why isn't it the first owner's
                    fault for not disclosing it to the second owner? Shouldn't
                    they be sued instead? How is a manufacturer held
                    responsible for an agreement between parties that they
                    could not possibly be expected to have knowledge of?
       
                      inkyoto wrote 21 min ago:
                      Because common law is not a general «duty to disclose
                      everything» bludgeon for ordinary used-goods sales, and
                      the «why not sue the first owner» argument can only
                      work in narrow fact patterns.
                      
                      For example, if the first owner actively misrepresented
                      the position (for example, they said «no remote access,
                      no subscriptions, no tracking» when they knew the
                      opposite), the second owner might have a
                      misrepresentation claim against the first owner. But that
                      is pretty much where the buck stops.
                      
                      > «How can a manufacturer be liable for an agreement it
                      cannot know about?».
                      
                      That is not the right framing. The manufacturer is not
                      being held liable for «an agreement between the first
                      owner and the second owner». The manufacturer is being
                      held liable for its own conduct (access/modification by
                      virtue of an OTA update) without authorisation from the
                      _current_ rights-holder because liability follows the
                      actor.
                      
                      It happens because, under common law, 1) the first
                      owner’s consent does not automatically bind the second
                      owner, 2) consent does not normally run with the asset,
                      and 3) a «new contract with the second owner» does not
                      arise automatically on resale. It arises only if the
                      second owner consciously assents to manufacturer terms
                      (or if a statute creates obligations regardless of
                      assent).
                      
                      So the manufacturer is responsible because it is the
                      party _acting_. If the manufacturer accesses/modifies
                      without a valid basis extending to the current owner or
                      user, it owns that risk.
                      
                      I am not saying that «every unwanted OTA update is a
                      crime». All I am saying is that the legal system has a
                      concept of «unauthorised modification/access», and the
                      contention is over whether the access or modification was
                      authorised or not.
       
                    jacquesm wrote 8 hours 51 min ago:
                     This is the kind of nitpicking that I love to see on HN.
                     It establishes the boundaries of the relationship between
                     manufacturers and owners and tries to lay bare the need
                     for (informed) consent and what the legal basis for that
                     is.
       
            amelius wrote 15 hours 57 min ago:
            Their defense would probably be like: "you clicked Yes on the EULA
            form."
       
        tripdout wrote 16 hours 39 min ago:
        > When the device powers on, the Primary Boot Loader in the processor's
        ROM loads and verifies the eXtensible Boot Loader (XBL). XBL reads the
        current anti-rollback version from the Qfprom fuses and compares it
        against the firmware's embedded version number. If the firmware version
        is lower than the fuse value, boot is rejected. When newer firmware
        successfully boots, the bootloader issues commands through Qualcomm's
        TrustZone to blow additional fuses, permanently recording the new
        minimum version
        
        What exactly is it comparing? What is the “firmware embedded version
        number”? With an unlocked bootloader you can flash boot and super
        (system, vendor, etc) partitions, but I must be missing something
        because it seems like this would be bypassable.
        
        It does say
        
        > Custom ROMs package firmware components from the stock firmware they
        were built against. If a user's device has been updated to a fused
        firmware version & they flash a custom ROM built against older
        firmware, the anti-rollback mechanism triggers immediately.
        
        and I know custom ROMs will often say “make sure you flash stock
        version x.y beforehand” to ensure you’re on the right firmware, but
        I’m not sure what partitions that actually refers to (and it’s not
        the same as vendor blobs), or how much work it is to either build a
        custom ROM against a newer firmware or patch the (hundreds of) vendor
        blobs.
       
          ARob109 wrote 15 hours 18 min ago:
           Firmware (XBL and other non-OS components) is versioned with
           anti-rollback values. If the version is less than the version
           burned into the fuses, the firmware is rejected. The "boot"
           partition is typically the Linux kernel. Android Verified Boot
           loads and hashes the kernel image and compares it to the expected
           hash in the vbmeta partition. The signature over the vbmeta
           metadata is verified against a public key coded into the secondary
           boot loader, typically abl (which handled fastboot before
           fastbootd moved it to user space to support super partitions).
          
          The abl firmware contains an anti rollback version that is checked
          with the eFuse version.
          
           The super partition is a bunch of LVM-style logical partitions on
           top of a single physical partition. One of these is the main root
           filesystem, which is mounted read-only and protected with dm-verity
           device mapping. The root hash of this verity rootfs is also stored
           in the signed vbmeta.
          
          Android Verified Boot also has an anti rollback feature. The vbmeta
          partition is versioned and the minimum version value is stored
          cryptographically in a special flash partition called the Replay
          Protected Memory Block (rpmb). This prevents rollback of boot and
          super as vbmeta itself cannot be rolled back.
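           
           The vbmeta rollback check described above is, conceptually, a
           monotonic-counter comparison. A toy sketch follows (illustrative
           names; real AVB keeps the stored index in RPMB behind the TEE):

```python
# Toy model of AVB rollback protection as described above: the signed
# vbmeta carries a rollback index, and the device keeps the highest
# index accepted so far in tamper-resistant storage (RPMB on hardware).

class Rpmb:
    """Stand-in for the Replay Protected Memory Block."""
    def __init__(self):
        self.stored_index = 0

    def update(self, index):
        # Only ever moves forward; RPMB authentication prevents resets.
        self.stored_index = max(self.stored_index, index)

def avb_verify(rpmb, vbmeta_index, signature_ok):
    if not signature_ok:
        return False                  # vbmeta signature must verify first
    if vbmeta_index < rpmb.stored_index:
        return False                  # older vbmeta: rollback rejected
    rpmb.update(vbmeta_index)         # remember the newest accepted index
    return True
```

           Because boot and super are hashed into vbmeta, preventing vbmeta
           itself from rolling back is enough to prevent rolling back the
           partitions it covers.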
       
          Muromec wrote 15 hours 19 min ago:
          >What exactly is it comparing? What is the “firmware embedded
          version number”? With an unlocked bootloader you can flash boot and
          super (system, vendor, etc) partitions, but I must be missing
          something because it seems like this would be bypassable.
          
           This doesn't make sense unless the secondary boot is signed and
           there is a version somewhere in the signed metadata. The primary
           boot checks the signature, reads the version of the secondary boot,
           and loads it only if that version is not lower than what the
           write-once memory (fuse) requires.
           
           If you can self-sign or disable signature checks, then you can run
           whatever boot you want, as long as its metadata satisfies the
           version check.
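           
           As a sketch, the check described above looks roughly like this
           (simplified, with made-up names; the real logic runs in boot
           ROM/XBL against Qfprom fuses):

```python
# Simplified model of fuse-based anti-rollback as described above.
# Real devices do this in boot ROM / XBL against Qfprom fuses; the
# structure and names here are illustrative only.

class Fuses:
    """One-time-programmable store: bits can only go from 0 to 1."""
    def __init__(self):
        self.bits = 0  # thermometer code: N blown bits = minimum version N

    @property
    def min_version(self):
        return bin(self.bits).count("1")

    def blow_to(self, version):
        # Irreversible: we can only ever set more bits.
        self.bits |= (1 << version) - 1

def try_boot(fuses, fw_version, signature_valid):
    if not signature_valid:
        return False               # reject anything not vendor-signed
    if fw_version < fuses.min_version:
        return False               # anti-rollback: firmware too old
    fuses.blow_to(fw_version)      # permanently record the new minimum
    return True
```

           The one-way nature is the whole story here: once a newer version
           has booted and blown its fuses, any signed-but-older image,
           including a custom ROM built against older firmware, fails the
           check forever.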
       
        bflesch wrote 16 hours 39 min ago:
        How likely is it that such software-activated fuse-based kill switches
        are built into iPhones? Any insights?
       
          QuiEgo wrote 10 hours 47 min ago:
           100%. If you steal a phone from the Apple Store, they just remotely
           brick it.
       
            QuiEgo wrote 10 hours 19 min ago:
            Example:
            
 (HTM)      [1]: https://www.techspot.com/news/108318-stolen-iphones-disabl...
       
          izacus wrote 14 hours 36 min ago:
           Apple has been doing that since forever and will remotely
           kill-switch devices so that they need to be destroyed instead of
           reused: [1] Millions of fully working Apple devices are destroyed
           because of this - Apple won't unlock them even with proof of
           ownership.
          
 (HTM)    [1]: https://fighttorepair.substack.com/p/activation-locks-send-w...
       
          Muromec wrote 15 hours 10 min ago:
          It's there on all phones since forever lol. Apple can ship an update
          that adds "update without asking for confirmation" tomorrow and then
          ship another one that shows nothing but a middle finger on boot and
          you would not be able to do anything, including downgrading back.
       
          mort96 wrote 16 hours 13 min ago:
          So this article isn't about a kill switch, just blocking downgrades
          and custom ROMs.
          
          But to answer your question: we know iPhones have a foolproof kill
          switch, it's a feature. Just mark your device as lost in Find My and
          it'll be locked until someone can provide your login details.
          Assuming it requires logging in to your Apple account (which it does,
          AFAIK; I don't think logging in to a local account is enough), this
          is the same as a remote kill switch; Apple could simply make a device
          enter this locked-down state and then tweak their server systems to
          deny logins.
       
          Retr0id wrote 16 hours 33 min ago:
          The M-series CPUs found in iPads (which cannot boot custom payloads)
          are the same as the M-series CPUs found in Macbooks (which can boot
          custom payloads) - just with different fuses pre-burnt during
          manufacturing.
          
          Pre-prod (etc.) devices will also have different fuses burnt.
       
          hexagonwin wrote 16 hours 34 min ago:
           iPhones already cannot be downgraded; they can only install OS
           versions signed by Apple at install time (search "SHSH blobs").
           They also can't run unsigned IPA files (apps). Not sure if they have
           a physical fuse, but it's not much different.
       
            hoistbypetard wrote 16 hours 13 min ago:
            The significant difference is that if it were placed into DFU mode
            and connected to an appropriate device that had access to
            appropriately signed things, it could be "unbricked" without
            replacing the mainboard.
       
              hexagonwin wrote 15 hours 39 min ago:
               True, but I believe these bricked OnePlus devices can also be
               revived from 9008 (EDL) mode if people can find the Qualcomm
               Firehose loader file.
       
          jacquesm wrote 16 hours 36 min ago:
          I'd say for commercial hardware it is a near certainty even if you
          won't ever know until it is much too late.
          
          Realize that many of these manufacturers sell their hardware in and
          employ companies in highly policed societies. Just the fact that they
          are allowed to continue to operate implies that they are playing ball
           and may well have to perform a couple of favors. And that's assuming
           they are fully aware of what they are shipping, which may not always
           be the case.
          
          I don't think it is a bad model at all to consider any cell phone to
          be compromised in multiple ways even though you don't have hard
          proof.
       
        syntaxing wrote 16 hours 40 min ago:
        OnePlus has pretty much become irrelevant since Carl Pei left the
         company. It's more or less just a rebranded Oppo nowadays. I'm not an
        android user anymore but I'm rooting for his new(ish) Nothing company.
        Hopefully it carries the torch for the old OnePlus feel.
       
          gertrunde wrote 5 hours 52 min ago:
          Yup - and worse than that too.
          
          In the last week or two it's been rumoured that Oppo are pulling the
          plug on OnePlus, and are going to wind up the brand entirely.
          (Although it may cling on in certain markets, like India).
       
          opan wrote 16 hours 5 min ago:
          They consistently have allowed bootloader unlocking without extra
          fuss and have had good LineageOS support. That is their main appeal,
          IMO. Nothing phones had no LineageOS support until recently (spacewar
          is now supported, unsure about other models), and it's not clear if
          there's enough of a community/following to keep putting LineageOS on
          them. I do not want any phone where I'm stuck with the stock ROM.
       
            zozbot234 wrote 15 hours 56 min ago:
            Nothing phones also allow seamless bootloader unlocking, just like
             OnePlus. There have been some rumors that OnePlus might be about
             to exit the market altogether; if so, Nothing will probably
             expand into their niche and beyond their current approach based
             on "unique" design.
       
          skeledrew wrote 16 hours 7 min ago:
           I've been with OnePlus since the beginning, and am not at all
           impressed by Nothing. The primary feature I've come to depend on,
           off-screen gestures, is missing. And the device just comes across
           as foreign in general; it makes me think of the iPhone, which is
           not something I want to think of.
       
          Raed667 wrote 16 hours 30 min ago:
          As an early OnePlus user (1, 3, 5, 7, 13) i find myself unimpressed
          with what Nothing is proposing, feels more like a design exercise
          than a flagship killer
       
        scbzzzzz wrote 16 hours 41 min ago:
         What does OnePlus gain from this? Can someone explain the advantages
         of OnePlus doing all this?
         A failed update resulting in a motherboard replacement? More money,
         happier shareholders?
         
         I still sometimes wonder if the OnePlus green-line fiasco was a
         failed hardware-fuse type thing that got accidentally triggered
         during a software update. (Insert "I can't prove it" meme here.)
       
          jeroenhd wrote 15 hours 33 min ago:
          Their low-level bootloader code contains a vulnerability that allows
          an attacker with physical access to boot an OS of their choice.
          
           Android's normal bootloader unlock procedure allows for doing so, but
           ensures that the data partition (or its encryption keys) is wiped
           so that a border guard at the airport can't just Cellebrite the
           phone open.
          
          Without downgrade protection, the low-level recovery protocol built
          into Qualcomm chips would permit the attacker to load an old,
          vulnerable version of the software, which has been properly signed
          and everything, and still exploit it. By preventing downgrades
          through eFuses, this avenue of attack can be prevented.
          
          This does not actually prevent running custom ROMs, necessarily. This
          does prevent older custom ROMs. Custom ROMs developed with the new
          bootloader/firmware/etc should still boot fine.
          
          This is why the linked article states:
          
          > The community recommendation is that users who have updated should
          not flash any custom ROM until developers explicitly announce support
          for fused devices with the new firmware base.
          
          Once ROM developers update their ROMs, the custom ROM situation
          should be fine again.
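             The check described above can be sketched like this. It is a minimal illustration, not Qualcomm's or OnePlus's actual code: the function name, parameters, and the idea of a single fused integer are all simplifying assumptions.

             ```python
             # Sketch of eFuse-based anti-rollback. A real bootloader reads
             # the fused minimum version from one-time-programmable hardware;
             # here a plain integer stands in for it.

             def verify_boot_image(image_rollback_index: int,
                                   fused_min_version: int,
                                   signature_valid: bool) -> bool:
                 """Boot only images that are both factory-signed and not
                 older than the minimum version burnt into the eFuses."""
                 if not signature_valid:
                     return False  # unsigned/tampered image: always rejected
                 if image_rollback_index < fused_min_version:
                     return False  # properly signed, but older than allowed
                 return True

             # A signed-but-old image is exactly the downgrade-attack case:
             assert verify_boot_image(3, fused_min_version=5,
                                      signature_valid=True) is False
             assert verify_boot_image(5, fused_min_version=5,
                                      signature_valid=True) is True
             ```

             The key design point is that the signature check alone is not enough: an old image is still validly signed, so only the fused version floor stops it.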
       
            Snoozus wrote 10 hours 31 min ago:
            thank you for this, I have a follow up question:
            Now an attacker can not install an old, vulnerable version.
            But couldn't they just install a new, vulnerable version?
            Is there something that enforces encryption key deletion in one
            case and not the other?
       
              jeroenhd wrote 5 hours 0 min ago:
              AFAIK the signature mechanism hasn't been defeated, so the
              attacker can only load software signed by the factory keys.
              
              Which includes old, vulnerable versions and all patched, newer
              versions. By burning in the minimum version, the old code now
              refuses to boot before it can be exploited.
              
              This is standard practice for low-level bootloader attacks
              against things like consoles and some other phone brands.
       
            g947o wrote 13 hours 8 min ago:
            That makes sense, but how would an attacker flash an older version
            of the firmware in the first place? Don't you need developer
            options and unlocking + debugging enabled?
       
              jeroenhd wrote 5 hours 21 min ago:
              Qualcomm phones come with a special mode ( [1] ) that allows
              devices to get unbricked even after you break the normal
              user-updatable "bootloader" on flash completely.
              
              This feature doesn't allow unlocking the bootloader (as in,
              execute a custom ROM), it's designed to install factory-signed
              code. However, using it to "restore" an old, vulnerable factory
              code would obviously cause issues.
              
 (HTM)        [1]: https://en.wikipedia.org/wiki/Qualcomm_EDL_mode
       
              QuiEgo wrote 11 hours 7 min ago:
              Open the case and pogo pin on a flash programmer directly to the
              pins of the flash chip.
              
              Sophisticated actors (think state-level actors like a border
              agent who insists on taking your phone to a back room for
              "inspection" while you wait at customs) can and will develop
              specialized tooling to help them do this very quickly.
       
          drnick1 wrote 16 hours 0 min ago:
           > What does OnePlus gain from this? Can someone explain the
           advantages of OnePlus doing all this?
          
          They don't want the hardware to be under your control. In the mind of
          tech executives, selling hardware does not make enough money, the
          user must stay captive to the stock OS where "software as a service"
          can be sold, and data about the user can be extracted.
       
            zb3 wrote 14 hours 19 min ago:
            Note that Google also forces this indirectly via their
            "certification" - if the device doesn't have unremovable AVB
            (requires qualcomm secure boot fuse to be blown) then it's not even
            allowed to say the device runs Android.. if you see "Android™"
            then it means secure boot is set up and you don't have the keys,
            you can't set up your own, so you don't really own the SoC you paid
            for..
       
              subscribed wrote 11 hours 21 min ago:
               I don't think that's accurate.
               
               Specifically, GrapheneOS on Pixels signs their releases with
               their own keys, and they get rollback protection without
               blowing any fuses.
       
                zb3 wrote 11 hours 12 min ago:
                I was talking about different keys and different fuses. I know
                about "avb_custom_key" (provisioned by GrapheneOS), but all
                this AVB is handled by abl/trustzone and I can't modify those
                because those need to be signed with keys that I don't own.
                
                 I know that all these restrictions might make sense for the
                 average user who wants a secure phone, but I want an
                 insecure-but-fully-hackable one.
       
            jeroenhd wrote 15 hours 31 min ago:
            A bit overdramatic, isn't it? Custom ROMs designed for the new
            firmware revisions still work fine. Only older ROMs with
            potentially vulnerable bootloader code cause bricking risks.
            
            Give ROM developers a few weeks and you can boot your favourite
            custom ROMs again.
       
              ddtaylor wrote 15 hours 5 min ago:
              Not really dramatic IMO. Basically mirrors everything we have
              seen in other industries like gaming consoles, etc. that have
              destroyed ownership over time in favor of "service models"
              instead.
       
                wolvoleo wrote 14 hours 9 min ago:
                And now governments are starting to take advantage of that loss
                of control by demanding surveillance tech like chatcontrol and
                other backdoors.
       
            palata wrote 15 hours 42 min ago:
            > In the mind of tech executives
            
            To be fair, they are right: the vast majority of users don't give a
            damn. Unfortunately I do.
       
              ddtaylor wrote 15 hours 4 min ago:
              Sure if you want to compete against Google or Samsung. Maybe that
              is the plan that one plus has. My understanding was that they
              were going after a different Market of phone users that might
              want a little bit more otherwise why not just go with one of the
              other people that will screw you just as hard for less.
       
          rvnx wrote 16 hours 35 min ago:
           It is the same concept on an iPhone: you have about 7 days to
           downgrade, then it is permanently impossible. Not for technical
           reasons, but because of an arbitrary lock (achieved through
           signatures).
           
           OnePlus just chose the hardware way, versus Apple's signature way.
           
           Whether for OnePlus or Apple, there should definitely be a way to
           let users sign and run the operating system of their choice, like
           any other software.
           
           (Still hating this iOS 26, and the fact that even after losing all
           my data and downgrading back to iOS 18 it refused to re-sync my
           Apple Watch until iOS 26 was installed again. Shitty company
           policy.)
       
            Muromec wrote 15 hours 30 min ago:
            > Not for technical reasons, but because of an arbitrary lock
            (achieved through signature).
            
            There is a good reason to prevent downgrades -- older versions have
            CVEs and some are actually exploitable.
       
              rvnx wrote 3 hours 13 min ago:
              and ? this should prevent you from deciding the level of risk or
              even installing forks of that OS (that can also write fixes, even
              without source-code by patching binaries) ?
       
          TomatoCo wrote 16 hours 35 min ago:
          My understanding is there was a bug that let you wipe and re-enable a
          phone that had been disabled due to theft. This prevents a downgrade
          attack. It's in OnePlus's interest to make their phones less
          appealing for theft, or, in their interest to comply with
          requirements to be disableable from carriers, Google, etc.
       
            HiPhish wrote 15 hours 31 min ago:
            > It's in OnePlus's interest to make their phones less appealing
            for theft,
            
            I don't believe for a second that this benefits phone owners in any
            way. A thief is not going to sit there and do research on your
            phone model before he steals it. He's going to steal whatever he
            can and then figure out what to do with it.
       
              TomatoCo wrote 14 hours 6 min ago:
              Which is why I mentioned that carriers or Google might have that
              as a requirement for partnering with them. iPhones are rarely
              stolen these days because there's no resale market for them (to
              the detriment of third party repairs). It behooves large market
              players, like Google or carriers, to create the same perception
              for Android phones.
              
               Thieves don't do that research on specific models.
               Manufacturers don't like it if their competitors' models are
               easy to hawk on grey markets, because that means their phones
               get stolen, too.
       
              lotu wrote 15 hours 24 min ago:
              Yes thieves do, research on which phones to steal.  Just not
              online more in personal talking with their network of
              lawbreakers.  In short a thief is going to have a fence, and that
              person is going to know all about what phones can and cannot be
              resold.
       
              lxgr wrote 15 hours 26 min ago:
              It actually seems to work pretty well for iPhones.
              
              Thieves these days seem to really be struggling to even use them
              for parts, since these are also largely Apple DRMed, and are
              often resorting to threatening the previous owner to remove the
              activation lock remotely.
              
              Of course theft often isn't preceded by a diligent cost-benefit
              analysis, but once there's a critical mass of unusable – even
              for parts – stolen phones, I believe it can make a difference.
       
            Zigurd wrote 15 hours 59 min ago:
            Carriers can check a registry of stolen phone IMEIs and block them
            from their networks.
       
              okanat wrote 3 hours 14 min ago:
               With vulnerable FW, you can change IMEIs. Hence this kind of
               rollback-prevention update.
       
              segmondy wrote 15 hours 38 min ago:
              right, but the stolen phones get sold in other countries where
              the carriers don't care if the phone was stolen but care that
              someone is spending money on their service.
       
                rvba wrote 13 hours 41 min ago:
                And we cant own our phones due to that?
       
              gsich wrote 15 hours 54 min ago:
              I have never seen this happen.
              
               I have, however, experienced that an ISP will write to you
               because you have a faulty modem (some Huawei device) and ask
               you to not use it anymore.
       
                ddtaylor wrote 15 hours 1 min ago:
                I the lines between IMEI banning or blacklisting and the modern
                unlocking techniques they use have been blurred a little bit
                and so some carriers and some manufacturers don't really want
                to do or spend time doing the IMEI stuff and would prefer to
                just handle it all via their own unlocking and locking
                mechanisms.
       
                TehCorwiz wrote 15 hours 51 min ago:
                Visit eBay and search for "blocked IMEI" or variants. There are
                plenty of used phones which are IMEI locked due to either:
                reported lost, reported stolen, failed to make payments, etc.
       
                  gsich wrote 15 hours 49 min ago:
                  All offers seem to be from the US.
       
              reaperducer wrote 15 hours 54 min ago:
              There is a surprising number of carriers in the world that don't
              care if you're using a stolen phone.
              
              Not surprisingly, stolen phones tend to end up in those
              locations.
       
            wnevets wrote 16 hours 16 min ago:
            > My understanding is there was a bug that let you wipe and
            re-enable a phone that had been disabled due to theft. This
            prevents a downgrade attack.
            
            This makes sense and much less dystopia than some of the other
            commenters are suggesting.
       
              userbinator wrote 15 hours 57 min ago:
              That's even more dystopian.
       
            scbzzzzz wrote 16 hours 25 min ago:
             Makes perfect sense, thanks kind stranger. I hope that is the
             reason and not some corporate greed. That's on me; lately my
             thoughts default to corporations sabotaging consumers. I need to
             work on it.
             
             The effect on the custom-OS community has me worried (I am still
             rocking my OnePlus 7T with crDroid, and OnePlus used to be the
             most geek-friendly). Now I am wondering if there are other ways
             they could have achieved the same without blowing a fuse, or
             been more transparent about this.
       
              itsdesmond wrote 16 hours 3 min ago:
               > That's on me; lately my thoughts default to corporations
               sabotaging consumers. I need to work on it.
               
               You absolutely do not; this is an extremely healthy starting
               position for evaluating a corporation's behavior. Any benefit
               you receive is incidental: if they made more money by
               worsening your experience, they would.
       
              cess11 wrote 16 hours 9 min ago:
              As I understand it, this is a similar thing on Samsung handhelds:
              
 (HTM)        [1]: https://en.wikipedia.org/wiki/Samsung_Knox
       
              zozbot234 wrote 16 hours 21 min ago:
              I don't think so. Blowing a fuse is just how the "no downgrades"
              policy for firmware is implemented. No different for other
              vendors actually, though the software usually warns you prior to
              installing an update that can't be manually rolled back.
       
                chasil wrote 16 hours 9 min ago:
                Are you quite certain?
                
                Google pushed a non-downgradable final update to the Pixel 6a.
                
                 I was able to install Graphene on such a device. Lineage was
                 advertised as completely incompatible, but some hinted it
                 would work.
       
        WaitWaitWha wrote 16 hours 42 min ago:
        Is this for just one or several OnePlus models?
        
         If so, is this 'fuse' pre-planned in the hardware?  My understanding
         is cell phones take 12 to 24 months from design to market. So, the
         initial deployment of the model where this OS can trigger the
         'fuse', less one year, is how far back the company decided to be
         ready to do this?
       
          happycube wrote 14 hours 37 min ago:
           This is in the Qualcomm SoC, so it's not something that has to
           be designed into the phone per se.
       
          Muromec wrote 15 hours 13 min ago:
           Fuses have been in all phones since 25+ years ago, on the real
           phone-CPU side, with trusted boot and shit. Otherwise you could
           change the IMEI left and right, and that's a big no-no. What you
           interact with runs on the secondary CPU (the fancy user interface
           with shiny buttons), but that firmware only starts if the main one
           lets it.
       
            userbinator wrote 11 hours 16 min ago:
             > Otherwise you could change IMEI left and right and it's a big
             no-no.
             
             You can still change the IMEI on many phones if you know how to.
       
          TomatoCo wrote 16 hours 37 min ago:
           Lots of CPUs that have secure enclaves have a section of memory
           that can be written only once. It's generally used for
           cryptographic keys, serials, etcetera, and it's also frequently
           used for anti-rollback counters like this.
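           One-time-programmable fuse banks only let bits flip from 0 to 1, never back, so a monotonic version counter is often stored in unary. A toy model of that behavior (not any vendor's real fuse layout; the class and method names are made up for illustration):

           ```python
           class EfuseBank:
               """Toy model of a one-time-programmable fuse bank: bits can
               be burnt from 0 to 1 but never cleared."""

               def __init__(self, width: int = 32):
                   self.bits = [0] * width

               def burn(self, index: int) -> None:
                   self.bits[index] = 1  # irreversible in real hardware

               def min_version(self) -> int:
                   # Unary encoding: the count of burnt bits is the minimum
                   # firmware version the device will still accept.
                   return sum(self.bits)

           bank = EfuseBank()
           for _ in range(3):              # three anti-rollback updates
               bank.burn(bank.min_version())
           assert bank.min_version() == 3  # versions < 3 now refuse to boot
           ```

           Because clearing a bit is physically impossible, the counter can only ever go up, which is exactly the property a rollback floor needs.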
       
        IshKebab wrote 16 hours 43 min ago:
        Why? What advantage do they get from this? I'm assuming it's not a good
        one but I'm struggling to see what it is at all.
       
          jeroenhd wrote 15 hours 27 min ago:
          They patched a low-level vulnerability in their boot process. Their
          phones' debug features would allow attackers to load an old,
          unpatched version of their (signed) software and exploit it if they
          didn't do some kind of downgrade prevention.
          
          Using eFuses is a popular way of implementing downgrade prevention,
          but also for permanently disabling debug flags/interfaces in
          production hardware.
          
          Some vendors (AMD) also use eFuses to permanently bond a CPU to a
          specific motherboard (think EPYC chips for certain enterprise
          vendors).
       
          hexagonwin wrote 16 hours 31 min ago:
          They can kill custom roms and force the latest vendor firmware. If
          they push a shitty update that slows down the phone or something,
          users have no choice other than buying a new device.
       
            bcraven wrote 16 hours 25 min ago:
            The article suggests custom roms can just be updated to be 'newer'
            than this.
            
            At the moment they're 'older' and would class as a rollback, which
            this fuse prevents.
       
        Retr0id wrote 16 hours 44 min ago:
        Blind speculation: I wonder if this is in some way related to DRM
        getting broken at a firmware level, leading to a choice being made
        between "users complain that they can't watch netflix" and "users
        complain that they can't install custom ROMs".
       
          dcdc123 wrote 16 hours 28 min ago:
          It was because a method was discovered to bypass the lockout of
          stolen devices.
       
            userbinator wrote 16 hours 3 min ago:
            In other words the same old boogeyman they always use to justify
            this crap.
       
              dcdc123 wrote 12 hours 11 min ago:
              From what I understand this does not prevent use of custom ROMs,
              it just means ROMs built before it was done will not work
              anymore. I assume they can re-package old versions to work with
              the new configuration, I am not entirely sure though. There are
              discussions elsewhere in this thread with more informed people.
       
                userbinator wrote 11 hours 25 min ago:
                 > it just means ROMs built before it was done will not work
                 anymore.
                 
                 From the article:
                 
                 > Any subsequent attempt to install older firmware results
                 in a permanent "hard brick" - the device becomes unusable
                
                This implies that not only does an older custom ROM not work,
                but neither does attempting to recover by installing a newer
                ROM.
       
        raizer88 wrote 16 hours 56 min ago:
        You either die a hero, or live long enough to see yourself become the
        villain
       
          veunes wrote 3 hours 28 min ago:
          OnePlus is a textbook case of that quote
       
          Raed667 wrote 16 hours 42 min ago:
          I think the writing has been on the wall since they started their
          Nord line.
       
            Sebb767 wrote 16 hours 27 min ago:
            What was the issue with the Nord line?
       
              em-bee wrote 15 hours 55 min ago:
              yeah, i'd like to know that too. i have a oneplus nord running
              /e/OS and i am quite happy with it. in fact it's probably the
              best phone i had so far performance wise (i got it refurbished at
              a very good price which may have something to do with that
              though)
       
            alluro2 wrote 16 hours 31 min ago:
            Do you mean because the previous "flagship killer" company now
            needed a "flagship killer" sub-brand, since they could no longer be
            categorised as such?
       
              Raed667 wrote 16 hours 29 min ago:
               Exactly, why did they end up in a situation where they are
               making killers of their "main" phones?
       
                zozbot234 wrote 16 hours 15 min ago:
                Because all midrange phones are "flagship killers" on a
                features basis now, flagships are just about the exclusivity. 
                The market has adapted and the term no longer makes much sense.
                 OnePlus still leads on custom ROM support though, e.g. no
                special codes or waiting times needed for unlocking the
                bootloader, it all works out of the box with standard commands.
       
                  microtonal wrote 1 hour 52 min ago:
                   > OnePlus still leads on custom ROM support though, e.g.
                   no special codes or waiting times needed for unlocking
                   the bootloader, it all works out of the box with standard
                   commands.
                  
                  Google Pixel would like to have a word. Though they regressed
                  since they stopped shipping device trees in AOSP.
       
       
 (DIR) <- back to front page