[HN Gopher] Intel OEM Private Key Leak: A Blow to UEFI Secure Bo...
___________________________________________________________________
Intel OEM Private Key Leak: A Blow to UEFI Secure Boot Security
Author : transpute
Score : 377 points
Date : 2023-05-06 17:39 UTC (5 hours ago)
(HTM) web link (securityonline.info)
(TXT) w3m dump (securityonline.info)
| TheMagicHorsey wrote:
| Why do people think that these security schemes based on a
| "secure" central key are ever going to work in the long run?
| The most secret documents of the US state are leaked periodically
| by dumbasses, and yet the FBI thinks they can keep backdoor keys
| to all our devices secure.
|
| Give me a break.
| rolph wrote:
| weighing in at 1.5TB, it should take a while for the first seeds
| to appear.
| flangola7 wrote:
| A 1.5TB private key?
| rolph wrote:
| the entire leak was said to be 1.5TB
| inciampati wrote:
| The total exfiltrated data.
| capableweb wrote:
| How poor must your security be at such a big company for no one
| or no system to detect 1.5TB of data being exfiltrated to a
| remote host? Especially supposedly extra-sensitive data like
| private keys...
| teddyh wrote:
| Link?
| rolph wrote:
| first paragraph
|
| >>In April, MSI fell victim to a cyberattack perpetrated by
| the ransomware group Money Message, who successfully
| infiltrated MSI's internal systems and exfiltrated a
| staggering 1.5TB of data, predominantly comprising source
| code.<<
|
| https://securityonline.info/intel-oem-private-key-leak-a-
| blo...
| [deleted]
| discerning_ wrote:
| If these keys are leaked, they should be adopted by open source
| projects to disable secure boot.
| fowtowmowcow wrote:
| I'm not sure that they can if the key is proprietary to Intel.
| I think this would open up those projects to litigation.
| einarfd wrote:
| There seems to be a bit of a precedent with the AACS encryption
| key that got leaked
| (https://en.m.wikipedia.org/wiki/AACS_encryption_key_controve...).
| The suppression of that key seems to have failed; it was widely
| copied, and you can even find a copy of it in my link to
| Wikipedia.
| zapdrive wrote:
| It's just a string of characters.
| Xorlev wrote:
| Software, movies, and music are just strings of bits.
|
| Using something leaked always carries some inherent risk.
| realusername wrote:
| The difference is that software and music are made by authors,
| unlike keys; that's what makes them copyrightable.
| brookst wrote:
| So are bomb threats and false advertising.
|
| I don't think "it's just characters" is a one-simple-trick.
| ok123456 wrote:
| you make a mathematical formula that generates the key.
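|
| In the spirit of the old "illegal prime" trick: a key is just a
| number, so any expression that evaluates to that number
| reproduces it. A toy Python illustration with a made-up value
| (not any real key):
|
|     key_bytes = bytes.fromhex("deadbeef" * 8)  # made-up 32-byte "key"
|     n = int.from_bytes(key_bytes, "big")       # the key as one integer
|     # Any formula that evaluates to n "generates" the key again:
|     assert n.to_bytes(32, "big") == key_bytes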
| realusername wrote:
| > I'm not sure that they can of the key is proprietary to
| Intel. I think this would open up those projects to
| litigation
|
| Depends on the legislation.
|
| That's questionable in the US since the keys are 100% machine
| generated and thus not copyrightable.
|
| In most of the EU it's clear, though: there are interoperability
| exceptions, and those keys can be shared freely.
| iforgotpassword wrote:
| I think this is not general enough. What would be needed is the
| Microsoft secure boot private key so we can just sign EFI
| binaries and have them work everywhere without mucking around
| in the bios setup.
|
| Afaiu, this key is specific to certain generations of Intel
| CPUs.
| Arnavion wrote:
| Is there any platform using Intel CPUs with Boot Guard where
| Secure Boot can't already be disabled?
| NelsonMinar wrote:
| On one of my systems disabling secure boot also disables
| other aspects of the BIOS. I forget what, maybe use of the
| Intel graphics on the chip? It was severe enough I spent an
| hour figuring out how to make secure boot work instead.
| Arnavion wrote:
| Which system?
| meepmorp wrote:
| But secure boot is a good thing! I want my machines to verify
| what they're loading at boot time!
|
| I just want to specify the root of trust.
| yyyk wrote:
| There's mokutil to add your own key.
| ranger_danger wrote:
| Why would you want to disable secure boot? Personally I'd
| rather not have software able to modify my bootloader.
| AshamedCaptain wrote:
| Software can still modify the bootloader. Secure Boot does
| not protect against that. It just will complain on the next
| boot .... unless the replacement bootloader has been signed
| with the MS signature, the BIOS manufacturer signature, the
| OEM signature, or a bazillion other signatures.
|
| Even if you were to completely replace all of the signatures
| with your own, you are going to have to trust some of the
| MS/manufacturer ones (unless you replace all the
| manufacturer-signed firmware modules with your own).
| Arnavion wrote:
| >unless you replace all the manufacturer-signed firmware
| modules with your own
|
| ... of which there might not be any. E.g. none of my half-
| dozen SB-using systems (desktops and laptops) have anything
| in the ESP other than the bootloader and UKIs I put there.
| codedokode wrote:
| Isn't it good? Does the leaked key mean that owners of hardware
| will now be able to read and modify the firmware, including the
| IME, and check it for backdoors?
|
| Such keys should be in the hands of users, not Intel.
| QuiDortDine wrote:
| If there was something to leak, it was always going to. Just a
| matter of when. Pretending otherwise is just security theater.
| conradev wrote:
| Yeah, it is puzzling that the key was able to be leaked in
| the first place. The key should have been in an HSM.
| er4hn wrote:
| Same thing with Samsung and their key leak.
|
| Part of the blame, imo, lies with how clunky tools are at
| the lower levels. I've seen plenty of hardware based
| signing protocols that don't allow for key hierarchies.
|
| Higher level tools push this along as well. Hashicorp Vault
| also, last I checked, doesn't allow for being a front end
| to an HSM. You can store the master unlock key for a Vault
| in an HSM, but all of the keys Vault works with will still
| be in Vault, in memory.
| 19h wrote:
| Pfft, keys, schmeys. Real security is built on handshakes and
| backroom deals, not strong encryption.
| cassepipe wrote:
| Didn't get it
| Y_Y wrote:
| What's the cryptographic definition of a "backroom deal"?
| Can I do it with Ed25519?
| efitz wrote:
| No, but you can with the curves that the NSA proposed to
| NIST.
| guerrilla wrote:
| Yeah, don't depend on a permanent global conspiracy for your
| security. Someone always defects and accidents often happen
| long before that.
| henriquez wrote:
| It is not a conspiracy. Just like the iOS App Store it is
| for your own protection. There is no legitimate reason to
| run your own software on general purpose computing
| hardware.
| ChrisClark wrote:
| /s I hope. ;)
| brookst wrote:
| Doesn't really matter /s or not, it's a ridiculously
| reductive and extremist position either way.
|
| Security is about tradeoffs, most notably security vs
| convenience, but also many others.
|
| Anyone who suggests that their personal preferences in
| tradeoffs are not just universally correct but also the
| only reasonable position to hold is just silly.
| hammock wrote:
| That's the same argument that people use to support the
| second amendment (the people's right to bear arms)
| [deleted]
| Y_Y wrote:
| Hey, the second amendment says the right to bear arms
| shall not be infringed, it doesn't say it exists!
| aksss wrote:
| "keep and bear" :^)
| [deleted]
| brookst wrote:
| Is everything that is going to fail eventually just useless
| theater? Like new cars are just transport theater because they
| will have to be junked eventually?
|
| I agree that master private keys are bad security design, and
| we can and should do better. I'm just not willing to say that
| all past security value is retroactively nullified. That
| feels polemic more than realistic.
| htag wrote:
| There's a difference between temporary security and
| security theater.
|
| Real but temporary security -> This 2048 bit key you
| generated will be commercial grade protection until at
| least 2030. Sometime after that computers will be strong
| enough to brute force it. Do not store anything with this
| key that will still be highly sensitive in 7 years. It's
| possible the underlying algorithm gets cracked, or a leap in
| quantum computing happens that will make the key obsolete
| sooner.
|
| Security theater -> All software running on this chip must
| be signed with our master key. Please trust all software we
| sign with this key, and no malicious party will have access
| to it. You are not allowed to run arbitrary software on
| your hardware because it is not signed with our key.
|
| In the first case, the security is real. You own the lock,
| you own the key, and you control the entire security
| process. In the second case, you own neither the lock nor
| the key, and you basically have limited access to your own
| hardware.
| hilbert42 wrote:
| Absolutely. My first thought was 'ah now I can modify my BIOS
| the way I want it'.
| tapoxi wrote:
| Realistically it means a lot more people are going to cheat in
| Valorant.
| shrimp_emoji wrote:
| Oh no! Here, please, backdoor my OS with a kernel anticheat
| -- anything that saves me from cheaters in the current bideo
| game of the month! D:
| juliusgeo wrote:
| Riot's anti-cheat is quite invasive, but Valorant is a
| competitive ranked first-person shooter; allowing cheaters
| violates the integrity of any ranking system of players,
| and that ranking system is one of the primary appeals of
| the game.
| CircleSpokes wrote:
| I honestly don't understand why people act like this.
| Wanting to be able to ensure firmware isn't maliciously
| modified is a good thing. Open firmware is also a good idea
| obviously but there has to be a way to ensure firmware is
| signed either by OEM or your own keys like secure boot.
|
| As for games, lots of people play games and want good
| anticheat. If you don't like that, you don't have to play
| those games, but there's no need to act the way you are just
| because other people want decent anticheat.
| codedokode wrote:
| > As for games, lots of people play games and want good
| anticheat
|
| Great, let's install a backdoor in every computer so that
| some people can play games and watch movies. No. A computer
| is a thing for computing numbers, not a replacement for a
| TV.
| CircleSpokes wrote:
| I can't take people like you seriously. The anticheat
| isn't a backdoor. It doesn't ship with the operating
| system or come preinstalled in any way. You opt into it
| when you play the game. Literally nothing is forcing you
| to use it or have it installed on your computer.
|
| I understand this is the internet and being super
| dramatic is part of it but can we please be for real for
| one moment?
| thomastjeffery wrote:
| 1. It doesn't actually work.
|
| 2. All it _actually_ does is keep users trapped in
| Windows. God forbid anyone actually use Linux, or even a
| VM!
|
| The only actually effective anti-cheat is the original:
| moderation.
|
| Now that users aren't able to host their own servers,
| they can't do moderation. Game studios don't want to do
| moderation themselves, so they keep trying (and failing)
| to replace it with automated anticheat systems.
| [deleted]
| kortilla wrote:
| >honestly don't understand why people act like this.
|
| Because it's social pressure to compromise your computer
| to a gaming company to get to play a game.
|
| People don't care about the anticheat on their computer,
| they want it foisted on everyone else who plays, which is
| a sucky proposition for privacy and security minded
| people.
|
| It's like advocating for the TSA to be controlling access
| to the grocery store because you want to feel safe there
| and don't mind the privacy violation.
| CircleSpokes wrote:
| >People don't care about the anticheat on their computer,
| they want it foisted on everyone else who plays, which is
| a sucky proposition for privacy and security minded
| people.
|
| No, they want games without hackers, which kernel-based
| anticheats help with. Can it also impact privacy and
| security? Yes, no doubt, but so can any program running on
| the computer, even in userspace. Remember we are talking
| about kernel anticheats on Windows lol.
|
| If you are really worried about it you could dual boot
| like many people. Either way this whole argument seems
| silly to me.
| charcircuit wrote:
| >to compromise your computer
|
| What do you mean by this? As the user you are intending
| to have the game and its anticheat run. Having to
| download and run a game on your computer isn't
| compromising your computer either. Maybe the only thing
| which doesn't give the game company power to run
| potentially malicious code on your machine is cloud
| gaming. That also solves the cheating problem at least.
| fafzv wrote:
| So don't play the game. Personally I want kernel level
| anticheats because they make it much harder to cheat in the
| game. I want to know that my opponents are not cheaters.
| That's something I don't have in CS:GO, a game rife with
| cheaters, or TF2, a game rife with bots. (Valve's usermode
| anticheat is absolutely useless.)
| codedokode wrote:
| Make every player pay a deposit which is confiscated when
| they get caught cheating. Make servers with different
| deposit levels, so that people who really care about
| cheating pay over $1000 for example.
|
| Better than having keys which I cannot control on my
| computer. And I don't play games anyway.
| von_lohengramm wrote:
| Yet it's still pretty dang easy to bypass VGK and cheat
| in Valorant if you even slightly know what you're doing.
| Now you have the worst of both worlds. In theory, Valve's
| VACnet and Trust Factor are the ideal solutions, but in
| practice... not so much.
| fafzv wrote:
| How is VAC the ideal solution? It is weak even in theory.
| von_lohengramm wrote:
| VACnet, not VAC. Server-side ML model analyzing player
| actions influencing their Trust Factor (or just straight
| up banning in more egregious cases).
| charcircuit wrote:
| Wanting to play competitive games without cheaters is
| something that real users actually want and they get real
| value from. Your mockery of these people doesn't remove the
| value they get from being able to play without cheaters.
| codedokode wrote:
| Having encryption, keys and software that I cannot
| control because game makers and copyright owners want
| more profit is ridiculous.
| biosboiii wrote:
| I've read a lot of anti-cheat RE in the past; it seems like the
| cheater/modder people have found their way to the infosec
| community. Can you elaborate on how this would accelerate
| Valorant cheating? Is their Vanguard thing using some Intel
| feature?
| r1ch wrote:
| Their anti cheat is a kernel level driver and requires
| secure boot to make sure it loads before anything could
| potentially tamper with the system.
| jsheard wrote:
| Doesn't it only require secure boot on Windows 11? For
| now you can get around that requirement by simply staying
| on Windows 10, until they retire support for that.
| r1ch wrote:
| Yes, the mandatory TPM and secure boot only applies to
| Windows 11. I'm sure they're eager to drop Windows 10
| support as soon as they're able to.
| CircleSpokes wrote:
| You are correct. Secure boot is not required to play
| valorant on windows 10.
| hnthrowaway0315 wrote:
| Is there any tutorial that I can learn to do it? Should I
| Google "dump Intel firmware" or some other more specific ones?
| I'm going to do some research after going through my training
| this afternoon.
| mjg59 wrote:
| Nothing's prevented you from reading the firmware - this is a
| _signing_ key, not an encryption key. Multiple people have
| spent time reverse engineering the ME firmware, people have
| found bugs but no evidence of a backdoor.
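|
| A minimal Python sketch of that distinction (illustrative names
| and parameters, not Intel's actual scheme): the public half can
| only verify, so the firmware was always readable; only the
| leaked private half can mint new, valid signatures.
|
|     from cryptography.hazmat.primitives import hashes
|     from cryptography.hazmat.primitives.asymmetric import (
|         padding, rsa)
|
|     # Stand-in for an OEM signing key pair (illustrative only).
|     private_key = rsa.generate_private_key(
|         public_exponent=65537, key_size=3072)
|     firmware = b"firmware image bytes"
|
|     # Signing needs the private key -- the part that leaked.
|     sig = private_key.sign(
|         firmware, padding.PKCS1v15(), hashes.SHA256())
|
|     # Verification (what the boot ROM does) only needs the
|     # public key; it raises an exception on a bad signature.
|     private_key.public_key().verify(
|         sig, firmware, padding.PKCS1v15(), hashes.SHA256())
|
|     # Nothing here is encrypted: the firmware bytes stay readable.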
| derefr wrote:
| In general, a bad thing for security. But is there any silver
| lining here related to jailbreaking proprietary Intel-based
| devices?
| Macha wrote:
| Or things like DRMed media systems
| hammock wrote:
| The leaked key means that now owners of hardware will be able
| to read and modify the firmware, including IME, and check it
| for backdoors
| yyyk wrote:
| Is there only one 'private key', or do they use an intermediate
| cert kind of setup where they could revoke MSI via a BIOS update?
| andromeduck wrote:
| You can always factory reset.
| SXX wrote:
| What factory reset? PC motherboards are not Android phones.
| When you flash a new BIOS you can't get back to the old one
| without flashing it again, and new BIOS versions released after
| this incident will likely forbid flashing older versions.
| pseudo0 wrote:
| On many modern motherboards there is a backup bios that you
| can boot into by shorting certain pins. This can be done in
| about 10-15 minutes by a person with a metal paperclip,
| some basic technical knowledge and instructions from a
| YouTube video. I don't think there is even a mechanism to
| update this backup version, it's just a "known good" bios
| shipped with the hardware so that a bad bios update does
| not brick the device.
|
| So even if they push an update, people can pretty easily
| downgrade to a vulnerable version if they want to.
| SXX wrote:
| I doubt there are actually many boards with dual BIOS.
| For a while Gigabyte had a lot of them, but for many
| years now most of their boards don't have this feature.
| jnsaff2 wrote:
| Hah, funny, just this week finished the new Doctorow book Red
| Team Blues [0] which has a similar aspect.
|
| [0] https://craphound.com/category/redteamblues/
| Y_Y wrote:
| You'll need to have spare fuses to burn or something. All
| memory is volatile if you're motivated enough.
| wmf wrote:
| Yeah, Danny should never have bought those keys. Don't forget
| _Rainbows End_ either.
| guenthert wrote:
| Does that mean I can update my eol'd Chromebook Pixel with a
| current Linux distribution w/o disassembling it?
| surteen wrote:
| Chromebooks don't use UEFI, just coreboot
| sylware wrote:
| This "security" is "security" against you. Don't fool yourself.
| thathndude wrote:
| This was always a dumb idea. No different than a "master" TSA
| key. All it does is create a single point of failure.
| rektide wrote:
| Kind of/almost a good thing. More and more security processors
| seem to have irrevocable keys or other non-user setups. It's.
| Just. Not. OK.
|
| And more and more governments are making demands about
| decrypting users' data on demand, about blowing up security for
| their own greedy needs. They have no idea of the awful, horrific
| mess, the buggy-by-design systems we get when we forgo
| security for their paranoid snooping. This is such a textbook
| lesson. Alas that we need another blow to the face to remind
| ourselves.
| Woodi wrote:
| You are of course wrong! Because everyone needs to buy
| new... everything every 2 years! It's good for some :>
| scarface74 wrote:
| Do you actually have anything in your checked bag that you care
| if it gets stolen?
| rvba wrote:
| It was a genius idea - you cannot install Windows 11 on an old
| computer. So you need to buy a new one.
|
| Monopoly practice hidden as security.
| tredre3 wrote:
| This has nothing to do with TFA; you're thinking of the
| TPM 2.0 requirement, which is unrelated to Secure Boot.
|
| Secure Boot is part of UEFI. TPM 2.0 is used only by BitLocker
| (at least for the average person; enterprises do store other
| keys in it).
| chinabot wrote:
| 3 people can keep a secret if two of them are dead. Good luck
| with hundreds!
| throwawaymanbot wrote:
| [dead]
| teddyh wrote:
| The original links:
|
| Announcement:
| http://blogvl7tjyjvsfthobttze52w36wwiz34hrfcmorgvdzb6hikucb7...
| (link likely to be invalid over time, since the number seems to
| change)
|
| Link with the actual data:
| http://vkge4tbgo3kfc6n5lgjyvb7abjxp7wdnaumkh6xscyj4dceifieun...
|
| (Links found via https://www.ransomlook.io/group/money%20message)
| jokowueu wrote:
| GitHub link below
|
| https://github.com/binarly-io/SupplyChainAttacks/blob/main/M...
| teddyh wrote:
| IIUC, aren't those the _public_ keys, and not the actual,
| leaked, private keys?
| transpute wrote:
| Related thread,
| https://twitter.com/binarly_io/status/1654287041339998208
| csdvrx wrote:
| How can you scan your firmware on linux, without running an
| unknown payload, to know if you're affected?
|
| How can you test the leaked key, say, to edit your BIOS, re-sign
| it, then reflash it?
| transpute wrote:
| _> How can you scan your firmware on linux, without running
| an unknown payload, to know if you're affected?_
|
| According to the Twitter thread above, you would upload the
| original OEM firmware for your device to Binarly's web portal
| at https://fwhunt.run. The firmware file matching your device
| could be obtained from the OEM's website, rather than the
| running system. I haven't tried this myself and don't know if it
| requires unpacking or pre-processing the OEM firmware file
| format.
| csdvrx wrote:
| They must be doing the equivalent of a grep.
|
| It seems safe, but I'd rather do that locally.
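|
| A rough local equivalent is just a byte-pattern search over the
| firmware image you downloaded from the OEM. The patterns below
| are placeholders, not the real published key hashes -- you'd
| substitute whatever indicators Binarly (or others) publish:
|
|     import sys
|
|     # Placeholder byte patterns; substitute the published key
|     # hashes / public key blobs you want to look for.
|     SUSPECT_PATTERNS = [
|         bytes.fromhex("00112233445566778899aabbccddeeff"),
|     ]
|
|     def scan(path):
|         blob = open(path, "rb").read()
|         for pat in SUSPECT_PATTERNS:
|             off = blob.find(pat)
|             while off != -1:
|                 print(f"match at {off:#x}: {pat.hex()}")
|                 off = blob.find(pat, off + 1)
|
|     if __name__ == "__main__":
|         scan(sys.argv[1])  # e.g. the BIOS image from the OEM site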
| transpute wrote:
| Maybe someone could add key manifest inspection to this
| OSS tool, https://fiedka.app.
|
| Hopefully Intel and OEMs will make official statements
| soon.
|
| If you're copying a firmware file from the OEM's website
| to Binarly's website, then receiving a text report, they
| would have an IP address, browser fingerprint and device
| model number, but little else.
| LZ2DMV wrote:
| From the linked article, I'm left with the impression that this
| is only a problem for MSI (and a few other vendors) devices.
|
| If Intel Boot Guard works by including a public key in a fuse in
| all CPUs from a set of series and now the corresponding private
| key is leaked, why isn't this a global problem? The same CPU with
| the same public key must be in every machine with an Intel CPU
| from these generations. What am I missing here?
| transpute wrote:
| In addition to the BootGuard public key, there is a chipset
| fuse with OEM configuration,
| https://www.securityweek.com/flawed-bios-implementations-lea...
|
| _> The boot chain uses an RSA public key (its hash is hard-
| coded inside the CPU) and an OEM private key. The OEM sets the
| final configuration and writes it to one-time-programmable
| Intel chipset fuses during the manufacturing process, thus
| making it almost impossible for an attacker to modify the BIOS
| without knowing the private part of the OEM Root Key. However,
| because some OEMs might fail to properly configure Intel Boot
| Guard, attackers could end up injecting code and permanently
| modifying BIOS.
|
| > At Black Hat 2017, security researcher Alex Matrosov
| presented some vulnerabilities in poor BIOS implementations,
| explaining that not all vendors enable the protections offered
| by modern hardware. Because of that, attackers could elevate
| privileges, bypass protections, and install rootkits, he
| explained._
|
| Some HP business devices don't use Intel BootGuard, because HP
| has their own proprietary solution for firmware integrity
| verification, https://news.ycombinator.com/item?id=35845073
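|
| A conceptual sketch of why a single fused hash pins the whole
| chain (the exact fields, hash and signature scheme Intel uses
| differ; this just shows the shape). The fuses hold only a hash
| of the OEM root public key; the full key and a signed key
| manifest live in flash:
|
|     import hashlib
|
|     def boot_guard_style_check(fused_key_hash, oem_public_key,
|                                key_manifest, manifest_sig,
|                                verify_sig):
|         # 1. The public key shipped in flash must match the fuses.
|         if hashlib.sha256(oem_public_key).digest() != fused_key_hash:
|             return False
|         # 2. The key manifest must be signed by that key.  With the
|         #    OEM private key leaked, an attacker can now satisfy
|         #    this step for arbitrary firmware as well.
|         return verify_sig(oem_public_key, key_manifest, manifest_sig)
|
| Here verify_sig is a stand-in for an RSA signature check.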
| pmorici wrote:
| Why would MSI have Intel's private key? How private could it have
| been if they were handing it out to OEMs?
| yarg wrote:
| Unless and until we get to efficient homomorphic compute, these
| measures will only ever be security via obscurity.
| bawolff wrote:
| I don't see how homomorphic encryption is particularly
| applicable to secureboot.
| yarg wrote:
| You want to be able to deploy and execute code outside the
| control of whoever physically controls the machine.
|
| Either you implement it with security features hidden from
| the device holder, in which case it will always be broken
| eventually, or you guarantee the capabilities with
| mathematics - in which case a security break cannot happen
| even if the physical machine's description is completely
| public.
|
| There are certainly layers to this that I'm missing, but I
| think homomorphic compute is the only unbreakable answer to
| secure compute in general.
| dist-epoch wrote:
| > You want to be able to deploy and execute code outside
| the control of whoever physically controls the machine.
|
| Microsoft solved this problem on the latest Xbox. Many
| years after it was launched, it's still not jail-broken.
|
| They are now working on bringing that technology to regular
| Windows/PCs - Pluton.
| bawolff wrote:
| My understanding (which might be wrong, crypto is a complex
| topic and I am an amateur) is that homomorphic encryption would
| hide the data being worked on from the algorithm working on it.
| Here we want to verify that the (non-secret) algorithm has been
| approved (code signing), which we then run on non-secret
| data. I don't think homomorphic encryption can help with
| that since it's kind of a different problem.
|
| The issue here, of the key holder leaking the key, also
| seems impossible to work around in general, since the
| requirements are: 1) there exists someone who can sign
| code. 2) that person cannot screw up (e.g. leak the key)
| and allow the wrong code to be signed. These are pretty
| contradictory requirements, that no amount of crypto can
| fix. Ultimately it is a social problem not a technical one;
| there is no full technical definition of misusing a key.
| There are things that can help - like HSMs, splitting the
| key between multiple parties, having better methods of
| revoking and replacing compromised keys (hard without
| network access and an unwillingness to brick old devices).
| Not the same domain, but AACS is an interesting example of
| a system somewhat resilient to key compromise.
| yarg wrote:
| There's a good chance that I'm conflating some ideas
| here, but I think there might be a kernel of something
| that isn't completely useless.
|
| I'm not sure if it's possible (given that there's overlap
| with public key/private key encryption it may be), but I
| think that if you could produce a homomorphic computer
| capable of plain text export, this would be a resolvable
| problem.
| btilly wrote:
| I do not want malware authors to be able to run code on my
| machine outside of my control. That prevents me from
| knowing whether it is installed, what it is doing, or to
| have a way to get rid of it.
|
| Homomorphic encryption allows someone's interests to be
| secured. But I'm dubious that I'm the one who will actually
| benefit here.
| yarg wrote:
| Then don't run code from untrusted sources.
|
| This also has major implications for cloud compute.
| btilly wrote:
| That's not a realistic answer.
|
| Do you have any idea how much software is on the average
| consumer device, and how poorly equipped the average
| consumer is to determine its provenance let alone decide
| what is trustworthy?
|
| Not to mention that there are economic reasons to run
| untrusted software. For example no matter how little I
| trust Zoom and Slack, I don't have a job if I am not
| willing to run them.
| Sunspark wrote:
| I like the approach of having a dedicated PC and phone
| that is permitted to be riddled with remote-management
| and malware and used only for those purposes, and my own
| devices completely separate.
| Animats wrote:
| There's an upside to this. It can be used politically as an
| argument against backdoors for "lawful access"[1] to encrypted
| data.
|
| [1] https://www.fbi.gov/about/mission/lawful-access
| TheRealPomax wrote:
| You're going to have to explain how, grounded in a reality
| where politicians fundamentally do not understand the idea of
| security and have to make decisions based on who sounded the
| most confident when they argued complete nonsense.
| barnabee wrote:
| The upside is bigger than the downside, even if it has no
| immediate effect.
|
| There is no scheme that only gives good people access. We can
| and must hammer this home.
| tzs wrote:
| How so?
| smoldesu wrote:
| It's funny how blatantly the NSA doesn't care. They feign
| ignorance when asked or FOIA-requested about it, but then also
| ask for a backdoor to opt out of the Intel Management
| Engine[0]. It's like there's no coordinated effort to deny that
| it's an extreme vulnerability.
|
| [0]
| https://en.wikipedia.org/wiki/Intel_Management_Engine#%22Hig...
| voidfunc wrote:
| The argument doesn't matter because the federal government and
| politicians don't give a shit about facts
| hilbert42 wrote:
| Until their PCs get hacked and their medical and
| psychiatrists' notes about them become front page news.
| TheRealPomax wrote:
| Sorry, what? This literally happened, THIS YEAR, and not a
| single one cared beyond saying "oh no, this is terrible, if
| only there was something we could have done!"
|
| https://apnews.com/article/congress-data-breach-hack-
| identit...
| LadyCailin wrote:
| Alternatively: Stuff is going to be leaked anyways, so we
| might as well also make it easy for law enforcement to do
| their job.
| Guthur wrote:
| Their job is not what you think it is.
|
| It's to maintain the status quo no matter how corrupt or
| abhorrent it is. The word enforcement says it all.
| nvy wrote:
| Of course that's their job. The police force is not
| designed to be a vehicle for social change nor for
| justice.
|
| The way the system is supposed to work is that engaged
| citizenry actively overhaul unjust laws and apparatuses,
| and the police then enforce those new laws.
|
| Unfortunately we have abysmally low civic engagement in
| most of the western world which leads to the mess we are
| currently in.
|
| I like to make fun of the French as much as anyone else
| but I really respect and admire the French people's
| propensity for protest and to stand up for what they
| believe. That's advanced citizenship in action.
| hilbert42 wrote:
| Right. There'll be more, eventually something truly
| salacious will turn up. When it does keep well away from
| fans.
| joering2 wrote:
| If "something truly salacious will turn up", I would bet
| the source politician will deny it and would want the
| whole thing to be forgotten as quickly as possible, not
| to work on a bill against it to pass it bipartisan,
| because he was an embarrassment case of a leak.
|
| Source: observing last 25 years of US politics.
| ciabattabread wrote:
| I thought you were referring to Australia's hack.
|
| https://amp.theguardian.com/world/2022/nov/11/medical-
| data-h...
| bb88 wrote:
| Remember when the CIA hacked the Senate? No?
|
| https://www.nytimes.com/2014/08/01/world/senate-
| intelligence...
|
| I'm pretty sure a lot of lawmakers still take this shit
| seriously.
| bee_rider wrote:
| The generation currently in charge thinks psychology is a
| dirty word. And even if they would do us all a favor and
| get some help, they probably wouldn't use an app to get it.
|
| By the time millennials and gen z are running the place,
| we'll have moved on to misunderstanding AI or something
| like that.
| actionfromafar wrote:
| Nor do most voters, so it all balances out.
| [deleted]
| crest wrote:
| Does this mean we're finally closer to getting CoreBoot on Intel
| consumer boards from this decade, because everyone can sign a
| valid image?
| fatfingerd wrote:
| The most alarming part of the article is that we are only really
| getting a revocation of these keys because they didn't pay a
| ransom and the ransomers were apparently too stupid to sell them
| secretly instead of releasing them to the public.
|
| As far as we know, if MSI had paid no one would know that Intel
| shipped shared private keys to multiple vendors who could then
| lose them like drunken monkeys.
|
| People ask why these weren't on HSMs.. The article seems to claim
| that they weren't even able to generate the most important ones
| in the correct locations, let alone on HSMs with non-extractable
| settings.
| bawolff wrote:
| It's kind of surprising that such a high-value key wasn't split
| into multiple subkeys.
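|
| A standard way to avoid a single person or machine ever holding
| the whole key is Shamir secret sharing; a toy sketch (small
| field, no side-channel care -- real setups do M-of-N custody
| inside an HSM rather than in hand-rolled code):
|
|     import random
|
|     PRIME = 2**127 - 1  # toy field size
|
|     def split(secret, n, k):
|         # Random degree-(k-1) polynomial with f(0) = secret.
|         coeffs = [secret] + [random.randrange(PRIME)
|                              for _ in range(k - 1)]
|         def f(x):
|             return sum(c * pow(x, i, PRIME)
|                        for i, c in enumerate(coeffs)) % PRIME
|         return [(x, f(x)) for x in range(1, n + 1)]
|
|     def recover(points):
|         # Lagrange interpolation at x = 0.
|         secret = 0
|         for i, (xi, yi) in enumerate(points):
|             num, den = 1, 1
|             for j, (xj, _) in enumerate(points):
|                 if i != j:
|                     num = num * -xj % PRIME
|                     den = den * (xi - xj) % PRIME
|             secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
|         return secret
|
|     shares = split(123456789, n=5, k=3)
|     assert recover(shares[:3]) == 123456789  # any 3 of 5 suffice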
| wmf wrote:
| We've had HSMs for decades and Intel isn't forcing their OEMs
| to use them. This is pretty sad.
| transpute wrote:
| https://twitter.com/matrosov/status/1654930508252581888
| bawolff wrote:
| Sure, but Intel is ultimately left holding the bag here, not
| the OEM, and it was totally within their power to put
| stipulations in the contract around key management.
| userbinator wrote:
| Yes! FUCK "secure" boot! Another win for freedom.
|
| On the other hand, never underestimate the ability of the
| "security" establishment to spin this as a bad thing and instill
| more fear and paranoia.
|
| All these user-hostile in-the-name-of-"security" features do is
| take control away from you and put it in the hands of some
| centralised entity whose interests certainly do not align
| completely with yours.
| ok123456 wrote:
| This may just be the year of the TempleOS desktop.
| DANmode wrote:
| I bet you can name three superpowers using something similar
| for lots of important things!
| ok123456 wrote:
| they already had keys.
| mesebrec wrote:
| Does this have an effect on SGX and trusted computing? Or only
| secure boot?
| transpute wrote:
| Need to wait for an official statement from vendors, but
| there's a claim about CSME,
| https://twitter.com/_markel___/status/1654625944697556992
| TacticalCoder wrote:
| To all those saying SecureBoot brings absolutely nothing security
| wise...
|
| Why is a project like, say, Debian, even bothering to sign
| kernels:
|
| https://wiki.debian.org/SecureBoot
|
| What's their rationale for supporting SecureBoot?
| neilv wrote:
| That wiki page buries what might be the rationale in the "What
| is UEFI Secure Boot?" section:
|
| > _Other Linux distros (Red Hat, Fedora, SUSE, Ubuntu, etc.)
| have had SB working for a while, but Debian was slow in getting
| this working. This meant that on many new computer systems,
| users had to first disable SB to be able to install and use
| Debian. The methods for doing this vary massively from one
| system to another, making this potentially quite difficult for
| users._
|
| > _Starting with Debian version 10 ( "Buster"), Debian included
| working UEFI Secure Boot to make things easier._
|
| Sounds plausible, but I don't know how seriously to take it,
| when that wiki page also includes very generous and
| regurgitated-sounding bits like:
|
| > _UEFI Secure Boot is_ not _an attempt by Microsoft to lock
| Linux out of the PC market here; SB is a security measure to
| protect against malware during early system boot. Microsoft act
| as a Certification Authority (CA) for SB, and they will sign
| programs on behalf of other trusted organisations so that their
| programs will also run. There are certain identification
| requirements that organisations have to meet here, and code has
| to be audited for safety. But these are not too difficult to
| achieve._
|
| I normally look to Debian to be relatively savvy about
| detecting and pushing back against questionable corporate
| maneuvers, but it's not perfectly on top of everything that
| goes on.
| stonogo wrote:
| Can you provide examples of such pushback from Debian? I
| always viewed them as a typically understaffed, underfunded
| volunteer effort without the resources to push back against
| funded technology. I'm ready to be wrong on this, if you can
| help me out!
| neilv wrote:
| For example, Debian putting their foot down on closed
| drivers and (for a long time) downloadable device firmware
| blobs.
|
| I've also seen Debian very responsive when I pointed out
| that a particular package was phoning home before consent
| given.
|
| And one of the notably annoying parts of the Debian
| installer, forever, is when you think it's started a long
| unattended period of installing packages, but it soon
| pauses to ask you for opt-in to some package usage
| telemetry (so at least they're asking before doing it).
|
| I definitely get the understaffed vibe from Debian, but I'm
| also still pleasantly surprised how well they execute in
| general.
|
| Contrast with a certain commercial derivative -- which
| snoops, installs closed software without the user
| understanding that that's what it is doing, pushes an IMHO
| horrible different package manager, is sloppier about
| regressions in security updates, etc.
|
| I wish I had time to volunteer right now to scratch some of
| the itches I have with Debian, and very much appreciate all
| the work that others have done and are doing on it.
| teddyh wrote:
| Debian keeps track of all _remaining_ privacy issues in all
| packages (i.e. such issues which have not yet been
| corrected or patched by the Debian package maintainer):
|
| https://wiki.debian.org/PrivacyIssues
| CircleSpokes wrote:
| Anyone saying secure boot "brings absolutely nothing" clearly
| doesn't understand how secure boot works (or is just arguing in
| bad faith). Secure boot has issues (see the key revocation issue
| and vulnerable UEFI programs used by malware to install
| bootkits), but it does address a real security issue.
|
| People might not like who holds the commonly preinstalled keys
| (Microsoft and motherboard OEMs), but even then you can add your
| own keys and sign your own images if you want (there was just a
| post yesterday about doing this for Raspberry Pis).
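|
| For the do-it-yourself route, step one is just generating your
| own key and self-signed certificate; a sketch with the Python
| "cryptography" package (after this you'd still convert the cert
| to an EFI signature list, enroll it as PK/db, and sign images
| with something like sbsign -- those steps aren't shown here):
|
|     import datetime
|     from cryptography import x509
|     from cryptography.hazmat.primitives import hashes, serialization
|     from cryptography.hazmat.primitives.asymmetric import rsa
|     from cryptography.x509.oid import NameOID
|
|     key = rsa.generate_private_key(public_exponent=65537,
|                                    key_size=2048)
|     name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME,
|                                          "My Platform Key")])
|     now = datetime.datetime.utcnow()
|     cert = (x509.CertificateBuilder()
|             .subject_name(name)
|             .issuer_name(name)            # self-signed
|             .public_key(key.public_key())
|             .serial_number(x509.random_serial_number())
|             .not_valid_before(now)
|             .not_valid_after(now + datetime.timedelta(days=3650))
|             .sign(key, hashes.SHA256()))
|
|     with open("PK.key", "wb") as f:      # keep this one offline!
|         f.write(key.private_bytes(
|             serialization.Encoding.PEM,
|             serialization.PrivateFormat.TraditionalOpenSSL,
|             serialization.NoEncryption()))
|     with open("PK.crt", "wb") as f:
|         f.write(cert.public_bytes(serialization.Encoding.PEM))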
| realusername wrote:
| The Raspberry Pi example is an even worse implementation of
| secure boot, using an absurd write-only-once scheme for the
| keys.
|
| That's just creating more ewaste, nobody can ever use that
| device normally again and it cannot be resold.
| CircleSpokes wrote:
| I don't think it's absurd at all. It isn't required in
| any way (it's opt-in), it lets you use your own keys (no
| preinstalled Microsoft or other bigcorp keys), and it isn't
| possible for someone to modify what keys you installed.
|
| Of course, if you lose your keys you can't sign anything
| else and that would make it basically e-waste, but most
| things end up as waste when you take actions that are
| reckless and can't be reversed (which is what losing the
| keys would be). Plus, tech tends to end up as e-waste after
| less than a decade anyway. Like sure, you could still be
| using an AMD Steamroller CPU, but realistically after 10
| years you'd be better off using a cheaper, more power-
| efficient chip anyway.
| realusername wrote:
| > Plus, tech tends to end up as e-waste after less than a
| decade anyway. Like sure, you could still be using an AMD
| Steamroller CPU, but realistically after 10 years you'd be
| better off using a cheaper, more power-efficient chip
| anyway.
|
| I'm not sure what you are trying to argue, but people
| routinely buy used computers on marketplaces. Raspberry
| Pis with locked keys are essentially paperweights once
| the owner doesn't want to use them anymore.
|
| And realistically, the biggest e-waste generators are
| especially smartphones nowadays, which are too locked down
| to be reused well.
| tzs wrote:
| > I'm not sure what you are trying to argue, but people
| routinely buy used computers on marketplaces. Raspberry
| Pis with locked keys are essentially paperweights once
| the owner doesn't want to use them anymore.
|
| Why can't the owner who wants to sell their locked Pi
| give the buyer the key?
| unglaublich wrote:
| It's what the customer has come to expect.
| cptskippy wrote:
| Doesn't this enable them to be installed on systems with
| Secureboot enabled without having the user muck around in the
| BIOS? Smart for VMs?
| TacticalCoder wrote:
| I can see your point but, geez, that's pretty depressing if
| it's the only reason it's supported!
|
| As a sidenote, having installed Debian with SecureBoot on
| several systems, I'd say I still had to muck around quite a
| bit in the BIOS/UEFI. The latest one I scratched my head over
| for a bit was an AMD 3700X on an ASRock mobo where I somehow
| had to turn "CSM" (Compatibility Support Module) _off_, otherwise
| Debian would stubbornly start the non-UEFI (and hence no
| SecureBoot) installer. On my Asus / AMD 7700X things were a
| bit easier but I still had to toggle some SecureBoot setting
| (from "custom" to "default" or the contrary, don't remember).
| All this to say: it's still not totally streamlined and users
| still need to muck around anyway.
| Vogtinator wrote:
| There's another reason but it's even worse: Some
| certifications require that secure boot is enabled.
| roggrat wrote:
| Does this affect MSI motherboards with AMD sockets?
| fafzv wrote:
| Is this something that can be fixed with a software update or is
| it basically game over?
|
| I've got an MS-7E07 which is in theory not affected, but who
| knows...
| garbagecoder wrote:
| ITT: I have no idea how government works, but I can sound more
| cynical than you can when I make it up.
| libpcap wrote:
| Is the perpetrator nation-sponsored?
| transpute wrote:
| There is a better solution already designed into Intel Boot
| Guard, which avoids the problems of OEM "secrets" and allows an
| owner-defined firmware root of trust. As described in a 2017 MIT
| paper from Victor Costan, Ilia Lebedev and Srinivas Devadas,
| "Secure Processors Part I: Background, Taxonomy for Secure
| Enclaves and Intel SGX Architecture",
| https://www.nowpublishers.com/article/Details/EDA-051
|
| _> The TPM 's measurement can be subverted by an attacker who
| can re-flash the computer's firmware .. the attack .. can be
| defeated by having the initialization microcode hash the
| computer's firmware (specifically, the PEI code in UEFI) and
| communicate the hash to the TPM module. This is marketed as the
| Measured Boot feature of Intel's Boot Guard.
|
| > Sadly, most computer manufacturers use Verified Boot (also
| known as "secure boot") instead of Measured Boot (also known as
| "trusted boot"). Verified Boot means that the processor's
| microcode only boots into PEI firmware that contains a signature
| produced by a key burned into the processor's e-fuses. Verified
| Boot does not impact the measurements stored on the TPM, so it
| does not improve the security._
|
| On a related note, some HP devices have a dedicated security co-
| processor (SureStart) to verify and/or fix system firmware,
| instead of relying on a CPU vendor root-of-trust like Intel
| BootGuard. Since HP's proprietary security co-processor can be
| disabled by a device owner, those HP devices may be amenable to
| OSS firmware like coreboot.
|
| 2017 HP Labs research paper,
| https://ronny.chevalier.io/files/coprocessor-based-behavior-...
|
| _> We apply this approach to detect attacks targeting the System
| Management Mode (SMM), a highly privileged x86 execution mode
| executing firmware code at runtime .. We instrument two open-
| source firmware implementations: EDKII and coreboot. We evaluate
| the ability of our approach to detect state-of-the-art attacks
| and its runtime execution overhead by simulating an x86 system
| coupled with an ARM Cortex A5 co-processor._
|
| 2021 HP marketing whitepaper,
| https://www8.hp.com/h20195/v2/GetPDF.aspx/4AA7-6645ENW.pdf
|
| _> Every time the PC powers on, HP Sure Start automatically
| validates the integrity of the firmware to help ensure that the
| PC is safeguarded from malicious attacks. Once the PC is
| operational, runtime intrusion detection constantly monitors
| memory. In the case of an attack, the PC can self-heal using an
| isolated "golden copy" of the firmware in minutes._
| fatfingerd wrote:
| Measured boot is great in theory... but it is only really
| practical for determining that your BIOS hasn't changed at all.
| If you are going to trust updates, you are ultimately going to
| have to make the same mistake as verified boot, just manually.
| transpute wrote:
| The Measured Boot mode of Intel Boot Guard is about removing
| the need for an Intel/OEM private key and e-fuse to verify
| the initial code.
|
| For OS-specific measured boot of coreboot open-source
| firmware with a reproducible build, there would be a 1:1
| mapping between the measured firmware hash and the coreboot
| source code revision used to generate the firmware.
|
| Separately, the issue of firmware blobs (Intel FSP, AMD
| AGESA) would remain, although AMD OpenSIL is promising to
| reduce those binary blobs by 2026.
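|
| The measured side is conceptually just a hash chain: each stage
| hashes the next into a register before running it, so any change
| to early code shows up in the final value. A simplified sketch of
| the TPM-style "extend" operation (a real TPM keeps the register
| in hardware and logs each measurement in an event log):
|
|     import hashlib
|
|     def extend(pcr, component):
|         # PCR extend: new = H(old || H(component))
|         digest = hashlib.sha256(component).digest()
|         return hashlib.sha256(pcr + digest).digest()
|
|     pcr0 = bytes(32)  # registers start at all zeroes
|     for stage in (b"initial boot block", b"PEI firmware",
|                   b"bootloader"):
|         pcr0 = extend(pcr0, stage)
|
|     # Any change to any stage changes the final value, which a
|     # sealed-key policy or remote attestation can then detect.
|     print(pcr0.hex())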
| mjg59 wrote:
| The pervasiveness of secure boot has genuinely made things
| difficult for attackers - there'd have been no reason for the
| Black Lotus bootkit to jump through all the hoops it did if it
| weren't for secure boot, and the implementation of UEFI secure
| boot does make it possible to remediate in a way that wouldn't be
| the case without it.
|
| But secure boot at the OS level (in the PC world, at least) is
| basically guaranteed to give users the ability to enable or
| disable it, change the policy to something that uses their own
| keys, and ensure that the system runs the software they want.
| When applied to firmware, that's not the case - if Boot Guard (or
| AMD's equivalent, Platform Secure Boot) is enabled, you don't get
| to replace your firmware with code you control. There's still a
| threat here (we've seen firmware-level attacks for pre-Boot Guard
| systems), but the question is whether the security benefit is
| worth the loss of freedom. I wrote about this a while back
| (https://mjg59.dreamwidth.org/58424.html) but I lean towards
| thinking that in most cases the defaults are bad, and if users
| want to lock themselves into only using vendor firmware that's
| something that users should be able to opt into.
| aborsy wrote:
| why wasn't it in an HSM?
| ex3ndr wrote:
| Because you still would need a backup
| iamtedd wrote:
| Yes, a backup HSM.
| WhereIsTheTruth wrote:
| Something deep inside me is telling me that they are preparing
| "TPM 2.0" season 2.
|
| "To boot Windows 69, you are now required to have a CPU with the
| newest Microsoft chip, full of backdoors btw, but shhh" :p
| PrimeMcFly wrote:
| There is no reason to use a manufacturer key anyway, at least
| for SecureBoot.
|
| Obviously it isn't in everyone's skillset, but if you have the
| means there is nothing preventing you from generating and using
| your own key.
|
| Honestly it seems like a good basic security precaution, not only
| to protect against leaks like this, but also to counteract any
| backdoors (although that's kind of a moot point with chipmakers).
| AshamedCaptain wrote:
| If your firmware and its UEFI modules were originally signed
| with these leaked keys, what are you going to do? You can't
| just un-trust those.
|
| In many BIOSes where you can apparently remove all keys and the
| firmware still loads its own builtin components, that's because
| the manufacturer has put in a backdoor so that their own
| components can always be loaded regardless of Secure Boot
| state. (Otherwise users would be able to brick their own
| machines.) MSI, for example, does this. And guess what: with
| these leaked keys, now anyone can claim to be a "manufacturer's
| own component". Secure Boot is useless.
| Arnavion wrote:
| Secure Boot keys are unrelated to the leaked key in question.
| The Boot Guard key is used to verify the firmware itself that
| the CPU executes on boot. What the firmware happens to do
| afterwards, say it's a UEFI firmware that implements Secure
| Boot, is irrelevant to Boot Guard.
| PrimeMcFly wrote:
| Thank you for clarifying, realized that too late after
| commenting.
| dathinab wrote:
| Yesn't
|
| 1. some EFIs are broken in ways that make using private
| platform keys hard or impossible
|
| 2. there are PCIe cards which need option ROMs to be executed
| (most commonly dedicated GPUs); these ROMs are not always, but
| often, signed by one of the Microsoft keys, and removing it
| from the trust db will prevent the ROMs from running and lead
| to all kinds of problems, e.g. not having any video and in turn
| not being able to undo the EFI setting / disable secure boot. You
| can make sure the specific ROMs are whitelisted, but then you
| need to be very very careful about e.g. graphics drivers
| updating the GPU firmware and similar. And putting the right
| data in the trust db isn't easy either.
| PrimeMcFly wrote:
| Good points. I was subconsciously focusing on laptops when I
| made my comment.
| jonas-w wrote:
| Is there a way to know if it is safe to enroll my own keys? I
| always wanted to, but never did, because I often read that it
| can make the system unbootable.
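|
| One thing you can check up front from Linux is the firmware's
| current state via efivarfs; a small sketch (the GUID is the
| standard EFI global-variable GUID, and the first 4 bytes of each
| efivars file are attribute flags). SetupMode=1 means the
| firmware will accept a new PK without the old one's signature:
|
|     EFI_GLOBAL = "8be4df61-93ca-11d2-aa0d-00e098032b8c"
|
|     def efi_var(name):
|         path = f"/sys/firmware/efi/efivars/{name}-{EFI_GLOBAL}"
|         with open(path, "rb") as f:
|             return f.read()[4:]  # skip the 4-byte attribute header
|
|     secure_boot = efi_var("SecureBoot")[0]
|     setup_mode = efi_var("SetupMode")[0]
|     print(f"SecureBoot={secure_boot} SetupMode={setup_mode}")
|
| Whether a given board then handles custom keys gracefully is a
| separate question; this only tells you the enrollment state.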
| dathinab wrote:
| I don't understand how such keys can leak.
|
| Hasn't Intel heard about locking keys in hardware, e.g. with
| hardware security modules, similar to but faster and more
| flexible than a TPM? Surely one of the main developers of the
| TPM understands that concept... right? /s
| pmontra wrote:
| People know keys, and people eventually leak keys. It has
| always happened.
| mynameisvlad wrote:
| The point is that you _can't_ leak a key from an HSM.
| 29athrowaway wrote:
| Because security is a cost center and business people do not
| give a shit.
| noizejoy wrote:
| It's not just business people. Most humans seem to see
| security (and safety) as a burden, not as something
| enjoyable.
| XorNot wrote:
| If it's a master key you can't run the business risk of losing
| access to it.
| dathinab wrote:
| you don't have that risk
|
| there are more than just one or two ways to not have that
| risk and still have HSMs
|
| best of all, many of these solutions scale to pretty much any
| arbitrary company size
| CircleSpokes wrote:
| The keys are OEM specific. Intel gives them to MSI so they can
| sign their firmware/BIOS updates. Clearly MSI didn't handle
| them well.
| emmelaich wrote:
| So this is really an MSI key? Not Intel - even if Intel
| created it.
| CircleSpokes wrote:
| Intel gave it to MSI, but I may have been incorrect before.
| Apparently the keys were shared across multiple OEMs (at
| least that is how I read the quote below):
|
| >The leaked private keys affect Intel's 11th, 12th, and
| 13th generation processors and were distributed to various
| OEMs, including Intel itself, Lenovo, and Supermicro.
| josephcsible wrote:
| This isn't a blow to real security, just to DRM and treacherous
| computing. There's no legitimate security from "Secure" Boot.
| bawolff wrote:
| Evil maids?
| hlandau wrote:
| Can you explain how Intel Bootguard usefully guards against
| evil maids?
| Filligree wrote:
| How many of us have maids? How many of those maids are evil?
| ghostpepper wrote:
| "Evil maid" is a generic descriptor for any number of
| attacks that can be performed with physical access to a
| device.
|
| https://en.wikipedia.org/wiki/Evil_maid_attack
|
| "The name refers to the scenario where a maid could subvert
| a device left unattended in a hotel room - but the concept
| itself also applies to situations such as a device being
| intercepted while in transit, or taken away temporarily by
| airport or law enforcement personnel. "
| codedokode wrote:
| With physical access you can simply install a keylogger,
| GPS tracker, and maybe something worse (malicious PCI-
| Express or USB device for example).
| guilhas wrote:
| Still, how real of a threat is that for 99% of computer
| users?
|
| And law enforcement will have a device to bypass most
| devices' security.
| bawolff wrote:
| It's definitely on the high end of attacks and a bit
| unlikely, but I don't think it's exclusively nation states.
| It's well within the reach of thieves who want to steal your
| bank info or something.
| Avamander wrote:
| > And law enforcement will have a device to bypass most
| devices security
|
| What makes you say that and how is that an excuse to do
| nothing?
| stefan_ wrote:
| So the maid swaps your keyboard instead. There is no winning
| in this scenario, and it certainly isn't through "Secure
| Boot".
| happymellon wrote:
| I agree that this is the reason, but having Intel as the
| guard only makes it so that it could have already been
| hacked/leaked/bypassed and you never know.
|
| At least if it was user controlled we can ensure that other
| people's leaked keys don't bypass our security.
| charcircuit wrote:
| If it's user controlled what stops an attacker from
| bypassing it as the "user"? Most people just want to have a
| secure device and will not think about security, not want
| to do any work to secure their device.
| AshamedCaptain wrote:
| There was this recent article (here in HN) about these "evil
| public charging ports that can hack your smartphone" and how
| there is an entire ecosystem of devices to protect against
| them.... when in practice no one has heard about any one
| single example of such evil charging port, and that in
| practice carrying out such attack is so target-specific and
| leaves so many warnings signs that the entire thing sounds
| implausible to say the least.
|
| These evil maids are even more implausible than that. Has to
| be ridiculously targeted. If you are really targeted by such
| a powerful state-like entity, wouldn't it make much more
| sense for them to just send an NSA letter to Intel (or
| whatever the weakest link in your chain is, and there are
| plenty of extremely weak chains here, like the BIOS
| manufacturer) and/or backdoor the hell out of it?
|
| Secure Boot was never about security for normal users nor
| security for the majority of us. This is like
| https://xkcd.com/1200/ all over again. At the point the
| attacker can write arbitrary bytes to your hard disk, its way
| past the point where the majority of users care.
| its-summertime wrote:
| EM isn't necessarily a targeted attack: almost everyone is
| running x86_64.
|
| it'd just be a matter of replacing a binary with an iffy'd
| version that runs before any decryption happens, e.g.
| replacing plymouth.
|
| This isn't hard to do in the slightest? I think even you or
| I could do it.
|
| But with secureboot, replacing a binary in the loading
| chain isn't an option.
|
| I don't think I could convince intel to install a bug for
| me.
|
| https://blog.invisiblethings.org/2011/09/07/anti-evil-
| maid.h... is a good descriptor of how it all comes together
| AshamedCaptain wrote:
| All smartphones use ARM and USB and Android, and _even
| then_ the evil USB charging port is targeted -- you still
| have to tailor it to the target's screen ratio, Android
| version, Android UI/skin, even launcher if they have one,
| etc.
|
| > it'd just be a matter of replacing a binary with a
| iffy'd version that runs before any decryption happens,
| e.g. replacing plymouth.
|
| You'd at least need to imitate the UI your target is
| using for unlocking the disk (e.g. plymouth theme). Then,
| after the user types something, either virtualize the
| rest of the boot process (which is already extremely
| implausible), or otherwise reboot in a way that does not
| immediately cause the user to be suspicious. All of this
| is as targeted as it gets. A generic version would get as
| far as your average phishing email.
|
| But... how do you plan to replace my bootloader in the
| first place? You'd need root access for that. At that
| point, it is already game over for the target! Why would
| you need to tamper with the bootloader at that point?
|
| Or are you thinking about breaking into my house and doing
| that to my offline computers? How is that not a
| "targeted attack"?
| its-summertime wrote:
| adding `store password somewhere` doesn't get in the way
| of plymouth's theming (which is separate), it doesn't
| change the rest of the boot process, etc etc etc etc etc;
| it's taking an open source project, adding some lines to
| it, compiling, and swapping a binary out. Why would it
| need to do any of this other stuff?
|
| > You'd need root access for that. At that point, it is
| already game over for the target! Why would you need to
| tamper with the bootloader at that point?
|
| Yes that is the crux of the Evil Maid attack, a drive-by
| install of software. e.g. at a coffeeshop while one is on
| the toilet, at an office, at a hotel by an evil maid, etc
| etc. AEM is about detecting changes in trust: if the
| loading sequence is changed, then the verifier (another
| device like a usb dongle) can't verify (since the TPM can
| no longer unlock the prior secret due to the chain
| changing).
|
| You might want to look into the article I linked in my
| earlier comment to get the full idea of what is meant by
| evil maid
| AshamedCaptain wrote:
| > Yes that is the crux of the Evil Maid attack, a drive-
| by install of software. e.g. at a coffeeshop while one is
| on the toilet, at an office, at a hotel by an evil maid,
| etc etc.
|
| If the laptop was left online and unlocked: What do you
| expect to gain by installing a patched plymouth versus
| installing a traditional remote control software and/or
| keylogger ? You don't even need root for the latter!
|
| If the laptop was left locked: do you plan to open the
| laptop, remove the disk, transfer some files to it
| (matching the same distro & version of all components
| your target was using, otherwise the entire thing may
| just crash or look different and reveal the attack), hope
| the target doesn't notice his laptop was literally taken
| apart (most laptops just can't be opened at all, for the
| ones which can, even mine has a simple open-circuit
| tamper detector...), then come back in the future _and do
| the same_ again to recover the captured password? And how
| is this not a ridiculously targeted attack?
|
| Besides, at that point, you could simply install a
| wiretap on the keyboard, an attack which, unlike the evil
| maid crap, I have seen _millions_ of times in the wild
| (e.g. at public PIN pads, card readers at gas stations,
| etc.).
| sigmoid10 wrote:
| It's not just about evil maids and physical access. Even if
| you did get root-level RCE, you still did not have access to
| screw with hardware security. With the UEFI keys, you
| suddenly have a whole new level of persistence, meaning
| that if you ever get pwned, you can basically throw your
| hardware in the trash, because even a system-level wipe
| will not be a guaranteed way to clean out the malware.
| AshamedCaptain wrote:
| If your attacker has root, and your system allows
| flashing the BIOS from root (many do), he can simply
| disable Secure Boot, or enroll one extra signature --
| his. If the system doesn't allow flashing a BIOS even if
| an attacker has root access, then Secure Boot makes no
| difference whatsoever.
| sigmoid10 wrote:
| [dead]
| sockaddr wrote:
| > treacherous computing
|
| I like it, going to hang on to this one
| josephcsible wrote:
| Credit goes to Richard Stallman:
| https://www.gnu.org/philosophy/can-you-trust.en.html
___________________________________________________________________
(page generated 2023-05-06 23:00 UTC)