[HN Gopher] iMessage, Apple Music used by NSO Pegasus to attack ...
       ___________________________________________________________________
        
       iMessage, Apple Music used by NSO Pegasus to attack journalist
       iPhones
        
       Author : esens
       Score  : 287 points
       Date   : 2021-07-19 14:28 UTC (8 hours ago)
        
 (HTM) web link (appleinsider.com)
 (TXT) w3m dump (appleinsider.com)
        
       | TravisHusky wrote:
       | I have never heard of MVT (Mobile Verification Toolkit) before
       | this article, but now I may just have to test it out; seems like
       | an interesting project.
        
       | comodore_ wrote:
        | There are apparently 50k names on that list; last I checked,
        | they had confirmed ~180 journalists among them. Spying on
        | journalists is atrocious, but who are the other 49,800?
        
         | Ar-Curunir wrote:
         | Opposition Politicians, activists, critics, etc
        
         | Miner49er wrote:
         | They're releasing more names including, "lawyers, human rights
         | defenders, religious figures, academics, businesspeople,
         | diplomats, senior government officials and heads of state"
         | 
         | From: https://www.theguardian.com/news/2021/jul/18/huge-data-
         | leak-...
        
       | wolverine876 wrote:
       | > However, it is unlikely that Pegasus will be a problem for the
       | vast majority of iPhone users. While the tool is used as intended
       | against criminals by governments, the attacks against innocent
       | people are seemingly against those who could be critics to a
       | regime, including journalists and human rights activists.
       | 
        | Attacks on the freedom of others and on critics of government
        | are a much larger threat to ordinary people than being
        | surveilled themselves would be.
        
         | Ar-Curunir wrote:
         | One isn't necessarily a larger threat than the other; both are
         | horrible
        
         | twobitshifter wrote:
          | These are governments that will happily track and monitor
          | everyone; maybe they won't use Pegasus, but they could use
          | the same vulnerabilities.
        
       | j45 wrote:
       | I wonder if there is a way to disable iMessage and iTunes usage.
       | 
        | With Windows Server, I used to aim for balance in the attack
        | footprint: if Microsoft provided the OS, the component
        | services the server existed to provide (db, web server, etc.)
        | should be third-party software, to minimize one type of
        | escalation vulnerability... while possibly opening up another,
        | hopefully less severe, set of holes.
        
         | sneak wrote:
         | You can use a NextDNS configuration profile at
         | https://apple.nextdns.io and a NextDNS account to block the
         | device communicating with many Apple services.
         | 
         | A good way to disable iMessage and iTunes, though, is to simply
         | not have an Apple ID. (This prevents the install of
         | applications via the App Store, however.) You can of course set
         | up the device with no Apple ID and then _only_ add the Apple ID
         | to the App Store (and not iTunes or iMessage /FaceTime/iCloud).
         | This is what I do.
        
         | rsync wrote:
         | You can block or restrict these with the (free) tool apple
         | publishes called "apple configurator".
        
       | hugh-avherald wrote:
       | An intelligence agency cannot have the following properties
       | simultaneously:
       | 
        | (1) The ability to detect espionage from China and Russia
        | 
        | (2) The inability to access journalists' phones
       | 
       | If you want an intel agency to be able to thwart Chinese
       | intelligence activities, you can't also publicly state you won't
       | be looking closely into members of a profession who act a lot
       | like spies.
        
         | criley2 wrote:
         | We understand that the intelligence agencies can and do monitor
         | a number of people associated with hostile foreign governments.
         | For example, this is believed to be how "Tucker Carlson got
         | surveilled by the CIA" -- he is believed to have contacted a
         | surveilled Russian agent to discuss interviews with the Russian
         | president.
         | 
         | This is called "incidental collection" and it's a touchy
         | subject for sure.
         | 
         | But this subject is different than the DoJ directly surveilling
         | journalists who leak, which is a problem, and governments
         | surveilling their own citizens directly, not incidentally.
         | 
          | We can and should hold our government(s) to a standard of
          | effectively fire-walling acceptable intelligence gathering,
          | and hold them accountable when they go beyond it to surveil
          | citizens directly, or indirectly through spying agreements.
         | 
         | We can make sure that the people who surveil Chinese or Russian
         | "diplomats" are totally different than the people who execute
         | search warrants against our citizens, and expect there to be
         | zero crossover there.
        
           | cronix wrote:
           | > For example, this is believed to be how "Tucker Carlson got
           | surveilled by the CIA" -- he is believed to have contacted a
           | surveilled Russian agent to discuss interviews with the
           | Russian president.
           | 
           | Yes, that happens all of the time but one difference here
           | with Tucker is he was deliberately "unmasked." Normally when
           | an American is caught up in foreign surveillance, their
           | identity is blocked out or masked, "incidental collection" as
           | you said. Someone purposefully unmasked it. And someone
           | purposefully leaked it. The same thing was done to General
           | Flynn.
           | 
           | https://en.wikipedia.org/wiki/Unmasking_by_U.S._intelligence.
           | ..
        
         | wolverine876 wrote:
         | In fact, in (most/many) advanced democratic countries
         | intelligence agencies can and do exactly that.
        
       | coldcode wrote:
       | Just because it was only used to target journalists, supposedly,
       | does not mean someone could not also target random individuals. I
       | doubt NSO has such control over their customers that the uses
       | can't be expanded to almost anything, like blackmail, theft and
       | harassment.
        
       | 3pt14159 wrote:
       | I dated a journalist once. She used some random free app for
       | phone calls because recording calls isn't built into iOS and she
       | needed to record calls. I suggested a small device for her to
       | plug her headphones through, but she declined.
       | 
       | I'm sure there's a few journalists out there that take
       | cybersecurity seriously, but I'd wager the vast majority are
       | pretty trivially monitored.
        
         | jdavis703 wrote:
          | I did help desk support at a news agency. We were constantly
          | cleaning up malware from journalists' computers... The
          | journalists were constantly downloading all sorts of sketchy
          | files as part of their job. Basically, if you're leaking
          | state secrets / embarrassing repressive governments, don't
          | leave a digital trail that can be traced back to you. Just
          | assume everyone (especially journalists on national security
          | or human rights beats) has been hacked.
        
           | pratheekrebala wrote:
            | Yes! In our newsroom (which isn't perfect by any means), I
            | have been testing Qubes for really sensitive/untrusted
            | documents. We also open untrusted documents (from e.g.
            | FOIA responses) on a machine live-booted from a CD.
           | 
           | However, it adds enough friction (especially with remote
           | work) that it's hard to get it right 100% of the time.
           | 
            | If you want to share really sensitive documents, one way
            | to ensure proper handling is to use a service like
            | SecureDrop [0], which, for example, only accepts
            | submissions over Tor and requires the use of a secure
            | viewing station [1] (an air-gapped machine live-booting
            | Tails with a coreboot ROM, and the webcam/networking card
            | physically removed) to decrypt and access leaks.
           | 
           | That being said, I don't think there's a perfect tech-only
           | solution because nothing is stopping folks handling it
           | carelessly after they access the file.
           | 
           | [0] https://securedrop.org/directory/center-public-integrity/
           | 
           | [1] https://docs.securedrop.org/en/stable/set_up_svs.html
        
             | ip_addr wrote:
             | You could also use Dangerzone [0]. It opens a document in
             | two docker containers and converts it into a safe version.
             | It was created by the director of infosec at The Intercept.
             | 
             | [0] https://dangerzone.rocks/
        
         | heavyset_go wrote:
          | > _I dated a journalist once. She used some random free app
          | for phone calls because recording calls isn't built into
          | iOS and she needed to record calls. I suggested a small
          | device for her to plug her headphones through, but she
          | declined._
         | 
         | Sounds like she dodged a potential honeypot and surveillance
         | attempt.
        
         | swiley wrote:
          | Apple really doesn't help them. The marketing (lying) that
          | iOS is secure is pretty intense.
        
           | andrewzah wrote:
           | Perfectly secure computers are an oxymoron. They don't exist.
           | 
           | iOS is the least worst mobile option and it's ridiculous to
           | say Apple is lying about security if any exploits are found,
           | ever.
           | 
           | If you look at e.g. how messaging works in iOS 14 [0] you'll
           | see that they do in fact work on making secure systems. But
           | parsing and memory safety are hard. Like, really hard. The
            | fact that NSO found exploits doesn't mean Apple isn't
            | doing anything; Apple is clearly making it more and more
            | difficult to find and abuse such exploits.
           | 
           | For the average person that isn't being specifically targeted
           | by sophisticated malware from companies funded by
           | -governments-, iOS is pretty damn secure. Dealing with being
           | attacked is a different threat model.
           | 
           | [0]: https://googleprojectzero.blogspot.com/2021/01/a-look-
           | at-ime...
        
             | swiley wrote:
             | >Perfectly secure computers are an oxymoron. They don't
             | exist.
             | 
              | Absolutely, but creating a platform that encourages or
              | forces users to do the wrong thing is a _regression_
              | from where we were ten years ago.
             | 
             | >iOS is the least worst mobile option
             | 
             | No. Devices running a FOSS operating system like the
             | Pinephone are the least worst mobile option, people don't
             | like it because it's not sexy and it's currently very
             | inconvenient. The rest of the options are so bad that
             | you're probably better off without a mobile phone _at all._
             | 
             | RE: iMessage
             | 
             | You have everyone using exactly the same messaging client,
             | so you have one piece of software to exploit and now you
             | can attack everyone. The extreme lack of diversity makes
             | these sorts of complex exploits much more profitable.
             | 
             | >iOS is pretty damn secure
             | 
              | Sure, if you don't do anything with it. But it
              | encourages users to download unauditable closed apps and
              | reassures them that doing so is totally safe, despite
              | the fact that most of them are using 3rd-party telemetry
              | services run by data brokers.
        
               | nemothekid wrote:
                | > _No. Devices running a FOSS operating system like
                | the Pinephone are the least worst mobile option;
                | people don't like it because it's not sexy and it's
                | currently very inconvenient_
               | 
                | Just because it's FOSS doesn't mean it's secure. If
                | your problem is _privacy_ then sure, the PinePhone is
                | the least worst mobile option. If your problem is
                | _security_, I don't see how a phone without a
                | hardware-embedded key manager is a step up. It's not
                | like the Linux kernel, or whatever messenger you
                | decide to use, is free from zero-days either.
               | 
                | > _But it encourages users to download unauditable
                | closed apps and reassures them that doing so is
                | totally safe despite the fact that most of them are
                | using 3rd party telemetry services run by data
                | brokers._
               | 
                | By the same reasoning, a bicycle is safer than a car
                | because it doesn't encourage you to drive 75 mph. I
                | agree the world might be a lot better if we "return
                | to monkey", but I don't think anarcho-primitivism is
                | a solution.
        
               | swiley wrote:
               | >Just because it's FOSS doesn't mean it's secure.
               | 
               | Right, but it does mean you won't be forced to do things
               | the wrong way because it makes Apple money.
               | 
               | >hardware embedded key manager
               | 
                | This means keeping copies of keys unencrypted (or
                | encrypted with a key on the same device, which is
                | effectively the same) on the device. You're just a
                | couple of exploits away from leaking the keys at that
                | point, so many people argue that these make things
                | worse, not better.
               | 
               | >It's not like the Linux Kernel, and whatever messenger
               | you do decide to use is free from zero-days either.
               | 
               | Sure but you can't even guess at which messenger I use.
               | Attacking me means taking expensive professional time and
               | focusing it on one person. As for zero days in the
               | kernel, they seem to appear less often than for iOS but I
               | could be missing some.
               | 
               | >anarcho-primitivism
               | 
               | There's nothing more primitive than flinging binary
               | artifacts around the way you do on closed OSes. The FOSS
               | OS approach where knowledgeable people protect those who
               | aren't knowledgeable (without restricting their rights)
               | is a significantly more advanced social structure.
        
               | nemothekid wrote:
                | > _Right, but it does mean you won't be forced to do
                | things the wrong way because it makes Apple money._
               | 
                | I don't understand this point. What's wrong with
                | downloading binaries from a trusted distributor
                | (Apple)? If you agree that just because it's FOSS
                | doesn't mean it's secure, then downloading binaries is
                | as "right" as you are going to get when it comes to
                | mobile app distribution. It's no different than
                | downloading binaries from apt.
               | 
               | > _This means keeping copies of keys unencrypted (or
               | encrypted with a key on the same device which is
               | effectively the same) on the device._
               | 
                | No. The whole point of the Secure Enclave is that the
                | keys never leave the hardware: they never touch main
                | memory and can never be read out of the chip. You are
                | never "a few exploits away" from getting the keys,
                | because there is no mechanism to read them out at all.
                | This also prevents attacks on the device itself: you
                | cannot brute-force an iPhone without the Secure
                | Enclave locking you out. I'm not certain (and I really
                | doubt) the PinePhone is resistant to physical attacks.
               | 
                | > _Sure but you can't even guess at which messenger I
                | use. Attacking me means taking expensive professional
                | time and focusing it on one person._
               | 
               | The article is about journalists who were targeted by a
               | state sponsored cyber security firm. This is a moot
               | point, not to mention security by obscurity doesn't work.
               | 
                | > _The FOSS OS approach where knowledgeable people
                | protect those who aren't knowledgeable (without
                | restricting their rights) is a significantly more
                | advanced social structure._
               | 
                | Except that, in practice, this is no different from
                | (and arguably worse than) just trusting Apple. It
                | turns out knowledgeable people do not work for free,
                | most other knowledgeable people don't read the code or
                | recompile sources, and FOSS maintainers aren't always
                | properly equipped to ship secure software. Heartbleed
                | is the poster child for this.
               | 
                | I'm not saying that it's impossible for there to be
                | secure FOSS code, but that it's incredibly difficult
                | to ship secure code _at all_, in any situation. For
                | the non-technical person it's far easier to trust a
                | platform that is hardened from the outset (like the
                | iPhone), has a well-funded security team (like Apple),
                | and is recommended by other security professionals.
        
               | [deleted]
        
               | ComodoHacker wrote:
               | >so you have one piece of software to exploit and now you
               | can attack everyone. The extreme lack of diversity makes
               | these sorts of complex exploits much more profitable.
               | 
               | The flip side is the lack of diversity makes patching
               | easy. Good luck pushing an update patching a 0-day
               | affecting 3-4 Android versions to 60% of devices.
        
               | swiley wrote:
               | That's why you should be using dynamic linking, something
               | these closed mobile OSes effectively prohibit.
        
               | tablespoon wrote:
               | > No. Devices running a FOSS operating system like the
               | Pinephone are the least worst mobile option, people don't
               | like it because it's not sexy and it's currently very
               | inconvenient. The rest of the options are so bad that
               | you're probably better off without a mobile phone at all.
               | 
                | There's nothing about FOSS that makes something
                | secure, and building secure software is so hard and
                | expensive that my guess is you need the sponsorship of
                | a government or major corporation to do so. Some FOSS
                | does have such sponsorships, but a lot doesn't.
               | 
                | IIRC I've even heard that OpenBSD, despite its
                | reputation, may no longer be more secure than Linux,
                | due to Linux's manpower advantage. I don't even have
                | to look up the numbers: Apple definitely has a major
                | security manpower advantage over the people making the
                | Pinephone.
               | 
                | That's not to put down the Pinephone, but we have to
                | be reasonable about what a project like that is and
                | what it can (and cannot) achieve.
        
               | ascagnel_ wrote:
                | > There's nothing about FOSS that makes something
                | secure, and building secure software is so hard and
                | expensive that my guess is you need the sponsorship of
                | a government or major corporation to do so. Some FOSS
                | does have such sponsorships, but a lot doesn't
               | 
               | The F/OSS community has a weird collective amnesia about
               | exploits that rubs me the wrong way -- just because
               | someone can look at it doesn't mean that someone is
               | looking at it, or even that the person looking at it is
               | going to fix it instead of exploit it. Heartbleed was
               | sitting out in the open for 2+ years, despite OpenSSL
               | being a very popular package available under a permissive
               | license.
        
               | tablespoon wrote:
               | > The F/OSS community has a weird collective amnesia
               | about exploits that rubs me the wrong way...
               | 
               | If you repeat something frequently enough, a lot of
               | people will regard it as true. And a lot of people are
               | extremely reluctant to reevaluate their judgements after
               | they've made them, even in light of new information.
               | 
               | IIRC, the "FOSS is more secure" refrain started in the
               | 90s/00s, when security was an afterthought even at
               | companies like Microsoft and Apple and Linux was unusual
               | enough to fly under the radar when there were a lot of
               | big, high-profile worms circulating. But since then
               | _some_ closed-source commercial software has gotten much
               | more secure, and FOSS has gotten more popular, but
               | remains plagued by important projects that get by on
               | shoestring resources.
        
             | deregulateMed wrote:
             | >iOS is the least worst mobile option and it's ridiculous
             | to say Apple is lying about security if any exploits are
             | found, ever.
             | 
             | Speaking of companies lying... You are holding your phone
             | wrong, and your keyboard works fine.
             | 
             | Oh and your apps might have a backdoor, but it took getting
             | sued by Epic for us to let anyone know that.
             | 
              | Apple lying is about as common as a politician lying.
        
             | heavyset_go wrote:
             | iOS exploits are cheaper than Android exploits because iOS
             | exploits are so plentiful[1][2].
             | 
             | [1]
             | https://www.theregister.com/2020/05/14/zerodium_ios_flaws/
             | 
             | [2] http://zerodium.com/program.html
        
             | ekidd wrote:
             | > _But parsing and memory safety are hard. Like, really
             | hard._
             | 
              | This doesn't have to be the case. Start by avoiding C
              | and C++. Use Java (on Android) to write parsers. It is
              | very hard to take a buggy parser written in Java and
              | escalate it into a memory-corruption attack.
             | 
              | If you _really_ can't use a language like Java, write
              | your parser in safe Rust using slices over Vec<u8>.
              | Then run a fuzzer over it. You'll find a few runtime
              | panics, but you're vanishingly unlikely to encounter
              | memory corruption.
             | 
             | Buffer overflows and memory corruption can be almost
             | entirely avoided these days, at a price.
        
           | j45 wrote:
           | iOS is currently the least worst mobile solution as a daily
           | driver for the majority of people who are users before
           | techies.
           | 
           | It doesn't mean it's good enough but I'd be curious to hear
           | your ideas for what could work as easily for the masses.
        
             | dr-detroit wrote:
             | I would air gap the masses if were really trying to address
             | the root cause of problems
        
           | cunthorpe wrote:
            | To be fair, it's probably the most secure environment for
            | the average Joe; you're just saying that it's not
            | _perfectly secure,_ which would be impossible in this
            | world.
        
             | swiley wrote:
              | You could do far better than iOS. Worse, though, is
              | that it encourages very poor infosec when it's
              | profitable for Apple, and often makes doing things
              | correctly difficult or impossible.
        
               | dr-detroit wrote:
               | Well, I mean there's always systemd. What are we talkin
               | here, Gentoo?
        
               | LucidLynx wrote:
               | I suppose you have examples to propose?
        
               | swiley wrote:
                | It makes checking the hygiene of the apps you use
                | impossible, makes building them from source
                | artificially difficult and expensive, and pushes
                | users towards services with serious flaws, like
                | iCloud backup.
        
               | jonfw wrote:
               | > average joe
               | 
               | > building them from source
               | 
               | An average joe doesn't even know what 'build from source'
               | means
        
               | ajari wrote:
               | We could have taught people such things, but there's no
               | profit in that. We want to maximize the number of people
               | using our devices and our software, so that we get
               | richer, even if it means putting some fraction of these
               | users in grave danger. That's simply negligence. That
               | it's distributed across an entire industry doesn't change
               | the ethics. Selling people tools that put them at risk is
                | much different from sharing FOSS.
        
               | cunthorpe wrote:
               | Some things you say make sense, but suggesting that
               | "people" can/should learn how to build from source is
               | simply nonsense. Heck if I had to build my OS I'd stick
               | to a feature phone instead.
        
           | nemothekid wrote:
           | > _the marketing (lying) that iOS is secure is pretty
           | intense._
           | 
           | I don't see how it's lying. If you are going to consider that
           | iOS is not secure because they got owned by a couple 0 days,
           | then by that definition there isn't a secure piece of
           | software on the planet.
        
             | slivanes wrote:
              | iOS exploits pay less than Android ones -
              | http://zerodium.com/program.html.
              | 
              | Based on supply and demand, it would appear that iOS is
              | less secure, right?
        
               | LucidLynx wrote:
               | True, and many many good researchers boast the Android
               | security compared to iOS, thanks to Google and Samsung
               | (mostly) since many years now
               | (https://onezero.medium.com/is-android-getting-safer-
               | than-ios...).
               | 
               |  _But_ as a *platform* I am intimately convinced that iOS
               | is far more secure than Android... I agree that a few
               | apps have been authorised by Apple to be published on the
               | App Store, but when it happens to the Play Store it is
               | not only one or two apps... it is mostly 5 to 10 apps
               | developed by the same developer and which contain *the
               | same* flaws.
               | 
               | Also, as demonstrated AdGuard a few years now
               | (https://adguard.com/en/blog/popular-android-apps-are-
               | stealin...), it is way easier to extract user
               | informations from random apps on Android than iOS.
               | However the Android API has been improved since two years
               | now (and Android 12 is better than ever to secure user
               | informations).
        
         | xenocratus wrote:
         | But it also depends on what kind of journalism they're doing,
         | right? Not all report on criminal activity, or on investigating
         | the government. It's kinda like threat-models, no need to be
         | super secure if your work brings no risks to you, your
         | organisation, or those you come in contact with.
        
           | jdavis703 wrote:
            | Journalists from celebrity gossip reporters to foreign
            | affairs correspondents need to take security seriously.
            | Even gossip journalists receive information from sources
            | that ranges from material that would get the source fired
            | or blacklisted to material that would put them in jail
            | (e.g. LA sheriffs leaking celebrity photos).
        
             | certnlyuncertn wrote:
             | How likely is it that people are exploiting zero days
             | against reporters in any of those examples though. That's
             | why threat models are different for different types of
             | journalism.
        
               | reader_mode wrote:
                | Does it take a zero-day when you install random
                | freeware crapware from the store?
        
           | c7DJTLrn wrote:
           | Agreed. The parent comment makes a ridiculous extrapolation.
        
           | wolverine876 wrote:
           | Anything that will change a stock price is enormously
           | valuable, for example.
        
           | Hamuko wrote:
           | You can get to the criminal activity or government
           | investigation journalists through the more "trivial"
           | journalists if they work in the same company.
        
           | newman8r wrote:
           | That's a fair point, although bad actors will also wait
           | around for years for your work to become more
           | interesting/relevant, if they think there's a chance of it.
        
         | pratheekrebala wrote:
         | I see your point, however, having worked in newsrooms - it
         | really is about their beat and their threat-model. My
         | organization covers a wide range of beats and folks covering
         | national security or other sensitive topics have an entirely
         | different workflow compared to those covering, e.g. housing.
         | 
          | I think being responsive to their needs and building trust
          | will go much further. Also, designing a one-size-fits-all
          | model will just mean that your reporters will either ignore
          | the guidance or find a way to work around it.
         | 
         | For instance, the most recent credible threat we have had
         | against one of our reporters wasn't a state-level actor, but
         | rather folks on the internet (trivially) finding their address
         | and doxing/harassing them and their family. No amount of
         | technology hygiene will change the fact that voter
         | registrations are public records.
        
           | 3pt14159 wrote:
           | Well, she wrote about scary stuff. Murderers, etc. Feature
           | stories for one of the few fact-checked Canadian magazines
           | left. Some stuff in The Atlantic about politics.
           | 
           | Was she getting leaks from NSA staffers? No. But it does feel
           | kinda silly to me that journalists, generally speaking, have
           | insecure setups by default. But I get it, it's a hard
           | industry to squeeze a living out of these days.
        
           | wolverine876 wrote:
           | If someone gets access to the housing reporter's systems,
           | that seems a great way to move horizontally or vertically to
           | get access to the other reporter or to the entire
           | organization.
           | 
           | I don't envy your challenge. Security must make it more
           | expensive to the attacker than it's worth. Even the housing
           | reporter's data could be highly valuable; with inside
           | knowledge, someone could make a killing on real estate. The
           | value of the national security beat information is
           | astronomical.
           | 
           | I don't grasp why, with all the news about breaches,
           | reporters still don't care.
        
       | skarz wrote:
       | I like the end of the article, where he says "it's concerning,
       | but unless you happen to be a major critic of a government, you
       | probably won't be a target of the spyware tool."
       | 
       | Yeah, okay.
        
       | sneak wrote:
       | This, coupled with the fact that iMessage's E2EE has been
       | backdoored by the non-E2EE iCloud Backup key escrow, is a good
       | argument for leaving iMessage, FaceTime, and iCloud all turned
       | off on a device.
       | 
       | I go one step further and leave the SIM card out, which means the
       | SMS vulnerability path is closed too.
        
         | gjsman-1000 wrote:
         | But then you are using SMS, which your cell carrier can
         | absolutely see and intercept because it's unencrypted.
         | 
         | So in either case... turn off native messaging and use Signal
         | or something if you are paranoid. You aren't really using the
         | "phone" part anymore, so buy an iPod touch or something.
         | 
         | Also, iMessage is fully E2E if you disable iCloud Backup,
         | which you can easily do in Settings.
        
           | sneak wrote:
           | Please stop using the term "paranoid" to describe those who
           | desire personal privacy.
        
             | gjsman-1000 wrote:
             | There is a degree to where you are actually paranoid
             | though, otherwise we wouldn't have that word.
             | 
             | If you are this paranoid, you shouldn't be carrying an
             | electronic device.
        
               | Closi wrote:
                | Paranoia is an _irrational_ suspicion that you are being
                | watched.
                | 
                | If you just don't want to be watched, either by people or
                | algorithms, and have a rational understanding of what
                | tracking/surveillance you are under, then you are
                | actually not paranoid.
        
               | caymanjim wrote:
                | It's not paranoia when it's true. While most people value
                | the convenience of conventional phone calls and default
                | messaging applications over true privacy, those who
                | prefer privacy aren't being paranoid. Companies are
                | monitoring communication to increase ad revenue;
                | governments are monitoring communication to catch
                | criminals, enable industrial espionage, and suppress
               | dissent. It's only paranoia if it's delusional. We know
               | that we're being spied on, even if we're not being
               | individually targeted. Even democracies that supposedly
               | value freedom engage in widespread surveillance in direct
               | violation of their own laws.
               | 
               | I'm in the camp of pragmatic resistance to surveillance.
               | I use browser plugins to block ads and cookies where it
               | doesn't get in the way of reaching the content I want; I
               | use Signal for messaging even though almost none of my
               | recipients do; I disable location services except for
               | things like Maps that actually need to know where I am; I
               | turn off all the spyware I know about that's built into
               | operating systems; etc. I'm not a tin-foil-hat-wearer;
               | I'm not doing anything illegal that I need to hide; I'm
               | just trying to push back in a small way against the
               | erosion of privacy and rights that permeates everything
               | electronic.
               | 
               | But the parent isn't paranoid. They really are watching.
               | And we shouldn't be so complacent.
        
               | sneak wrote:
               | > _I disable location services except for things like
               | Maps that actually need to know where I am_
               | 
               | Fun fact: having systemwide location services on, even if
               | you don't enable it for any apps, means that your
               | location is sent in realtime to Apple/Google at all times
               | (via Wi-Fi triangulation data). It's not just passive GPS
               | reception.
               | 
               | If you want actual location privacy, you'll want to leave
               | location services off systemwide on your smartphone, and
               | consider getting an offline GPS receiver device. Good car
               | satnav devices from China are like $60 now, and include
               | continent-wide maps, though you lose realtime traffic
               | info, being offline.
        
               | commoner wrote:
               | There is a way around this. If you use an Android
               | distribution with UnifiedNlp (part of microG) and without
               | Google Play Services, you can install only the location
               | providers that you want to use for Wi-Fi and cell tower
               | triangulation. Google would not be monitoring your
               | location queries. Provider options include:
               | 
               | - OpenCellID (offline):
               | https://f-droid.org/en/packages/org.gfd.gsmlocation/
               | 
               | - Radiocells.org (optionally offline):
               | https://f-droid.org/en/packages/org.openbmap.unifiedNlp/
               | 
                | - Deja Vu (offline cache using Wi-Fi and cellular data):
                | https://f-droid.org/en/packages/org.fitchfamily.android.deja...
               | 
                | - Mozilla Location Services (online):
                | https://f-droid.org/en/packages/org.microg.nlp.backend.ichna...
               | 
               | UnifiedNlp is preinstalled on Android distributions that
               | include microG. CalyxOS is the only one of these that
               | supports relocking the bootloader with the developers'
               | key:
               | 
               | https://calyxos.org
        
               | caymanjim wrote:
               | This falls under "close enough" for me. Even with
               | systemwide location off, cell providers and your ISP
               | still know where you are; there's simply no way to stop
               | them from knowing. If I fire up an app and Android gives
               | me a popup saying it won't work with location off, then
               | at least I know which apps are asking for it, and can
               | enable the very few that I want to share that with
               | because I get something out of it (like navigation).
        
               | gjsman-1000 wrote:
               | Yes, but if you are that paranoid and worried about it,
               | the fact remains you should not carry an electronic
               | device.
               | 
               | This person is so paranoid, that they believe that a
               | cyberweapon developed by a private company in Israel that
               | uses previously-unknown bugs in the most sandboxed
               | messaging system you can get on a phone are going to be
               | deployed against them, so they should not use the
               | calling, texting, or any other "phone-like"
               | functionalities of a phone.
               | 
                | They then distrust that the End-to-End Encryption is in
                | fact End-to-End, and then think that using Signal or
                | something is more secure, when if a bug was found in a
                | system more sandboxed than Signal (iMessage, which has
                | BlastDoor, which Signal does not have), it is more than
                | likely that Signal has its own zero-days too, so you
                | shouldn't be using that either.
               | 
               | That's paranoid, and if you are that paranoid (which,
               | maybe you have a reason to be), your solution isn't well
               | thought-through. You shouldn't be using a phone if you
               | can help it.
        
               | fsflover wrote:
               | You could carry a device with kill switches and only turn
               | them on when you need a connection. See: Librem 5 and
               | Pinephone.
        
               | sneak wrote:
               | No, that's not what paranoid means. Your statement is
               | simply incorrect and your use of the word is derogatory.
        
               | gjsman-1000 wrote:
               | Only if you say so.
               | 
               | There is a degree of rational fear, rational expectation
               | of being tracked. Your degree of fear though is
               | irrational unless you are, in fact, a journalist in an
               | authoritarian state.
               | 
               | You are saying that you are so paranoid, you don't trust
               | iMessage to be End-to-End Encrypted because it has zero-
               | click exploits developed as part of a cyberweapon that is
               | explicitly targeted against high-profile journalists. You
               | then think using Signal or something is more secure, even
               | though if this was pulled off in iMessage (more sandboxed
               | than any other messenger security-wise), your other
               | messengers probably are also flawed and you shouldn't use
               | any of them.
               | 
               | In fact, you shouldn't use a mobile device. And maybe for
               | your situation, that is right and rational. But for most
               | people, it's not.
        
               | Closi wrote:
               | > Your degree of fear though is irrational unless you
               | are, in fact, a journalist in an authoritarian state.
               | 
               | You are putting words in OP's mouth. OP never said he was
               | fearful, only that he didn't want to be tracked.
               | 
               | Someone friendly could follow me around in real life and
               | watch what I'm doing - and keep suggesting products to me
               | based on getting to know me. I'm not going to be afraid
               | but I am going to be freaking annoyed, and feel like my
               | privacy is violated when he says he isn't going away.
        
               | gjsman-1000 wrote:
               | Not wanting to be tracked is fine.
               | 
                | I'm trying to say his game-plan for not being tracked
                | is immensely flawed. He thinks a nation-state weapon
                | could be used against him, so he switches to a third-
                | party messenger which doesn't do the same degree of
                | sandboxing for security. What could go wrong?
               | 
               | If you are worried about a threat that is that niche, and
               | will almost certainly be patched soon, you shouldn't be
               | using any messenger, logically speaking.
        
               | Closi wrote:
               | IMO you are putting words in their mouth and
               | misrepresenting what OP was saying.
               | 
               | OP wasn't talking just about the pegasus attack, they
               | were talking about the key escrow not being held under
               | end to end encryption on iCloud. That's not going to be
               | patched any time soon, and there are other messengers
               | which don't do this.
        
               | smoldesu wrote:
               | That's absolutely not paranoia, so I'd suggest you leave
               | GP alone instead of burning karma and making yourself
               | look like a fool.
        
               | gjsman-1000 wrote:
                | Name-calling and blind assertions ("it's not because
                | it's not!") are not a good-faith response.
        
               | Dah00n wrote:
               | No, he is right that you are using bad words because you
               | disagree. I wouldn't have added this but the thread just
               | keeps going.
               | 
                | Just because someone wants to be as secure as possible
                | while using their electronic devices and you think they
                | are being extreme doesn't mean that they are being
                | paranoid. It has nothing to do with being paranoid. It
                | could simply be because it is fun to try and secure your
                | devices, or to gather knowledge on how to do so in case
                | you need to apply the skill-set at work, or a thousand
                | other reasons.
               | 
               | >you don't trust iMessage to be End-to-End Encrypted
               | 
                | I don't secure my devices as GP does, but I also do not
                | trust for a second that iMessage is securely E2EE. That
                | is not a rare opinion; in fact it is a very common
                | argument on HN that iMessage messages are saved
                | unencrypted to iCloud.
               | 
               | >this was pulled off in iMessage (more sandboxed than any
               | other messenger security-wise)
               | 
               | That is almost the opposite opinion of iMessage than what
               | was posted by researchers yesterday on HN (well, Twitter
               | originally). In fact they stated:
               | 
               | >"BlastDoor is a great step, to be sure, but it's pretty
               | lame to just slap sandboxing on iMessage and hope for the
               | best. How about: "don't automatically run extremely
               | complex and buggy parsing on data that strangers push to
               | your phone?!"
               | 
                | In short, "paranoid" is misused a lot, like this. Just
                | like "schizophrenia" (often used to mean having multiple
                | personalities or many clashing opinions, but neither is
                | correct usage).
        
               | gjsman-1000 wrote:
               | > It could simply be because it is fun to try and secure
               | your devices or to gather knowledge on how to do so
               | 
                | It could be the case, absolutely. But the OP doesn't
                | sound like they're having fun; they are in earnest.
               | 
               | > do not trust for a second that iMessage is securely
               | E2EE
               | 
                | Ask a security expert and they will tell you that just
                | about everyone who has inspected it, including the EFF,
                | has verified that this is, in fact, the case. But it is
                | proprietary code, not open, which is a drawback.
               | 
               | > are saved unencrypted to iCloud
               | 
                | And it can be turned off with the flip of a switch in
                | Settings if that's something you are worried about. For
                | most people who aren't into opsec (like my Grandma),
                | having all of her messages deleted because someone stole
                | her phone isn't worth it.
               | 
               | > "buggy parsing on data that strangers push to your
               | phone?!"
               | 
                | Yes... Except that _every other secure messenger_ also
                | does the exact same thing. And they don't have BlastDoor
                | sandboxing like iMessage does. Yes, BlastDoor has flaws,
                | but at least it's there, unlike other messengers which
                | don't sandbox.
        
             | meepmorp wrote:
             | It is paranoid for the average person to think they're
             | sufficiently interesting to be a surveillance target.
        
               | heavyset_go wrote:
                | We aren't in the 1970s. It's cheap and easy to do
               | dragnet surveillance, and it costs a fraction of a cent
               | to store text communications and to perform speech-to-
               | text on audio and video.
               | 
               | You don't have to be interesting, you just need to exist
               | to be caught up in the dragnet.
        
               | sneak wrote:
               | Everyone in the global west (and China, and Russia) is
               | subject to mass surveillance. That's documented fact, not
               | paranoia.
        
               | Karunamon wrote:
               | Passive surveillance being a thing means everyone reading
               | this text is a target.
        
           | Dah00n wrote:
           | This
           | 
           | >you are using SMS
           | 
           | doesn't fit with this from GP
           | 
           | >leave the SIM card out
        
           | swiley wrote:
           | No, then you use gpg email or xmpp.
        
         | askonomm wrote:
         | Should probably also dig an underground bunker and collect cans
         | of sardines to last for decades.
        
           | beervirus wrote:
           | Canned food has a surprisingly short shelf life. Sealed
           | containers of dry beans and rice are the way to go.
        
           | abaracadab wrote:
           | Don't forget your Pentium and DOS collection! Air-gapped, of
           | course.
        
         | msh wrote:
         | Then you could just as well get an iPod touch or iPad mini.
        
           | sneak wrote:
           | Neither of those has a vibrate motor to let me know about
           | notifications. They also can't be used to pair to an Apple
           | Watch.
           | 
           | I know this because I used to carry an iPad Mini in my pants
           | pocket.
        
             | gjsman-1000 wrote:
             | Then buy an iPhone, and turn off iCloud Backup in Settings,
             | it's not hard to do. Then your iMessages are fully E2E
             | Encrypted.
        
               | sneak wrote:
               | Nope, because they get escrowed by the other end of the
               | iMessage conversation.
               | 
               | Also, the whole point of disabling iMessage (in this
               | thread) is to close the iMessage-related zero click
               | exploits described in TFA.
        
               | gjsman-1000 wrote:
               | The other end of the conversation escrows the key on any
               | messenger. Otherwise how would you read the message?
               | Unless you consider Snapchat, but that's not End to End
               | Encrypted.
               | 
                | And are you _really_ sure that Signal or your preferred
                | messengers don't _also_ have zero-click exploits? After
                | all, they aren't sandboxed to the degree iMessage is
                | with BlastDoor.
        
               | Dah00n wrote:
               | >"BlastDoor is a great step, to be sure, but it's pretty
               | lame to just slap sandboxing on iMessage and hope for the
               | best. How about: "don't automatically run extremely
               | complex and buggy parsing on data that strangers push to
               | your phone?!"
               | 
                | https://twitter.com/billmarczak/status/1416801514685796352
        
               | gjsman-1000 wrote:
                | Except that _almost every other secure messenger_ is
                | guilty of the same thing. And they don't sandbox at all,
                | whereas BlastDoor at least tries to.
        
               | sneak wrote:
               | Snapchat claims to be end to end encrypted, last I
               | looked.
               | 
               | Signal does not escrow endpoint keys in an iCloud Backup,
               | so your first statement is incorrect.
        
               | gjsman-1000 wrote:
               | This is false. Snapchat has "snaps" protected, but text
               | messages and group messages are not end to end encrypted.
               | 
               | Also, Signal putting your escrow keys in iCloud? I don't
               | think you know what you are talking about. You can set
               | iMessage to not put your keys in iCloud like I said above
               | by turning off iCloud Backup which makes it fully End-to-
               | End with your own key on your device, just like Signal.
               | 
               | If you are worried about the other party having their
               | conversations being backed up, tell them to disable
               | iCloud Backup. If you are this worried about the privacy
               | of your communications, hopefully the other party would
               | be as well.
               | 
               | And Signal and any other E2E messenger is absolutely
               | storing copies of your key on the recipient's phone, just
               | like iMessage would. If it didn't, there'd be no way to
               | verify that a message was sent from the same sender.
        
               | Dah00n wrote:
               | >You can set iMessage to not put your keys in iCloud like
               | I said above by turning off iCloud Backup which makes it
               | fully End-to-End
               | 
                | "Fully" smells like a weasel word here. Either it is
                | E2EE or it isn't. iMessage isn't by default, from what
                | you are saying, and if it requires the other end to also
                | turn off iCloud Backup before it is E2EE, then I'd go as
                | far as stating that it is a completely useless attempt
                | at E2EE. In fact I'd argue Apple is full of sh*t if they
                | ever actually stated that it is E2EE (but I have no idea
                | if they did).
               | 
               | Comparing Signal to such a mess is... well at a minimum
               | it is disingenuous.
        
               | gjsman-1000 wrote:
               | It's not a weasel word.
               | 
               | The messages are fully end-to-end encrypted, we know
               | that, the EFF has stated as such. However, iCloud Backup
               | means copies of your messages that arrived after the end-
               | to-end process are backed up online. For most people who
               | buy iPhones, having their messages not be permanently
               | lost if their phone is stolen is a fair trade. If you
               | don't want copies of your messages backed up after they
               | arrived through the end-to-end encryption process, then
               | turn it off.
        
               | tialaramex wrote:
               | Signal doesn't need keys for the messages you previously
               | received. You received a message from Jim, it's received,
               | done, no need to retain keys to decrypt the message from
               | Jim.
               | 
                | You might be thinking, "But what about the next message
                | from Jim?" but that message is encrypted with a _new
                | key_, so the previous key isn't useful; your Signal
                | works out what _that_ key will be and remembers it until
                | it receives a message from Jim.
               | 
               | It's a ratchet, you can go forwards but you can't go
               | backwards, if I didn't keep the message Jim sent me last
               | week then even though you've got the encrypted message,
               | and I've still got a working key to receive new messages,
               | we can't work back to decrypt the old message.
               | 
               | You might also be thinking, "There must be a long term
               | identity key so that I can tell Jim and Steve apart?".
               | Indeed there is. But Signal doesn't use this to sign
               | messages since that's a huge security mistake, instead
               | this long term identity key is used to sign a part of the
               | initial keys other parties will use to communicate with
               | you.
               | 
                | This design deliberately means you can't _prove_ to
                | anybody else who sent you anything or what they sent.
                | Sure, you can _tell_ people. You can dish the dirt to
                | your spouse, your friends, the Secret Police, but you
                | can't prove any of it cryptographically.
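
       [Editor's note] The ratchet described above can be sketched in a
       few lines of Python. This is an illustrative toy only, not
       Signal's actual implementation (which uses the Double Ratchet
       with X25519 and HKDF); the HMAC labels and the shared secret
       below are made up for the demo.

       ```python
       import hmac
       import hashlib

       def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
           # Derive a one-time message key and the next chain key from
           # the current chain key. Going "backwards" would require
           # inverting HMAC-SHA256, so deleting old state destroys old
           # message keys for good.
           message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
           next_chain_key = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
           return message_key, next_chain_key

       # Both ends start from a shared secret established elsewhere
       # (in real Signal, via the X3DH key agreement).
       chain = hashlib.sha256(b"demo-shared-secret").digest()

       message_keys = []
       for _ in range(3):
           mk, chain = ratchet_step(chain)
           message_keys.append(mk)

       # Every message gets a fresh key; once a message key and the
       # chain key that produced it are deleted, the remaining state
       # cannot recreate them.
       assert len(set(message_keys)) == 3
       ```

       Each side only ever holds the current chain key, so compromising
       today's state reveals nothing about messages whose keys were
       already used and discarded.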
        
               | gjsman-1000 wrote:
               | You are trying to say that iMessage does not have forward
               | secrecy.
               | 
               | That's true, and is a perfectly legitimate reason to use
               | Signal.
               | 
                | I'm saying that the OP was dissing iMessage because of
                | the Pegasus zero-click exploit, and was saying that
                | switching to Signal gives zero guarantees of protecting
                | you from that, because it likely has its own zero-click
                | exploits, especially because it doesn't attempt to
                | sandbox the way iMessage does with the flawed BlastDoor.
        
               | [deleted]
        
               | smoldesu wrote:
               | ...or I could just use a truly-secure option that doesn't
               | destroy my personal security model. Owning an iDevice
               | presents a considerable security risk to my current
               | setup.
        
               | gjsman-1000 wrote:
                | There is no such thing as a "truly-secure option," as
                | anyone truly concerned about security will tell you.
               | 
               | You will be forced to make compromises somewhere unless
               | you want to live under a rock in the desert. You can't
               | drive without a State ID, can't get a home loan without
               | credit, can't work without a Social Security Number
               | except under limited circumstances, can't make money
               | without reporting to the IRS, and so on. It's entirely
               | about what compromises you want to make, and the
               | tradeoffs therein.
        
               | sneak wrote:
                | > _can't work without a Social Security Number except
                | under limited circumstances_
               | 
               | Something like 96% of human beings don't have a social
               | security number. Many of them work.
        
               | gjsman-1000 wrote:
               | Like the nation you live in doesn't have its own Tax
               | Authority with information on you, and doesn't have its
               | own ID Number you need to use for working.
               | 
               | The technicals are different, the point is the same.
        
               | smoldesu wrote:
               | I don't count on a "truly secure" option existing, I just
               | manage my risk by reducing the amount of Big Tech thumbs
               | in my personal pie. Apple, much like Amazon, Microsoft
               | and Facebook, have no right to any of my personal
               | information, end of story.
        
             | nobodylikeme wrote:
             | I just wanna know how big are your pants pockets
        
       | max_ wrote:
       | Time for a cyber security focused smartphone?
        
         | xyst wrote:
         | Simple solution: just use "dumb phones" or burners
         | 
         | No non-open source "smart" phone is going to be secure enough.
         | If you never store your data on your phone, you are safe from
         | these hacks. Now you have to just protect from physical attacks
         | :)
        
           | site-packages1 wrote:
           | CopperheadOS is an open source OS that builds on Android and
           | can be used on the Pixel devices. I've found it to be quite
           | secure.
        
             | gjsman-1000 wrote:
             | Except for physical attacks. No root of trust means that if
             | your phone was ever stolen, installing a PIN guessing app
             | is easy. Extracting the encrypted data for attacking it
             | elsewhere is also easy.
        
             | wolverine876 wrote:
             | Didn't CopperheadOS shut down years ago? The developer,
             | Daniel Micay, now develops GrapheneOS.
        
         | fjtktkgnfnr wrote:
         | Those are always targeted extra hard since they tend to be used
         | by criminals. See the recent "encrypted phones" (Encrochat,
         | Anom, ...)
         | 
         | If you really care about security, maybe it's better to get a
         | really dumb 4G phone and share its connection with a small
         | form-factor Linux tablet (but not one running Android).
         | 
         | Of course, inconvenient as hell, but much more secure,
         | especially since you are not running the iOS/Android mono-
         | culture, so anyone targeting you would need a customized
         | effort.
        
           | gjsman-1000 wrote:
           | But then you are vulnerable to physical attacks. You don't
           | have hardware root of trust, so installing a PIN-guessing
           | tool is easy. Extracting the encrypted data for attacking it
           | on a computer outside the phone is also easy.
        
             | jonfw wrote:
             | So don't use a PIN?
        
           | Ar-Curunir wrote:
            | I don't think out-of-the-box Linux is more secure than
            | Android or iOS. You don't even have simple sandboxing
            | between apps.
        
         | pomian wrote:
         | I agree. Surely a team of people could break from the
         | Android/Apple world and disrupt the ecosystem. It wasn't that
         | long ago that flip phones were state of the art. Somewhere
         | between then and now, we passed all the barriers and lost our
         | privacy. BlackBerry and Blackphone didn't succeed in being
         | profitable, but perhaps that was not the right time. Perhaps
         | privacy was not yet so completely lost as to be relevant to
         | the public. Perhaps there is enough of a market to sustain
         | that model?
        
         | dpkonofa wrote:
         | That's a little silly. The iPhone is a "cyber security focused
         | smartphone" and Apple has billions in R&D money going into its
         | phone. That's a nice thing to say but it doesn't really mean
         | much unless you have some way to achieve that in a way that
         | Apple's vast resources can't.
        
           | Dah00n wrote:
            | The iPhone has never been a "cyber security focused
            | smartphone", unless you count security being a focus while
            | sitting at least a few steps below profit, design, and
            | usability.
        
           | sydd wrote:
           | The silly thing is that Apple advertises their phone as
           | something cyber security focused, when it can be totally
           | pwned in so many ways.
           | 
            | And you don't need Apple's resources to make something
            | better; a more secure phone would just have much worse UX.
            | Some examples for a much more secure phone, where you don't
            | need Apple's budget:
           | 
           | - Runs some barebones Linux with minimal packages. An SMS app
           | is an SMS app, not something that makes HTTP requests.
           | 
           | - app store is very heavily vetted
           | 
           | - forced updates, you can't dismiss update notifications.
           | 
            | - minimal attack surface, no smart connectivity features or
            | accessories.
        
             | EveYoung wrote:
             | In that case, you would still need to trust the mostly
             | proprietary drivers and hardware. And if you aggressively
             | remove features, I guess the question becomes why you would
             | even need a phone. Maybe for some use cases it would be
             | better to simply use a laptop.
        
             | gjsman-1000 wrote:
             | You just pwned yourself.
             | 
             | - Forced Updates? The FBI takes over the update server,
             | forcibly sends out an update that sends all messages to the
             | FBI immediately, and there's no way to stop it. That
             | suggestion is idiotic. Or even better, install Pegasus on
             | all the phones, have them be quietly reporting back to home
             | for a few weeks, with journalists having no way to prevent
             | updating.
             | 
             | - You forgot Hardware Root of Trust and Secure Enclave,
             | like on an iPhone. Otherwise, the FBI can install a tool
             | which just guesses PINs over and over while resetting the
             | PIN attempts counter. It is not possible to build this
             | protection in software only. You need chip-level hardware,
             | and only iPhones in Fall 2020 and later have the Enclave
             | set up to block repeated PIN attempts even if Apple-signed
              | code is loaded. No other phone is safe from its own
              | manufacturer like that.
        
           | ska wrote:
            | > have some way to achieve that in a way that Apple's vast
            | resources can't.
           | 
            | I think "can't" here runs up against "choose not to". So
            | far as we can tell, opsec tends to be a pain in the ass in
            | ways that are fundamental, not a problem with tools. Apple,
            | like any other consumer-focused company, doesn't lose sight
            | of this.
        
         | fsflover wrote:
         | Here you go: https://puri.sm/products/librem-5
        
         | esens wrote:
         | Would it actually have more resources than, say, Apple? I
         | think if Apple cannot do it, I am unsure whether anyone else
         | could. All supposedly secure smartphones are not actually
         | secure, but they are at least obscure.
         | 
         | I think one should probably buy an Apple device (at least they
         | control everything, rather than the cobbled-together Android
         | clones) and disable basically everything except exactly what
         | is needed. At least that reduces the attack surface. And keep
         | personal stuff on a separate phone.
        
           | smoldesu wrote:
           | Plenty of people have "beaten" Apple for security, though
           | oftentimes these hardened phone OSes are only secure through
           | obscurity.
        
             | wolverine876 wrote:
             | Who?
        
               | gjsman-1000 wrote:
               | The parent post is saying that many of these "secure
               | phones" are, on paper, secure - but that's because
               | companies like the NSO Group don't give them much
               | attention. If they did become the focus of attention,
               | they'd probably burst from a thousand leaks.
        
           | simion314 wrote:
            | iOS seems like the worst solution: you are forced to use
            | Apple's web engine, so a bug or zero-day in that engine will
            | own all users. Apple would need to give users the ability to
            | uninstall preinstalled apps and replace them with safer or
            | better alternatives.
        
           | Dah00n wrote:
            | Apple _can_ do it (create a security-focused phone); it
            | just isn't anywhere near what they _want_ to do. The
            | instant security (or privacy, for that matter) gets in the
            | way of profit, Apple will back away.
        
             | gjsman-1000 wrote:
             | Or maybe it's because they're doing their best to make
             | every iPhone the security-focused phone, while not doing
             | anything that would anger the FBI enough to try to pass
              | legislation. When you are that big a company, the things
              | you can get away with are much more restricted than for a
              | small company.
        
               | redprince wrote:
                | They have already angered the FBI quite a lot during the
                | 2016 San Bernardino case and made their position on the
                | matter clear:
               | 
               | https://www.apple.com/customer-letter/
        
             | redprince wrote:
             | Apple is actually not in the business of selling the data
             | of their users. They will also risk aggravating large
             | players in favor of improved privacy. A recent example: App
             | Tracking Transparency [1] which makes tracking an opt-in
             | feature to be requested from the user. To no one's surprise
             | users are happily declining when made this offer. Companies
             | like Facebook aren't too happy about it. [2]
             | 
             | [1] https://www.apple.com/newsroom/2021/01/data-privacy-
             | day-at-a...
             | 
             | [2] https://www.inc.com/jason-aten/apples-privacy-update-
             | is-turn...
        
               | jensensbutton wrote:
               | Privacy and security are related, but distinct. Apple has
               | been pushing privacy, but we're talking about security
               | here. Typically the tradeoffs around increasing security
               | have to do with user experience, something Apple
               | typically does not like to compromise on.
        
             | cabbagehead wrote:
             | Agreed, Apple has made a commercial decision here not to
             | win against NSO-level adversaries. NSO is clearly winning
             | the war (on the available evidence), presumably by
             | investing more in security research / buying exploits.
             | Sure, they put in *enough* effort to be secure in general -
             | they prefer to keep outrageous profit margins rather than
             | do more. Apple is perfectly comfortable with this balance:
             | 
             | > "Attacks like the ones described are highly
             | sophisticated, cost millions of dollars to develop, often
             | have a short shelf life, and are used to target specific
             | individuals," it said. "While that means they are not a
             | threat to the overwhelming majority of our users, we
             | continue to work tirelessly to defend all our customers,
             | and we are constantly adding new protections for their
             | devices and data."
             | https://www.theguardian.com/news/2021/jul/19/how-does-
             | apple-...
        
         | CA0DA wrote:
         | https://grapheneos.org ?
        
           | atatatat wrote:
           | Yeah, GrapheneOS needs a hardware manufacturing partner.
           | 
           | Anyone who thinks they're up to the task (bulletproof?)
           | should contact them.
        
           | [deleted]
        
         | twobitshifter wrote:
         | Blackberry tried this
         | https://www.blackberry.com/us/en/products/secure-smartphones
        
         | wolverine876 wrote:
         | These apparently exist in the criminal underworld (see the
         | FBI's recent sting using such a project) and for state security
         | organizations (developed by major defense contractors, afaik).
        
         | [deleted]
        
       | nonameiguess wrote:
       | Apple needs to make it possible for users to choose other ways of
       | sending and receiving messages and listening to music, or of
       | choosing not to do either of those things if they don't want to.
       | Obviously, you can currently install and use other applications
       | that provide the same functionality, but you cannot uninstall or
       | disable defaults.
       | 
        | The most shocking experience for me, while evaluating the Mac
        | ecosystem after the M1 release (I bought a MacBook Air), was
        | being in meetings using Bluetooth headphones: I take the
        | headphones off, put them back on, and Music.app automatically
        | opens and comes to the foreground of my desktop. There is no
        | supported way of disabling this user-hostile anti-feature. I look
       | on Google and StackOverflow and all of the suggestions for how to
       | disable it dating back to 2014 or whenever no longer work.
        | Apparently, the likely answer is to turn off System Integrity
        | Protection, reboot, rename or remove the file containing the
        | application launcher, turn SIP back on, and hope that doesn't
       | break anything else and hope Apple doesn't revert your changes on
       | the next system update.
       | 
       | That did not seem worth it. The fact that Apple Music can and has
       | been used as an attack vector makes it even worse that it is so
       | tightly integrated with the audio subsystem of the hardware as to
       | take over your device thanks to movements you are making in the
       | physical real world even when you may not be touching the device
       | at all.
       | 
       | I just can't understand what the thought process was in making
       | this a default behavior, let alone one that cannot be disabled.
        
         | warunsl wrote:
         | Happens on my non M1 Mac with Sony XM4s. I am pretty sure it is
         | the headphones sending the play command to the computer.
          | Apparently there is a setting in the Sony headphones app to
          | disable this, but it did not work for me. Music.app still
          | opens up every time I remove the headphones and put them back
          | on.
        
         | marcellus23 wrote:
         | I think that might just be a bug. Or maybe something in your
         | headphones is causing it to send a "play" command through
         | Bluetooth? That will open the Music app if you have nothing
         | playing already.
        
           | berdario wrote:
            | Given that the headphones cannot know whether an app is
            | already playing, this should be configurable in the OS:
            | i.e., allow selecting which app (or none) to launch when
            | receiving a Play command
           | 
           | Only allowing their own app to be associated with the default
           | audio player is anti-competitive, at the very least
        
             | marcellus23 wrote:
             | It should be configurable (it is, but only through
             | Terminal), but it's also such a minor problem that I can't
             | blame Apple for not wanting to create the API for third
             | parties, design and build the UI, and then document,
             | support, and maintain all of it for years to come. You have
             | to pick your battles as a developer and being an OS dev is
             | no different.
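
       One commonly cited Terminal approach (an assumption here, widely
       reported for older macOS releases rather than verified on current
       ones) is to unload the `rcd` launch agent, the daemon that opens
       Music/iTunes in response to a bare Play command when no media app
       is active:

       ```shell
       # Hypothetical mitigation: rcd ("remote control daemon") launches
       # Music/iTunes when a Play command arrives with no media app
       # running. Unloading its launch agent stops that behavior. On
       # SIP-protected systems this may fail, and OS updates may revert
       # it; re-run `launchctl load -w` on the same plist to undo.
       launchctl unload -w /System/Library/LaunchAgents/com.apple.rcd.plist
       ```

       This is a system-level workaround, not a supported preference, so
       treat it as fragile.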
        
         | Anechoic wrote:
         | > , or of choosing not to do either of those things if they
         | don't want to. Obviously, you can currently install and use
         | other applications that provide the same functionality, but you
         | cannot uninstall or disable defaults.
         | 
         | With Apple Configurator you can disable Music and Messages.
         | It's not the most user-friendly method, but it is possible.
        
         | 6gvONxR4sf7o wrote:
         | I also have bluetooth headphones I use with a mac, and that's
         | never happened to me. Is it a new thing with the M1 machines or
         | something?
        
           | bikezen wrote:
           | It happens to me every time I connect my QC35s to a 2018
           | MacBook Pro. It's extremely annoying.
        
             | wl wrote:
             | Doesn't happen on my 2018 MBP with my QC35 IIs.
        
           | danieldk wrote:
           | Haven't seen this either. Both my wife and I are on M1
           | MacBooks.
        
           | angulardragon03 wrote:
            | Nope, I've never had this happen to me on my M1 machine,
            | with Bluetooth or 3.5mm headphones. If you don't have
            | another media app focused or active, pressing a media
            | control button/key will open Music, iirc.
        
         | gumby wrote:
         | > I just can't understand what the thought process was in
         | making this a default behavior, let alone one that cannot be
         | disabled.
         | 
         | I do not get the bluetooth-automatically-starts Apple Music
         | behavior.
         | 
          | I haven't tried it, but I just checked the iMessage
          | preferences and you can disable being contacted via your
          | phone number or email addresses, with check boxes for each.
          | As Macs don't have phone numbers, I think this would work? I
          | _do_ use Apple Messages (which is why I didn't try disabling
          | it), but I use WhatsApp and Signal more than the default.
         | 
         | I have no idea how good the mac's security might be, just
         | pointing out my experience.
         | 
          | I agree that Apple could do better at letting users remove
          | its bundled apps, but I use third-party calendar, address
          | book, reminders, photo, etc. apps with no issues. And I hear
          | quite a few people are willing to use Chrome (ugh) as their
          | default browser, and Safari doesn't get in the way.
        
         | nemothekid wrote:
         | > _where I 'm using bluetooth headphones, take the headphones
         | off and put them back on, and music.app automatically opens and
         | comes to the foreground of my desktop_
         | 
         | I think your bluetooth headphones are sending a play command to
         | your device when it's connected. I'm sure it's annoying, but I
         | think your macbook is doing the right thing here.
        
           | nacs wrote:
           | > your bluetooth headphones are sending a play command to
           | your device
           | 
           | Yep this is what's happening. I have a car bluetooth addon
           | that I purchased that does the same thing -- it sends Play
           | commands to the phone on-connect repeatedly until something
           | starts playing.
           | 
           | By default the phone will open Apple Music but if I already
           | had been playing music on the Spotify app, it'll just start
           | playing that instead.
        
           | kitsunesoba wrote:
           | I'm almost positive that this is what's happening. With a
           | pair of Sony XM3's and various Apple devices I've never seen
           | this behavior.
        
         | jackson1442 wrote:
         | Your headphones are probably sending a play command to your
         | computer. There should be some software you can grab that
         | captures that command and routes it to the application of your
         | choosing, or disables it entirely.
         | 
         | Unfortunately it's not built in, but I think it's your
         | headphones doing something nonstandard because my Sony XM4s and
         | AirPods do not fire this behavior when I put them in.
        
           | nonameiguess wrote:
           | I'm using XM4s and have never done anything to change
           | whatever their own default behavior is. And this doesn't
           | happen on any other device except the Macbook.
           | 
            | I guess it's worth looking into whether there is some
            | outside-of-the-OS way to force the OS to route requests
            | to the application I already have open and foregrounded
            | that plays sound, but I would expect _that_ to be the
            | default behavior. What is a "play" request to music.app
            | even
           | supposed to do when I have never intentionally opened the app
           | and don't have a playlist set up? It doesn't actually play
           | anything since there is nothing to play. It just opens the
           | app and takes over my screen.
        
             | wil421 wrote:
              | This is pretty easily fixed and took 5 seconds to google
              | a solution. I have 4 or 5 pairs of various headphones,
              | from AirPods to Jabra devices, and they all do this when
              | I take them off or put them on. The Bluetooth settings
              | all have options to turn it off.
             | 
             | >If you remove the headphones or put them back on, this
             | will pause or resume playback. If you're not wearing the
             | headphones, make sure there's nothing else around the
             | sensor because it may activate and resume playback.[1]
             | 
              | [1] https://www.sony.com/electronics/support/articles/00229324
        
       ___________________________________________________________________
       (page generated 2021-07-19 23:02 UTC)