[HN Gopher] Apple can read your iMessages (even though they're E...
       ___________________________________________________________________
        
       Apple can read your iMessages (even though they're E2E encrypted)
        
       Author : decrypt
       Score  : 173 points
       Date   : 2021-08-28 19:00 UTC (4 hours ago)
        
 (HTM) web link (old.reddit.com)
 (TXT) w3m dump (old.reddit.com)
        
       | ummonk wrote:
       | Without reading the post I assume it's talking about iCloud
       | backup (which is on by default) backing up your raw messages with
       | just an Apple encryption key? That's well documented and makes
        | sense as default functionality - average users would be too
        | prone to losing their data if backups were end-to-end
        | encrypted.
        
         | WA wrote:
         | Why not read the post? Because you are technically wrong.
         | 
         | It's about iCloud backups containing the decryption keys.
         | iMessages are backed up encrypted.
        
           | ummonk wrote:
            | Ah right. One caveat though: that applies only if Messages is
            | set to use iCloud. In that case iCloud Backup backs up the
           | decryption key to ensure you don't inadvertently lose access
           | to the messages because they're end to end encrypted.
           | 
            | If Messages is not enabled in iCloud, then as long as iCloud
            | Backup is enabled, it will back up the raw messages directly
            | - encrypted with an Apple-stored key, just as for any other
            | non-end-to-end-encrypted data.
        
           | [deleted]
        
         | jackjeff wrote:
         | Yes. iCloud backups are the easiest way to get the messages in
         | clear text.
         | 
          | But honestly you don't need them. Even though iMessage is end to
         | end encrypted, Apple mediates the key exchange. It'd be trivial
         | for them to do a man in the middle attack by saying the other
         | guy has a new key.
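          | 
          | A minimal sketch of that kind of key-directory substitution,
          | assuming a CryptoKit-style X25519 key agreement (illustrative
          | only; iMessage's real protocol and key directory work
          | differently):
          | 
          |     import CryptoKit
          |     import Foundation
          |     
          |     // Alice asks the provider's key directory for "Bob's" public
          |     // key. A malicious directory can hand back its own instead.
          |     let alice   = Curve25519.KeyAgreement.PrivateKey()
          |     let mallory = Curve25519.KeyAgreement.PrivateKey()
          |     let keyFromDirectory = mallory.publicKey   // substituted key
          |     
          |     do {
          |         // Alice derives a session key with what she thinks is Bob's key.
          |         let s = try alice.sharedSecretFromKeyAgreement(with: keyFromDirectory)
          |         let sessionKey = s.hkdfDerivedSymmetricKey(
          |             using: SHA256.self, salt: Data(), sharedInfo: Data(),
          |             outputByteCount: 32)
          |         let sealed = try AES.GCM.seal(Data("hi Bob".utf8), using: sessionKey)
          |     
          |         // The middleman derives the same key from Alice's public key.
          |         let m = try mallory.sharedSecretFromKeyAgreement(with: alice.publicKey)
          |         let mKey = m.hkdfDerivedSymmetricKey(
          |             using: SHA256.self, salt: Data(), sharedInfo: Data(),
          |             outputByteCount: 32)
          |         print(String(decoding: try AES.GCM.open(sealed, using: mKey),
          |                      as: UTF8.self))   // "hi Bob"
          |     } catch {
          |         print("crypto error: \(error)")
          |     }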
        
         | nostromo wrote:
         | Encrypted backups should be a user option at the very least.
        
         | [deleted]
        
       | marto1 wrote:
        | Aren't they required by law to be able to do that? (PATRIOT
        | Act, etc.)
        
         | upbeat_general wrote:
         | See Signal or a variety of other services. The answer is no.
        
       | smoldesu wrote:
       | Edward Snowden's article earlier this week posited that some 80%
       | of iPhone users leave auto-sync on for iCloud, meaning that
        | there's about an 80% chance that the next thing you send over
        | iMessage ends up readable by Apple anyway.
       | 
       | Why is guesswork like that acceptable in a _privacy_ tool?
        | Furthermore, who actually believed that Apple _couldn't_ read
       | their messages? 'End-to-end' means very little when both ends are
       | Apple-controlled.
        
         | ec109685 wrote:
          | iCloud Message sync doesn't expose your messages to Apple. The
          | only setting that matters is iCloud Backup.
        
         | AbjectFailure wrote:
         | iMessage actually isn't included in iCloud backups by default.
         | It's the one thing toggled off in the settings of a fresh iOS
         | install.
        
           | ummonk wrote:
           | It's included in iCloud backups, just not toggled to use end
           | to end encrypted iCloud sync (in which case iCloud backups
           | would back up the decryption key).
           | 
           | See https://support.apple.com/en-us/HT207428,
           | https://support.apple.com/en-us/HT208532,
           | https://support.apple.com/guide/security/security-of-
           | icloud-..., and https://support.apple.com/en-us/HT202303
        
         | zepto wrote:
         | I'd be surprised if it's as low as 80%.
         | 
          | It's pretty simple. iMessage is relatively secure, and most
          | criminals, nation states, etc. won't be able to access your
          | messages unless they have a legal means to do so.
         | 
         | If you need protection from a nation state that _can_ force
         | Apple to divulge content, such as the US, use something else
          | such as Signal.
        
       | jpxw wrote:
       | Yet another reason to disable iCloud, if you're privacy
       | conscious.
       | 
       | Although you're relying on your recipient disabling it too. So
       | really you have to use something else. Signal, etc.
       | 
       | With that said, I still think an iPhone with iCloud disabled is
       | better than other phones on the market privacy-wise. And for the
       | average consumer, iPhones offer a good tradeoff between privacy
       | and usability.
        
       | sparker72678 wrote:
       | Also worth keeping in mind, this is true for any message you send
       | that's received by someone else, regardless of your own hygiene.
       | 
        | I.e., for true security _all_ message participants must have
        | iCloud Backup off, etc.
        
       | whoknowswhat11 wrote:
       | A better headline - users willing to give up privacy for
       | convenience.
       | 
        | Reality - there was a period where iCloud created backups
        | that Apple did not have access to. Critically, this meant
        | that if you had any of a wide variety of things happen - unless
        | you were very good about key management - your content was lost
        | for good. ALL your photos (which could be heartbreaking), etc.
       | 
        | It turns out this is NOT what people want. They want Apple to
       | have access to their content, so when they have a device stolen
       | and don't have a super long recovery key properly saved, they are
       | not hosed.
       | 
        | Same issue BTW with BitLocker on Windows. People DO NOT save
       | those recovery keys, even if they should. Microsoft added a way
       | to force a backup into an account admins and others would have
       | access to, thank goodness, because otherwise users there would be
       | hosed as well.
        
         | gxs wrote:
         | This is an insightful comment.
         | 
         | Whenever someone loses their account the first thing they do is
         | run to the vendor and flip their shit when the vendor is unable
         | to do anything.
         | 
          | I imagine from a business perspective it's just better to keep
          | access to data - sometimes not for nefarious reasons like
          | advertising, but rather just to keep customers happy.
        
           | PhantomGremlin wrote:
           | _Whenever someone loses their account the first thing they do
           | is run to the vendor and flip their shit when the vendor is
           | unable to do anything._
           | 
           | They certainly turn to Apple for help, but I think it's
           | sometimes simply despair and not flipping their shit.
           | 
           | E.g. I was once in an Apple store and the Genius was trying
           | to help someone who, for some inexplicable reason, did
           | everything in the Guest account of her laptop.
           | 
           | The laptop rebooted for some reason (maybe she said OK to
           | install updates?), and she lost the contents of Guest. Which
           | means she lost everything.
           | 
           | Vendors like Apple need to be prepared for all sorts of data
           | loss scenarios. Ordinary users simply don't think about
           | computers the same way as engineers do.
        
           | threatofrain wrote:
           | A bit of a tangent, but I happened to flip my shit when my
            | folks' laptop was stolen, and the thief placed a password
           | protection on it that Apple refused to lift, despite Apple
           | having the technical capability to do so.
           | 
           | They had records of my folks purchasing the laptop, but they
            | argued that state IDs and credit cards can be faked and
           | stolen, and were thus inadequate for their internal security
           | purposes.
        
             | judge2020 wrote:
             | Your post is very light on details. Did they just place a
             | new password on the device itself? If so, you can still use
              | the laptop if you reinstall macOS. It is only rendered a
             | brick if the attacker signed into their own iCloud account
             | and enabled Find My, OR the attacker enabled Find My &
             | enabled the recovery key feature on the existing Apple ID
             | (meaning Apple can't let you log into the account without
             | the key).
        
               | threatofrain wrote:
                | This was on OS X with one of the older Touch Bar MacBook
                | Pro models, but there's some kind of password protection
               | which can prompt you for a password before the usual
               | visuals of the OS booting up.
        
               | threeseed wrote:
               | Pretty sure this is FileVault full disk encryption.
               | 
               | And so you can still use the computer just not keep your
               | data.
        
               | Wowfunhappy wrote:
               | No, it's also possible to set a firmware password, which
               | pops up if you try to boot from any disk other than the
               | startup disk. If you also can't log into any user
               | accounts on the startup disk, there's no way in!
        
             | threeseed wrote:
             | Apple is right here though.
             | 
             | Those cards are easily forged and there is nothing you can
             | realistically do to prove the laptop is yours without also
             | allowing thieves/criminals to prove it's theirs. And so
             | they have to err on the side of caution.
             | 
             | The onus lies on you to keep backups.
        
               | threatofrain wrote:
                | You can't back up a laptop, only your data. The laptop
               | costs a lot. What Apple was "protecting" is the hardware
               | value of the laptop, not any data.
               | 
               | Also, Apple eventually unlocked the laptop.
        
               | threeseed wrote:
               | But you can always boot into Recovery mode and wipe the
               | disk.
               | 
               | In this case you are wanting to keep the data as well.
        
               | threatofrain wrote:
               | There was no interest to keep the data, only the
               | hardware, which Apple initially refused to unlock. I
               | wanted my parents to be able to do this via Apple
               | support, and not by themselves.
        
           | dylan604 wrote:
           | >I imagine from a business perspective
           | 
           | Regardless of the reasoning, if you have this ability, you
           | can and will be compelled to give up that information.
           | Therefore, it is best to not have that information or STFU
           | about being concerned about privacy.
        
         | upbeat_general wrote:
         | You're conflating the convenience of backing up vs being able
         | to access data even with lost recovery keys.
         | 
         | People are certainly choosing convenience over privacy in that
         | any backup is preferable to no backup.
         | 
          | However, Apple does not give the option to choose E2E or not,
          | so users cannot make that choice of privacy vs convenience.
          | You're assuming that people choosing iCloud over not-iCloud
          | means people do not want E2E. It may be the case that people
          | prefer E2E, but this is not evidence of it.
         | 
          | Furthermore, I don't believe iCloud backups were ever E2E, or
          | that this was a feature that was later removed. Apple states
          | which iCloud services are E2E[0], and I'd argue these services
          | are pretty straightforward to set up and not lose. Most of the
          | time you don't need a recovery key, just to be logged in on
          | another device.
         | 
         | Even if it is the case that most users do not want E2E, there's
         | certainly a large audience (outside of just the HN bubble) that
         | do want it. They could make it optional.
         | 
         | [0]: https://support.apple.com/en-us/HT202303
        
         | amelius wrote:
         | They should at least offer users the possibility to get true
         | privacy if they so wish. Blaming the existence of this backdoor
         | on the behavior of the average user is not correct. What's
         | next, selling my data to the highest bidder because the average
         | user doesn't care?
         | 
         | Also, calling this "E2EE" is against the unwritten rules of the
         | profession.
        
           | judge2020 wrote:
           | They added back the 'recovery key' feature in iOS 14[0] but
           | I'm not sure if it includes full encryption of backups or
           | other iCloud data. I couldn't find anything on recovery keys
           | in the Platform Security index either[1].
           | 
           | 0: https://www.macworld.com/article/234693/apple-id-adds-
           | recove...
           | 
           | 1: https://support.apple.com/guide/security/welcome/web
        
           | ummonk wrote:
           | That option is there, and very simple. You just turn off
           | iCloud backup to get true privacy.
        
             | hash872 wrote:
             | No, because if the party you're messaging has iCloud backup
             | on (a good chance), or is on Android, or has their device
             | physically accessed somehow- a third party can read all of
             | the messages that you sent them. There is no 'true privacy'
        
               | JadeNB wrote:
               | > No, because if the party you're messaging has iCloud
               | backup on (a good chance), or is on Android, or has their
               | device physically accessed somehow- a third party can
               | read all of the messages that you sent them. There is no
               | 'true privacy'
               | 
               | This is surely true no matter how strong the encryption
               | scheme, though. If you share your content with anyone
               | else, then your security is only as strong as theirs.
        
               | threeseed wrote:
               | Messages are not stored in iCloud Backup by default.
               | 
               | You have to explicitly enable it.
        
               | ummonk wrote:
               | They are actually. iCloud backup stores the messages by
               | default. If you go and set Messages to use iCloud sync,
               | then they're synced end to end encrypted. Apple then
               | removes the messages from the iCloud backup, and instead
               | stores a backup of the end to end encryption key for
               | Messages. If you want your messages to be truly
               | inaccessible to Apple / agencies with warrants (and also
               | unrecoverable after losing end to end encryption access),
               | then you have to turn off iCloud backup.
               | 
               | Apple's documentation needs to be a little clearer on
               | this but the system is basically designed with the
               | accurate notion that the biggest footgun for 90% of users
               | is not "the government was able to access my messages
               | with a warrant" but rather "I lost access to my end to
               | end encrypted messages".
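                | 
                | A rough sketch of that decision table as described above
                | (my own simplification of the two settings, not Apple's
                | actual code):
                | 
                |     // Who can ultimately recover the plaintext, per the behavior
                |     // described above (simplified model, not Apple's implementation).
                |     enum Exposure {
                |         case appleViaBackedUpMessages   // backup holds raw messages (Apple-held key)
                |         case appleViaBackedUpKey        // E2EE sync, but backup escrows the Messages key
                |         case appleCannotDecrypt         // also unrecoverable if you lose your own access
                |     }
                |     
                |     func exposure(iCloudBackup: Bool, messagesInICloud: Bool) -> Exposure {
                |         switch (iCloudBackup, messagesInICloud) {
                |         case (true, false): return .appleViaBackedUpMessages
                |         case (true, true):  return .appleViaBackedUpKey
                |         case (false, _):    return .appleCannotDecrypt
                |         }
                |     }
                |     
                |     print(exposure(iCloudBackup: true, messagesInICloud: true))
                |     // -> appleViaBackedUpKey: E2EE sync, but the key sits in the backup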
        
             | upbeat_general wrote:
             | Not sure if you're referring to iCloud [device] backup or
             | all iCloud services.
             | 
             | If it's the latter then yes, you can (almost) always choose
             | not to use a service or website. That's a poor argument to
             | defend bad privacy practices.
             | 
             | If it's the former then just disabling iCloud backup does
             | not result in E2E storage of other iCloud data. iCloud
             | photos, drive, etc. are still not E2E.
        
           | threeseed wrote:
           | a) No one is blaming the user.
           | 
           | b) End to end encryption has always meant transport
           | encryption not encryption at rest.
        
             | JadeNB wrote:
             | > a) No one is blaming the user.
             | 
             | The user is being used as an excuse: "we compromise _your_
              | privacy because that user over there wants it." This is
             | not blame in the obvious sense of "it's that user's fault
             | that they lost their data", but it _is_ blame:  "the
             | responsibility for this thing isn't ours, the people who
             | wrote the code, but yours."
        
             | amarshall wrote:
             | > End to end encryption has always meant transport
             | encryption not encryption at rest.
             | 
             | Sure, but if someone in the middle wants to store, then
             | they don't have anything unencrypted to store. Ultimately
              | it all depends on what one defines the "ends" as. In general
             | for backup, both ends are the user. For chat, the sender
             | and recipient. Apple is not an end, but a provider in the
             | middle.
        
             | spsful wrote:
             | OP is quite literally blaming the user.
        
             | threatofrain wrote:
             | I mean, I'm blaming the user. What's wrong with that in
              | this context? Similarly, if your region hates spicy food
              | and thus spicy food is dying out in your area, I'd be
              | blaming the region too.
             | 
             | If you think the word "blame" is too morally hot, then we
             | can always go with causal attribution.
        
             | amelius wrote:
             | a) this feature supposedly exists for the average user who
             | wants to have their data back if they forget their
             | password. Apple points at the user and says they are the
             | reason for having this backdoor.
             | 
             | b) this is just false; whether the data is at rest or not
             | has nothing to do with the definition of the term. And it
             | is misleading in any case.
        
               | threeseed wrote:
               | a) It's not a backdoor but an explicit design tradeoff.
               | It's hypocritical to be calling people out for using
               | terms correctly and then misuse the term backdoor.
               | 
                | b) The meaning of the term has been blurred by WhatsApp
                | using it to cover both transport encryption and
                | encryption at rest. But historically it has not included
                | the latter part.
        
               | amelius wrote:
               | > It's hypocritical to be calling people out for using
               | terms correctly and then misuse the term backdoor.
               | 
                | No, it's not, because this is a separate instance. Someone
                | can have one thing right and another thing wrong. And
               | whatever the true meaning of "E2EE", its use in this case
               | is misleading at best, so I'm pretty sure that the
               | profession does not approve of it.
        
         | akomtu wrote:
         | That's essentially a "Switzerland bank" from the movies that
         | keeps a box on your behalf and doesn't ask silly questions. So
         | I wonder if there's an unfilled niche of such "data storage
         | bank" where you give them a blob of data, they store it for you
         | and you trust them not to look inside. This could be also a
         | useful application of those NFTs: an NFT contains a key and by
         | nature of NFTs, the key has one owner at a time and the key is
         | also transferrable.
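          | 
          | A minimal sketch of the client side of such a "store a blob you
          | encrypted yourself" service, using CryptoKit (illustrative only;
          | the service and how the key is held are assumptions):
          | 
          |     import CryptoKit
          |     import Foundation
          |     
          |     // Client-side encryption: the provider only ever sees ciphertext,
          |     // and the key stays with the customer (however it is held - a
          |     // passphrase-derived key, a hardware token, or an NFT-style
          |     // transferable secret).
          |     let blob = Data("family photo archive".utf8)
          |     let key  = SymmetricKey(size: .bits256)     // never leaves the client
          |     
          |     let sealed   = try! AES.GCM.seal(blob, using: key)
          |     let uploaded = sealed.combined!             // what the provider stores
          |     
          |     // Later, only the key holder can get the data back.
          |     let box      = try! AES.GCM.SealedBox(combined: uploaded)
          |     let restored = try! AES.GCM.open(box, using: key)
          |     assert(restored == blob)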
        
         | phreack wrote:
         | I once saw the kind of support emails a friend got while
         | working CS for an encrypted media storage service. Every single
          | day, someone new was demanding, many times threatening legal
          | action, that the company recover their encrypted media. It
         | was set up so only the user had the keys, and there was
         | literally no way for the company to recover data without them -
         | something that was a particular selling point of the service,
         | constantly made explicit in many different ways and places. I
         | wouldn't have believed that things were this bad had I not seen
         | it first hand.
        
       | nimbius wrote:
       | Patiently awaiting the obligatory HN 'iPhone considered harmful'
        | thread at this point, with complementary link to a Medium
        | article. Seriously though, after the San Bernardino shooter
        | fiasco and the ongoing US government regulation demands, it was
        | basically all but guaranteed Apple would pull out all the stops
        | to get Uncle Sam off their back.
        
       | [deleted]
        
       | orastor wrote:
        | Apple can also silently create a stealthy virtual device that
        | will get all messages just as your phone does.
        
         | Gaelan wrote:
         | Unless you know something about the protocol that I don't
          | (quite possible), it's not silent: all your devices get a "new
          | device registered to iMessage" dialog box.
        
       | ec109685 wrote:
       | This post is wrong. iCloud Backup is the only setting that
        | matters. Whether you enable iCloud Messages or not has no bearing
       | on whether Apple can read your messages. With iCloud Message
       | sync, Apple doesn't store a decryption key on their servers.
        
       | warning26 wrote:
       | If they can do this, surely this evaporates any security-related
       | rationale for not providing a web-accessible version of
       | iMessages.
       | 
        | If they just added _that_, it would be so incredibly useful. I'm
       | sure they won't though, because that might mean that people could
       | access iMessages from non-Apple hardware (the HORROR).
        
       | timmit wrote:
        | These two toggles are funny:
        | 
        | - back up my encrypted data - back up my encryption key (if the
        | encryption key is itself backed up, the E2E does not make much
        | sense)
        | 
        | What key, then, is used to encrypt the E2E encryption key?
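        | 
        | One common answer is key wrapping: the end-to-end message key is
        | itself encrypted under a separate backup/escrow key, so whoever
        | holds that wrapping key can recover the message key. A minimal
        | CryptoKit sketch of the idea (an illustration, not Apple's actual
        | scheme):
        | 
        |     import CryptoKit
        |     import Foundation
        |     
        |     // The device's end-to-end message key...
        |     let messageKey = SymmetricKey(size: .bits256)
        |     // ...is wrapped (encrypted) under a backup key before upload.
        |     // Whoever holds the backup key can unwrap the message key, and
        |     // with it decrypt the messages - which is the point being made.
        |     let backupKey = SymmetricKey(size: .bits256)
        |     
        |     let wrapped   = try! AES.KeyWrap.wrap(messageKey, using: backupKey)
        |     let unwrapped = try! AES.KeyWrap.unwrap(wrapped, using: backupKey)
        |     
        |     let ok = unwrapped.withUnsafeBytes { Data($0) } ==
        |              messageKey.withUnsafeBytes { Data($0) }
        |     print("message key recovered from wrapped form:", ok)   // true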
        
       | nostromo wrote:
       | ... if you back up your device to iCloud. (Of course, almost
       | everyone does.)
       | 
       | Apple was apparently going to close this loophole, but decided
       | not to. They probably received negative feedback from the three
       | letter acronym agencies.
        
         | AshamedCaptain wrote:
         | Apple can do anything they want. They can push a silent update
         | to specific iPhones which uploads the data in whichever format
         | they prefer.
        
         | [deleted]
        
         | Fizzadar wrote:
          | Even disabled, if the other party has them enabled, Apple
          | still has access to that content.
        
         | lynndotpy wrote:
         | Or, as discussed there, if the person you're talking to backs
         | up iMessage to iCloud. Both parties need to be willing to
         | forego that convenience, and AFAIK, it can not be done on a
         | per-message basis.
        
           | ummonk wrote:
           | There are limitations there though. E.g. if a broad warrant
           | to reconstruct all your messages were issued, Apple would
           | refuse to honor it, if only due to technical infeasibility.
           | It's one thing to execute a warrant for "data of suspect A"
           | and another thing to execute a warrant for "access the iCloud
           | backups of every single person who has it enabled, decrypt
           | their backed up messages, and return any message that was
           | received from or sent to suspect A".
           | 
           | On the other hand the government could certainly find a
           | specific individual they know you've messaged with, and
           | execute a warrant specifically for their conversations with
           | you.
        
             | upbeat_general wrote:
             | I believe they have metadata indicating who a person has
             | sent messages to, so they wouldn't need to go through _all_
              | iCloud backups - but indeed a warrant for "suspect A"'s
              | data does not include data for all the people "suspect A"
              | communicated with.
        
         | [deleted]
        
       | makach wrote:
       | E2E encryption != Encryption at rest.
        
       | MaxBarraclough wrote:
       | Similar discussion 9 days ago on the thread _Apple urged to drop
        | plans to scan iMessages, images for sex abuse_:
       | https://news.ycombinator.com/item?id=28233200
       | 
        | Perhaps we need a new term, other than _E2E encrypted_, to close
       | the door on 'loopholes' such as the provider managing your keys.
        
         | rhn_mk1 wrote:
          | Loopholes like that already disqualify it as E2EE. The "end" of
          | encryption is not the entry point of your device, it's you: the
          | user (more specifically, as close as possible to the place
          | where the data is displayed/entered). If there is some
          | additional stage
         | before/after encryption, then it's not end-to-end.
        
           | zepto wrote:
           | There is _always_ some additional stage after the encryption.
           | 
           | It's silly not to call it E2E, since e2e just means that no
           | eavesdropper in the middle can intercept it.
           | 
           | If you don't trust the person who wrote the software you are
           | using, that is a different issue. Just as serious, but it has
            | nothing to do with whether it is E2E or not.
        
         | INTPenis wrote:
         | No, let's just use common sense. You can't trust anything to be
         | e2ee if it's not open source too.
         | 
         | In this case Apple can do anything they want with the keys
         | since it's a locked down and closed platform.
        
           | yawaworht1978 wrote:
           | This is the only real sensible approach. Even with open
            | source Signal, people say they still might be able to see
           | things. Is there no way to test this?
        
            | Sebb767 wrote:
           | > You can't trust anything to be e2ee if it's not open source
           | too.
           | 
           | You can't trust anything unless you built it yourself [0].
            | Just because I tell you that a binary is built from the
            | source code does not mean it's not backdoored.
           | 
           | [0] Theoretically, reproducible builds provide the same
           | security, but in the end you need to build it yourself to get
            | the hash, at which point you just replace one `cp` with two
            | `sha256sum` runs.
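            | 
            | A minimal sketch of that reproducible-build comparison,
            | assuming you have both the shipped binary and your own
            | rebuild on disk (the paths are placeholders):
            | 
            |     import CryptoKit
            |     import Foundation
            |     
            |     // Compare the hash of the binary you were shipped with the hash
            |     // of the binary you (or an independent builder) produced from
            |     // the published source.
            |     func sha256Hex(of path: String) throws -> String {
            |         let data = try Data(contentsOf: URL(fileURLWithPath: path))
            |         return SHA256.hash(data: data)
            |             .map { String(format: "%02x", $0) }
            |             .joined()
            |     }
            |     
            |     let shipped = try! sha256Hex(of: "/tmp/app-shipped")   // placeholder
            |     let rebuilt = try! sha256Hex(of: "/tmp/app-rebuilt")   // placeholder
            |     print(shipped == rebuilt ? "binaries match" : "binaries differ")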
        
             | heavyset_go wrote:
             | With open systems and code, you can verify if code is doing
             | the things you expect it to, and if you're paranoid enough,
             | you can choose to build everything from the ground up. If
             | there are enough security conscious people out there for
             | there to be a market for security-focused systems, then
             | companies can come in and offer verified systems and
             | reproducible builds of things those people care about.
             | 
             | With closed systems, the best you can hope for is to treat
             | it like a black box that can and will change out from under
             | you.
        
               | judge2020 wrote:
                | Having a third party come in and perform a build doesn't
                | ensure there's not an attack targeted at you, though. You
                | have no idea if your build is different unless you add
                | PKI & CT logging to the mix (which wouldn't be horrible).
        
               | heavyset_go wrote:
               | My point is that with open systems, you have vastly more
               | options depending on how paranoid you are and who your
               | adversary is compared to closed systems.
        
               | threeseed wrote:
               | We can play this game all day.
               | 
                | 99.999% of open source projects rely on GitHub Actions,
                | CircleCI, Travis CI, etc. to build their code. Those are
                | all proprietary, so unless you are running your own CI/CD
                | stack you can't trust the code.
               | 
               | And of course every open source project relies on
               | libraries. So you need to make sure this applies equally
               | to them as well.
        
             | dmurray wrote:
             | Trusting a million eyes is acceptable for most people's
             | threat models. I trust (Apple AND a community of people who
             | can tell me the hash of the build) much more than I trust
             | Apple alone.
             | 
             | It's like if you're trying to figure out how a magician
             | does a trick, "he must have a collaborator in the audience"
             | is a reasonable guess. "Everyone in the audience other than
             | me is a collaborator" is usually not.
        
             | JadeNB wrote:
             | > You can't trust anything unless you built it yourself
             | [0].
             | 
             | As "Reflections on trusting trust" (https://www.cs.cmu.edu/
             | ~rdriley/487/papers/Thompson_1984_Ref...) points out, you
             | can't trust anything _including_ if you built it yourself,
              | at least assuming you mean 'build' in the software sense.
             | Even in the hardware sense, it's probably beyond the reach
             | of anyone, let alone any individual actor, to build a
             | modern computing machine from bare metal in a way that
             | involves no trust of a third party.
        
             | [deleted]
        
             | api wrote:
             | Nope. Not there yet. You also have to audit the source and
              | make sure someone hasn't slipped in a back door or an
              | intentionally introduced bug.
        
               | ummonk wrote:
               | And of course have to audit the hardware design, and then
               | audit the manufacturing and supply chain for that
               | hardware to ensure someone hasn't substituted hardware
               | with spyware installed.
        
       | giantrobot wrote:
       | I'm continually surprised that people can't seem to understand
       | E2EE. For whatever reason they assume it means a message is
       | encrypted forever and unreadable by anyone.
       | 
       | There is zero guarantee from _any_ E2EE system that the data is
       | encrypted at rest by the sender and receiver. In fact in most
       | cases, the data is _not_ encrypted at rest because people want to
       | do silly things like read messages.
       | 
       | The exact same vulnerability exists on every platform that's
       | automatically backing up local data to _the cloud_. Even if _you_
        | disable cloud backups, you're still stuck if whoever you're
       | messaging has left them enabled.
       | 
       | The only meaningful way around this hole when it comes to
       | messaging apps is row-level encryption on the backing store. This
       | has a lot of problems of its own and potential holes when it
       | comes to indexing and searching.
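        | 
        | A minimal sketch of that row-level approach, assuming a per-user
        | key the backing store never sees (illustrative only; search and
        | indexing are exactly the parts this leaves unsolved):
        | 
        |     import CryptoKit
        |     import Foundation
        |     
        |     // Each message body is sealed before the row ever reaches the
        |     // backing store; the store only holds ciphertext.
        |     struct StoredRow { let id: Int; let ciphertext: Data }
        |     
        |     let rowKey = SymmetricKey(size: .bits256)   // held by the user, not the store
        |     
        |     func encryptRow(id: Int, body: String) -> StoredRow {
        |         let sealed = try! AES.GCM.seal(Data(body.utf8), using: rowKey)
        |         return StoredRow(id: id, ciphertext: sealed.combined!)
        |     }
        |     
        |     func decryptRow(_ row: StoredRow) -> String {
        |         let box = try! AES.GCM.SealedBox(combined: row.ciphertext)
        |         return String(decoding: try! AES.GCM.open(box, using: rowKey),
        |                       as: UTF8.self)
        |     }
        |     
        |     let row = encryptRow(id: 1, body: "see you at 6")
        |     print(decryptRow(row))   // only the key holder can read or index this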
        
       | TameAntelope wrote:
       | I find this acceptable. My threat model includes pickpockets and
       | nosy siblings. It doesn't include nation states and highly
       | sophisticated attacks.
       | 
       | If the government wants to look at my data, and has gone through
       | the proper channels to do so, I believe that, generally, that
       | system will protect me from a consequential privacy intrusion.
       | It's not a perfect system, but I believe the benefits of the
       | power of subpoena are worth the costs, so I'm happy to
       | participate in it.
        
       | sschueller wrote:
       | These Apple privacy ads [1] are not aging well and they aren't
       | even old.
       | 
       | [1] https://youtu.be/lHcf9ZkJ28o
        
         | threeseed wrote:
          | iCloud Backup not being end-to-end encrypted is not new, and it
          | is necessary.
          | 
          | Apple's ads will age fine since they invest far more effort in
          | keeping your data private than any other OEM. Even their CSAM
          | effort, which blew up, is far better than other companies'
          | server-side scanning.
        
           | franczesko wrote:
           | > they invest far more effort in keeping your data private
           | than every other OEM.
           | 
           | Especially in China... Apple's privacy claim sounds silly,
            | knowing that they gave up on protecting users' data where it
            | truly matters - against government abuse - and focus mostly
            | on the least dangerous threat (if it's a threat at all):
            | online advertising.
        
         | smoldesu wrote:
         | My personal favorite is the 'what happens on your iPhone, stays
         | on your iPhone' one.
        
           | zepto wrote:
           | That hasn't changed.
        
             | kitkat_new wrote:
             | in being not true?
             | https://arstechnica.com/gadgets/2021/03/android-
             | sends-20x-mo...
        
               | zepto wrote:
               | Weird - that link has no obvious relevance.
        
             | smoldesu wrote:
             | Well, now it's being hashed and logged by Apple, by
             | default.
        
               | threeseed wrote:
               | Only if you use iCloud Photo Library i.e. you choose to
               | have your photos hosted by Apple.
               | 
               | In which case Apple is no different to every other cloud
               | company which scans for CSAM.
        
       ___________________________________________________________________
       (page generated 2021-08-28 23:01 UTC)