[HN Gopher] Apple's plan to "think different" about encryption opens a backdoor to your life
       ___________________________________________________________________
        
       Apple's plan to "think different" about encryption opens a backdoor
       to your life
        
       Author : bbatsell
       Score  : 2127 points
       Date   : 2021-08-05 20:20 UTC (1 day ago)
        
 (HTM) web link (www.eff.org)
 (TXT) w3m dump (www.eff.org)
        
       | n_io wrote:
       | This is exactly the event that I've been preparing for. I figured
       | out long ago that it's not a matter of if, but when, Apple fully
       | embraces the surveillance economy. This seems to be a strong step
       | in that direction. As dependent as I've been on the Apple
       | ecosystem, I've been actively adopting open source solutions in
       | place of the Apple incumbents so that when I have to fully pull
       | the plug, I can at least soften the blow.
       | 
       | In place of Mail: Tutanota
       | In place of iMessage: Signal
       | And so on...
        
       | chinchilla2020 wrote:
       | Child abusers are dumb, but smart enough to know not to upload
       | pictures to the cloud.
       | 
       | If I were the conspiracy type, I would assume this is more
       | likely to be Apple responding to an NSA request to decrypt
       | data.
       | 
       | This idea will be gradually expanded:
       | 
       | 1. To detect child abuse (unsuccessfully)
       | 
       | 2. To detect terrorism (also unsuccessfully)
       | 
       | 3. To detect criminal activity (successful only against low-level
       | criminals)
       | 
       | 4. To detect radical political views as defined by the Apple
       | corporation
       | 
       | 5. To detect human behaviors that are not supported by Apple's
       | corporate vision
        
         | dragonwriter wrote:
         | > Child abusers are dumb, but smart enough to know not to
         | upload pictures to the cloud.
         | 
         | No, they aren't, categorically. That's why they keep getting
         | caught that way.
        
           | chinchilla2020 wrote:
           | Very few get caught that way. Most of the major cases involve
           | seizures of offline hard drives.
        
             | dragonwriter wrote:
             | > Very few get caught that way.
             | 
             | Maybe, I haven't seen any numbers. (I've seen several cases
             | of email or cloud providers IDing specific content and
             | tipping off law enforcement, but not aggregate stats.)
             | 
             | > Most of the major cases involve seizures of offline
             | hard drives.
             | 
             | Are most cases major cases? Are even most of the
             | individuals caught caught in major cases? (I doubt it; the
             | number of publicized major cases, the number claimed
             | caught in each, and the total number of cases don't seem
             | to line up with that.)
             | 
             | And even for the major cases, how do they get the initial
             | leads that they work back to?
        
       | tw600040 wrote:
       | I wish there existed some decentralized device that could do
       | iCloud-style backups, so people could just buy that device and
       | set it up in their home.
        
       | hmwhy wrote:
       | And, in the meantime, Roblox is promoted in the App Store.
       | 
       | For context, see https://news.ycombinator.com/item?id=20620102
        
       | gcanyon wrote:
       | Are child porn viewers actually going to use iCloud backup?
       | That seems like something even the stupidest person would know
       | not to do.
       | 
       | So I'll propose an alternative theory: Apple is doing this not to
       | actually catch any child pornographers, but to ensure that any CP
       | won't actually reach their servers. Less public good, more self-
       | serving.
        
       | lovelyviking wrote:
       | - Apple: Dear User, We are going to install Spyware Engine in
       | your device.
       | 
       | - User: Are you out of your f... mind?
       | 
       | - Apple: It's for child protection.
       | 
       | - User: Ah, ok, no problem, please install spyware and do later
       | whatever you wish and forget about any privacy, the very basis of
       | rights, freedom and democracy.
       | 
       | This is, by the way, how Russia started filtering the web of
       | political opposition. All the necessary controls were put in
       | place under the same slogan: "to protect children".
       | 
       | Yeah, right.
       | 
       | Are modern people so naive and dumb that they can't think two
       | steps ahead? Is that why it's happening?
       | 
       | Edit: Those people would still need to explain how growing up
       | in a society without privacy, freedom, or democracy, under
       | authoritarian practices, will make those children any 'safer'
       | ...
        
       | mcone wrote:
       | I wish there was a privacytools.io for hardware. I've been an
       | iPhone user since the beginning but now I'm interested in
       | alternatives. Last I checked, PinePhone was still being actively
       | developed. Are there any decent phones that strike a balance
       | between privacy and usability?
        
         | teddyh wrote:
         | The Librem 5[1] is both more powerful than the PinePhone, and
         | is slated[2] to get RYF certification[3] from the FSF.
         | 
         | 1. https://puri.sm/products/librem-5/
         | 
         | 2. https://puri.sm/posts/librem-5-update-shipping-estimates-
         | and...
         | 
         | 3. https://ryf.fsf.org/
        
           | system2 wrote:
           | I was so excited after reading the specs but damn $899 is a
           | lot.
        
         | Knighttime wrote:
         | There are tons of devices compatible with LineageOS. I suggest
         | taking a look there. https://lineageos.org/
        
           | kivlad wrote:
           | I'd go a step further and recommend https://grapheneos.org/
           | with a Pixel phone.
        
             | Knighttime wrote:
             | That too! It's restricted to Pixel devices though, and (I'm
             | not 100% sure on this; at the very least it doesn't include
             | it) doesn't support things like MicroG, which is a must for
             | getting some apps that rely on Play Services to work
             | correctly. I really think Graphene is only good for
             | hardcore privacy and security enthusiasts, or for
             | situations that actually require the security. I guess it
             | just depends on how much convenience you want to sacrifice.
        
               | Dracophoenix wrote:
               | CalyxOS is another option. It's hardened but also has
               | MicroG installed.
        
             | josh_today wrote:
               | Serious question: how can anyone know these operating
             | systems are truly secure? Is there a way to test the source
             | code? From a code perspective could Google have placed a
             | back door in Android to access these forks?
        
               | fragileone wrote:
               | You can compile it from the source code yourself if you
               | want. Realistically speaking there may be a backdoor in
               | closed-source Google Play Services, but not in the open-
               | source AOSP project.
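                | 
                | For what it's worth, GrapheneOS aims for reproducible
                | builds, so in principle you can build the OS yourself
                | and compare your artifact against the published one. A
                | toy sketch of just the comparison step (file names are
                | hypothetical, and real verification also has to account
                | for signing):
                | 
                |     import hashlib
                | 
                |     def sha256(path):
                |         """Stream a file through SHA-256."""
                |         h = hashlib.sha256()
                |         with open(path, "rb") as f:
                |             for c in iter(lambda: f.read(1 << 20), b""):
                |                 h.update(c)
                |         return h.hexdigest()
                | 
                |     # Compare a local build against the official image.
                |     mine = sha256("out/my-build/image.zip")
                |     theirs = sha256("downloads/official-image.zip")
                |     print("match" if mine == theirs else "differs")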
        
       | nicetryguy wrote:
       | I'm looking forward to this platform being expanded to facially
       | ID against more databases such as criminals, political
       | dissenters, or anyone with an undesirable opinion so that SWAT
       | teams can barge into the homes of false positive identifications
       | to murder them and their dogs.
        
       | Clubber wrote:
       | What are some options for phones that don't spy on me or my
       | children?
        
         | whycombagator wrote:
         | Android device (pixel for example) and graphene or lineage OS.
        
       | suizi wrote:
       | The FBI doesn't even have the resources to review all the reports
       | they _do_ get (we learned that in 2019), and yet they want to
       | intrude on everyone's rights to get even more to investigate
       | (which they won't).
        
       | haskaalo wrote:
       | At this point, I think phones can be compared to a home in terms
       | of privacy.
       | 
       | In your house, you might have private documents, or do things
       | you don't want other people to see, just like what we have on
       | our phones nowadays.
       | 
       | The analogy I'm trying to make: if the government suddenly
       | decided to install cameras in every house on the premise of
       | making sure no pedophile is abusing a child, promising that the
       | cameras never send data unless the AI running locally detects
       | something, I believe that would shock everyone.
        
         | mackrevinack wrote:
         | It's a good analogy that's useful for a lot of things. If
         | someone was standing across the road from your house with a
         | telescope, writing down every TV show or movie you watched, I
         | think most people would be very angry about that. But when
         | people hear they are being profiled online in the same way,
         | they are not bothered at all.
         | 
         | It doesn't help that most things online are very abstract,
         | with terms like 'the cloud' (which in reality is just someone
         | else's computer) making things even harder to understand.
        
         | fsflover wrote:
         | https://news.ycombinator.com/item?id=24463347
        
         | decebalus1 wrote:
         | > At this point, I think phones can be compared to a home in
         | terms of privacy.
         | 
         | Unfortunately, the law hasn't really kept up with technology.
         | Let's hope this gets in front of a judge who's able to
         | extrapolate some 'digital' rights from the (outdated)
         | constitution. Unless of course they also 'think of the
         | children'.
        
       | barrkel wrote:
       | Once this tech is implemented, courts will direct it to be used
       | in situations Apple did not intend. Apple will have created a
       | capability and the courts will interpret refusal to expand its
       | use as contempt.
        
       | christkv wrote:
       | Why don't they just run their trained classifier on the phone
       | itself to do this stuff? There should not be any need to do this
       | on the server no matter what they say.
        
       | [deleted]
        
       | babesh wrote:
       | Apple is part of the power structure of the US. That means that
       | it has a hand in shaping the agenda for the US but with that
       | power comes the responsibility to carry out the agenda.
       | 
       | This also means that it is shielded from attack by the power
       | structure. That is the bargain that the tech industry has struck.
       | 
       | The agenda is always towards increasing power for the power
       | structure. One form of power is information. That means that
       | Apple is inexorably drawn towards increasing surveillance. Also,
       | Apple's massive customer base both domestic and overseas is a
       | juicy surveillance target.
        
         | babesh wrote:
         | And if you don't believe me, ask yourself who holds the keys to
         | iCloud data for both foreign and domestic customers. Ask Apple
         | if it has ever provided data for a foreign customer to the US
         | government. What do you think GDPR is for?
         | 
         | Hint: it isn't end to end encrypted, Apple doesn't need your
         | password to read the information, and you will never know
         | 
         | Who the frack would design a system that way and why?
        
         | babesh wrote:
         | The die was cast with the 2020 elections when Apple decided to
         | get
         | into the fray. Much of tech also got into the fray. Once they
         | openly decided to use their power, they couldn't get back out.
        
       | farmerstan wrote:
       | Police routinely get drug sniffing dogs to give false positives
       | so that they are allowed to search a vehicle.
       | 
       | How do we know Apple or the FBI don't do this? If they want to
       | search someone's phone, all they need to do is enter a hash of a
       | photo they know is on the target's phone and voila, instant
       | access.
       | 
       | Also, how is this not a violation of the 14th amendment? I know
       | Apple isn't part of the government but they are basically acting
       | as a de facto agent of the police by scanning for crimes. Using
       | child porn as a completely transparent excuse to start scanning
       | all our material for anything they want makes me very angry.
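       | 
       | For context on what "a hash of a photo" means here: these
       | systems use perceptual hashes, which match visually similar
       | images rather than exact bytes. A toy average-hash sketch
       | (Apple's actual NeuralHash is a neural network, not this, but
       | the matching idea is similar):
       | 
       |     from PIL import Image  # pip install Pillow
       | 
       |     def ahash(path, size=8):
       |         """64-bit hash from an 8x8 grayscale thumbnail."""
       |         img = Image.open(path).convert("L")
       |         px = list(img.resize((size, size)).getdata())
       |         avg = sum(px) / len(px)
       |         bits = 0
       |         for p in px:
       |             bits = (bits << 1) | (p > avg)
       |         return bits
       | 
       |     def hamming(a, b):
       |         return bin(a ^ b).count("1")
       | 
       |     # A small Hamming distance means "same" image even after
       |     # resizing or recompression; that fuzziness is also why
       |     # false positives are possible.
       |     if hamming(ahash("a.jpg"), ahash("b.jpg")) <= 5:
       |         print("match")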
        
         | rajacombinator wrote:
         | > How do we know Apple or the FBI don't do this?
         | 
         | You can almost certainly know that they will do this. Or just
         | text the target a bad photo.
        
         | anonuser123456 wrote:
         | > How do we know Apple or the FBI don't do this?
         | 
         | Because it requires Apple and law enforcement, two separate
         | organizations, to collude against you.
         | 
         | The false positive would have to be affirmed to a court and
         | entered into evidence. If the false positive were found not to
         | match the true image by the court, any warrant etc. would be
         | found invalid and the fruit of any search etc would be invalid
         | as well.
         | 
         | Apple is a private company. By agreeing to use iCloud Photos
         | you agree to their terms, so there's no 14th Amendment
         | violation.
        
           | zionic wrote:
           | Wow, you have a lot of faith in our legal system that is >90%
           | plea deals.
        
             | anonuser123456 wrote:
             | I do not have faith in our legal system. I have faith that,
             | if Apple wanted to build a system to help frame its users,
             | it could do so at anytime and it certainly wouldn't
             | advertise it.
             | 
             | LE only gets involved when Apple makes a determination that
             | 1) you have a high number of hash collisions for
             | exploitative material and 2) those images correspond to
             | actual exploitative material.
             | 
             | So if you are innocent, Apple would have to make the
             | decision to screw you over and frame you. And... they would
             | have to manufacture evidence against you since
             | investigators need actual evidence to get warrants. Hash
             | collisions are not evidence. They can do this today if they
             | want. Apple can simply add illegal content to your iCloud
             | drive and then report you to LE. But they don't seem to be
             | doing that.
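             | 
             | For what it's worth, Apple's technical summary describes
             | a threshold scheme: the match vouchers only become
             | readable once the number of matches crosses a threshold.
             | A toy Shamir secret-sharing sketch of that idea (not
             | Apple's actual construction):
             | 
             |     import random
             | 
             |     P = 2**127 - 1  # prime field for the toy example
             | 
             |     def make_shares(secret, k, n):
             |         """Any k of n shares reconstruct the secret."""
             |         cs = [secret] + [random.randrange(P)
             |                          for _ in range(k - 1)]
             |         def f(x):
             |             return sum(c * pow(x, i, P)
             |                        for i, c in enumerate(cs)) % P
             |         return [(x, f(x)) for x in range(1, n + 1)]
             | 
             |     def reconstruct(shares):
             |         """Lagrange interpolation at x = 0."""
             |         total = 0
             |         for xi, yi in shares:
             |             num = den = 1
             |             for xj, _ in shares:
             |                 if xj != xi:
             |                     num = num * -xj % P
             |                     den = den * (xi - xj) % P
             |             total += yi * num * pow(den, -1, P)
             |         return total % P
             | 
             |     # One share rides along with each matching image;
             |     # below k matches, the key stays hidden.
             |     shares = make_shares(123456789, k=10, n=30)
             |     assert reconstruct(shares[:10]) == 123456789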
        
               | Cipater wrote:
               | Did Apple advertise this feature or was it leaked?
        
           | decebalus1 wrote:
           | > Because it requires Apple and law enforcement, two separate
           | organizations, to collude against you.
           | 
           | Does it really? As I understand it, the thing is pretty one-
           | sided. Who manages and governs the collection of 'hashes'? If
           | it's law enforcement there's no collusion needed. Also,
           | someone can just text you such a photo, or some 0-day
           | exploiting malware (of which governments have a bunch) would
           | plant one on your phone.
           | 
           | > The false positive would have to be affirmed to a court and
           | entered into evidence. If the false positive were found not
           | to match the true image by the court, any warrant etc. would
           | be found invalid and the fruit of any search etc. would be
           | invalid as well.
           | 
           | All of this would happen after you're arrested, labeled a
           | pedo and have your life turned upside down. All of which can
           | be used to coerce a suspect into becoming an informant, plead
           | guilty to some unrelated charge or whatever. This type of
           | thing opens the door to a whole new world of abuse.
        
       | jason2323 wrote:
       | What's the alternative here? What other viable alternative
       | operating system will we use?
        
         | fragileone wrote:
         | Android ROMs like GrapheneOS for now, mobile Linux distros in
         | the near future.
        
       | rotbart wrote:
       | As a former 13-year-old, that would be the end of 13-year-olds
       | using iMessage... I smell an opportunity.
        
       | beebeepka wrote:
       | My fellow Earthicans, we enjoy so much freedom, it's almost
       | sickening. We're free to choose which hand our sex-monitoring
       | chip
       | is implanted in.
        
       | goatse-4-this wrote:
       | Where's that knob tptacek telling us why we're all paranoid
       | simpletons for not using Apple?
        
       | gowld wrote:
       | I get the concern, but "Corporation X can be compromised by the
       | State, which is evil" is not a problem with the corporation. It's
       | a problem with your civilization.
       | 
       | If you don't trust the rule of law, Apple can't fix that for you.
        
       | _robbywashere wrote:
       | This is waaaay too turnkey for searching images on our devices
       | that someone/something doesn't like. Absolutely terrifying.
        
       | strictnein wrote:
       | This is an excellent example of how far off the rails the EFF has
       | gone. This is completely false:
       | 
       | > "Apple is planning to build a backdoor into its data storage
       | system and its messaging system"
        
         | Kaytaro wrote:
         | How so? That's literally what it is.
        
           | shuckles wrote:
           | None of the announcements describe an iMessage back door,
           | even if you're being extremely generous about what back door
           | means.
        
       | tango-unchained wrote:
       | Advancement of the surveillance state is especially terrifying
       | after this past summer of police abuse. We already know that in
       | our country people in power abuse their authority and nothing
       | happens (unless international protests prompt an action). This
       | just collects more power under the disgusting guise of "won't
       | somebody think of the children" while calling the people opposed
       | pedophile supporters.
       | 
       | Does anybody have recommendations on what to do to help oppose
       | this instead of just feeling helpless?
        
         | fragileone wrote:
         | Vote with your wallet and don't give Apple another cent.
        
       | thedream wrote:
       | The Cult Of The Apple hawks its slimy surveillance Snake Oil to a
       | gluttonous throng of thralls.
       | 
       | So where's the news?
        
       | shrimpx wrote:
       | From Apple's original text[0]:
       | 
       | > Apple's method of detecting known CSAM is designed with user
       | privacy in mind. Instead of scanning images in the cloud, the
       | system performs on-device matching [...]
       | 
       | It's incredible that Apple arrived at the conclusion that client-
       | side scanning that you cannot prevent is _more private_ than
       | cloud-scanning.
       | 
       | Since they claim they're only scanning iCloud content, why not
       | scan in the cloud?
       | 
       | They decided the most private way is to scan iCloud content
       | before it's uploaded to the cloud... Because if they scanned in
       | the cloud it would be seen as a breach of privacy and is bad
       | optics for a privacy-focused company? But scanning on the
       | physical device that they have described as "personal" and
       | "intimate" has better optics? That's amazing.
       | 
       | This decision can only be read as Apple paving the way to
       | scanning all content on the device, to bypass the pesky "Backup
       | to iCloud" options being turned off.
       | 
       | [0] https://www.apple.com/child-safety/
        
         | vmladenov wrote:
         | > Since they claim they're only scanning iCloud content, why
         | not scan in the cloud?
         | 
         | Because (I suspect) this is a precursor to E2E-encrypted
         | iCloud Photos. Apple cannot plausibly claim it does not store
         | malicious E2EE content on its servers without some kind of
         | filter upon upload. This is that filter. Other services,
         | including the current implementation of iCloud Photos, skate by
         | because they do not allow E2EE photos.
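         | 
         | A sketch of that "filter upon upload" architecture (toy
         | code: SHA-256 stands in for the perceptual hash and Fernet
         | for the real encryption; in Apple's design the device does
         | not even learn the match result, which is handled with
         | private set intersection):
         | 
         |     import hashlib
         |     from cryptography.fernet import Fernet
         | 
         |     # Hypothetical hash database shipped to the device.
         |     BANNED_HASHES = {"..."}
         | 
         |     def upload(photo, key):
         |         # Stand-in for a perceptual hash like NeuralHash.
         |         digest = hashlib.sha256(photo).hexdigest()
         |         # A "safety voucher" accompanies the blob.
         |         voucher = digest in BANNED_HASHES
         |         # The server stores only ciphertext.
         |         ciphertext = Fernet(key).encrypt(photo)
         |         return ciphertext, voucher
         | 
         |     key = Fernet.generate_key()
         |     blob, flagged = upload(b"...jpeg bytes...", key)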
        
       | everyone wrote:
       | When you upload any build to the App Store, before you can have
       | it in
       | testflight or submit it for release, you have to fill out this
       | questionnaire asking "does your app use encryption?" If you say
       | yes, you're basically fucked, good luck releasing it.. You have
       | to say no as far as I'm aware.
        
       | arihant wrote:
       | I'm very concerned that a bunch of false positives will send
       | people's nudes to Apple for manual review. I don't trust Apple's
       | on-device ML for something this sensitive. I also can't imagine
       | that Apple won't now be forced to implement government-mandated
       | filtering and reporting on iMessage. And this will likely affect
       | others like WhatsApp, because now governments know that there is
       | a way to do this on E2E.
       | 
       | What are some other fully encrypted photo options out there?
        
       | mactavish88 wrote:
       | What kind of amateur criminal would store illegal material in
       | their iCloud account?
        
       | [deleted]
        
       | Wowfunhappy wrote:
       | This isn't the _biggest_ issue at play, but one detail I can't
       | stop thinking about:
       | 
       | > If an account held by a child under 13 wishes to send an image
       | that the on-device machine learning classifier determines is a
       | sexually explicit image, a notification will pop up, telling the
       | under-13 child that their parent will be notified of this
       | content. [...] For users between the ages of 13 and 17, a similar
       | warning notification will pop up, though without the parental
       | notification.
       | 
       | Why is it different for children under 13, specifically? The
       | 18-year cutoff makes sense, because turning 18 carries legal
       | weight in the US (as decided via a democratic process), but 13?
       | 
       | 13 is an age when _many_ parents start granting their children
       | more freedom, but that's very much rooted in one's individual
       | culture--and the individual child. By giving parents fewer
       | options for 13-year-olds, Apple--a private company--is pushing
       | their views about parenting onto everyone else. I find that a
       | little disturbing.
       | 
       | ---
       | 
       | Note: I'm not (necessarily) arguing for greater restrictions on
       | 13-year-olds. Privacy for children is a tricky thing, and I have
       | mixed feelings about this whole scheme. What I know for sure,
       | however, is that I don't feel comfortable with Apple being the
       | one to decide "this thing we've declared an appropriate invasion
       | of privacy for a 12-year-old is not appropriate for a 13-year-
       | old."
        
         | websites2023 wrote:
         | The feature is opt-in. So, Apple isn't forcing anyone to do
         | anything.
        
           | Wowfunhappy wrote:
           | But you have fewer options if your child is 13 years old. Or
           | am I misunderstanding the article?
        
             | websites2023 wrote:
             | Parents need to expressly opt in to Communication Safety
             | when setting up a child's device with Family Sharing, and
             | it can be disabled if a family chooses not to use it.
        
               | Wowfunhappy wrote:
               | But what if the family _wants_ to use it on the device of
               | a 13-year-old child?
        
               | websites2023 wrote:
               | Parents cannot be notified when a child between the ages
               | of 13 and 17 views a blurred photo, though children that
               | are between those ages will still see the warning about
               | sensitive content if Communication Safety is turned on.
        
               | Wowfunhappy wrote:
               | Right, but that's my point about Apple making choices
               | around an arbitrary age cutoff.
        
               | websites2023 wrote:
               | I kind of agree with you there. Isn't it just another
               | toggle? Surely if the entire feature is opt-in, the
               | blurring could be a toggle, as well as the notification.
        
               | Wowfunhappy wrote:
               | The article didn't say that these age differences were
               | adjustable defaults, so I would presume not. If Apple put
               | in these restrictions to protect the privacy of older
               | children, making it adjustable would defeat the point.
               | 
               | (And by the way, I _respect_ Apple's desire to protect
               | children's privacy from their parents, but forcing the
               | change at 13, for everyone, seems awfully prescriptive.
               | It's fundamentally a parenting decision, and Apple should
               | not be making parenting decisions.)
               | 
               | But if this can in fact be controlled by the parent
               | regardless of the child's age, that does resolve the
               | problem!
        
         | BluSyn wrote:
         | 13 isn't an arbitrary cut-off. It's established as law in the
         | US under COPPA. Similar to how 18 is the cut off for the other
         | features. Other countries may have different age ranges
         | according to local laws.
        
           | Wowfunhappy wrote:
           | Is COPPA relevant to this feature, though?
           | 
           | 18, to me, is different, because it's the point when your
           | parents have no legal authority, so of course they shouldn't
           | have any say over how you use your phone.
        
             | BluSyn wrote:
             | It's relevant in that under 13, all online activity can
             | legally be monitored by a parent/guardian. For 13-17 it's a
             | gray area.
             | 
             | Some countries have adulthood set at age 20. All the
             | numbers are arbitrary in some sense, but local laws have to
             | dictate that.
        
               | Wowfunhappy wrote:
               | I know nothing about this, but I'm very surprised that
               | any type of supervision--which is what this really is--
               | would be legally unclear, for anyone who is a minor.
        
         | Ajedi32 wrote:
         | Yeah, the "your phone will check your personal files against an
         | opaque, unauditable, government-provided database and rat you
         | out if it gets a match" part of this is very concerning, but I
         | don't buy the EFF's arguments against the new parental control
         | features. End-to-end encrypted or not, if you're sending
         | messages to a minor you should expect that their parents can
         | read those messages.
        
         | jtsiskin wrote:
         | COPPA. Apple necessarily already has special under-13
         | settings. There's also PG-13, etc.
        
       | rStar wrote:
       | I'm ashamed of every single Apple employee who worked to make
       | this happen. Their work will be used to subjugate the most
       | vulnerable among us. I hope you all hate yourselves forever for
       | your cowardice and immorality.
        
       | hncurious wrote:
       | Apple employees successfully pressured their employer to fire a
       | new hire and are petitioning to keep WFH.
       | 
       | https://www.vox.com/recode/2021/5/13/22435266/apple-employee...
       | 
       | https://www.vox.com/recode/22583549/apple-employees-petition...
       | 
       | Will they apply that energy and leverage to push back on this?
       | 
       | How else can this be stopped before it goes too far? Telling
       | people to "Drop Apple" is even less effective than "Delete
       | Facebook".
        
         | lijogdfljk wrote:
         | I doubt this will be as clean. A large swath of people will
         | defend this "for the children".
        
           | system2 wrote:
           | Definitely. That's why they use the most vulnerable subject.
           | I can't think of anything more sensitive than this. Every
           | parent would be okay with this.
        
       | blintz wrote:
       | One disappointing development from a larger perspective is that
       | many privacy-preserving technologies (multi-party computing,
       | homomorphic encryption, hardware enclaves, etc) are actually
       | getting used to build tools that undermine once-airtight privacy
       | guarantees. E2E starts to become... whatever this is.
       | 
       | A more recent example is how private set intersection became an
       | easy way to get contact tracing tech everywhere while maintaining
       | an often perfunctory notion of privacy.
       | 
       | I wonder where large companies will take this next. It behooves
       | us cryptography/security people who actually care about not
       | walking down this slippery slope to fight back with tech of our
       | own.
       | 
       | This whole thing also somewhat parallels the previous uses of
       | better symmetric encryption and enclaves technologies for DRM and
       | copyright protection.
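       | 
       | For readers unfamiliar with private set intersection: the
       | classic Diffie-Hellman construction lets two parties learn
       | which elements they share and nothing else. A toy sketch,
       | illustration only, not a vetted implementation:
       | 
       |     import hashlib, math, random
       | 
       |     # Toy parameters: a known Mersenne prime. Real PSI would
       |     # use a standardized large group or an elliptic curve.
       |     P = 2**61 - 1
       | 
       |     def keygen():
       |         """Exponent coprime to the group order, so that
       |         blinding is a bijection on the group."""
       |         while True:
       |             k = random.randrange(2, P - 1)
       |             if math.gcd(k, P - 1) == 1:
       |                 return k
       | 
       |     def h2g(item):
       |         """Hash an item to a nonzero group element."""
       |         d = hashlib.sha256(item).digest()
       |         return int.from_bytes(d, "big") % (P - 1) + 1
       | 
       |     a, b = keygen(), keygen()  # each party's secret
       |     alice = [b"img1", b"img2"]
       |     bob = [b"img2", b"img3"]
       | 
       |     # Each side blinds its items, then the other side blinds
       |     # them again; H(x)^(a*b) matches iff the items match, but
       |     # neither side ever sees the other's raw values.
       |     aa = {pow(pow(h2g(x), a, P), b, P) for x in alice}
       |     bb = {pow(pow(h2g(y), b, P), a, P) for y in bob}
       |     print(len(aa & bb), "shared item(s)")  # -> 1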
        
       | hungryforcodes wrote:
       | Am I being cynical?
       | 
       | https://techcrunch.com/2021/04/28/apple-record-china-2021/
       | 
       | Apple's iPhone revenue just doubled from last year in China --
       | now 17 billion. That's not a small number. The play against
       | Huawei has done its job, apparently -- it's quite mortally
       | injured.
       | 
       | For sure the CCP would love to scan everyone's phones for files
       | or images it finds troubling and for sure every country will
       | eventually be allowed to have its own entries in this database or
       | even their own custom DB.
       | 
       | So my cynical side says... Apple just sold out. MASSIVELY. The
       | losers -- pretty much everyone that buys their phones.
        
         | benzoate wrote:
         | The CCP can already scan everything server side - iCloud
         | encryption is weaker in China and the servers are controlled by
         | a different entity than Apple. Getting iPhones to scan for
         | illicit content doesn't help the CCP.
        
           | hungryforcodes wrote:
           | If I keep all my data sequestered on my phone -- which I'm
           | bound to do if I am privacy conscious -- then obviously
           | scanning the phone benefits the CCP.
        
       | mccorrinall wrote:
       | They are putting their own users under surveillance. Didn't
       | expect that from Apple.
        
       | thysultan wrote:
       | All that expansive "privacy" marketing undone by a single move.
        
       | scratchmyhead wrote:
       | If Apple broadcasts their surveillance strategy so publicly,
       | wouldn't criminals stop using Apple products and delete their
       | iCloud data immediately? Who will be left to "catch" at that
       | point? The most incompetent criminals?
       | 
       | I'm missing how this will actually work if perpetrators knew
       | Apple was going to analyze their data beforehand. Could someone
       | explain?
        
       | anupamchugh wrote:
       | By notifying parents when children under 13 send or view abusive
       | images, it looks like Apple wants to be both the police and the
       | parent of iPhone owners.
        
       | xbar wrote:
       | Police-state-designed device.
        
       | djanogo wrote:
       | Why didn't Apple just add an option in Screen Time to block all
       | images in iMessage? That would have let parents choose what's
       | best for their kids.
        
         | xanaxagoras wrote:
         | Because then they wouldn't be able to hook these children in
         | like
         | junkies as easily.
         | 
         | Having a hard time buying this is about "the kids" or children
         | in any way, shape or form. This is typical erosion of privacy
         | under a worn out flag, just more emotional manipulation.
         | 
         | Have you seen what smartphones have done to people, especially
         | children? Apple, Google, Facebook, Twitter, the whole lot of
         | them. They are out to destroy children, not save them. If they
         | thought they could "generate" 1 more dollar in "value" they'd
         | be selling these abhorrent images to the highest bidder.
        
       | triska wrote:
       | I remember an Apple conference where Tim Cook personally assured
       | us that Apple is fully committed to privacy, that everything is
       | so secure because the iPhone is so powerful that all necessary
       | calculations can happen on the device itself, and that we are
       | "not the product". I think the Apple CEO said some of this in the
       | specific context of speech processing, yet it seemed a specific
       | case of a general principle upheld by Apple.
       | 
       | I bought an iPhone because the CEO seemed to be sincere in his
       | commitment to privacy.
       | 
       | What Apple has announced here seems to be a complete reversal
       | from what I understood the CEO saying at the conference only a
       | few years ago.
        
         | avnigo wrote:
         | I'm still waiting on the iCloud backup encryption they promised
         | a
         | while back. There were reports that they scrapped those plans
         | because the FBI told them to, but nothing official announced
         | since 2019 on this.
        
           | minsc__and__boo wrote:
           | Yet Apple gave access to all Chinese users' iCloud data to
           | the Chinese government, including messages, emails, pictures,
           | etc.
           | 
           | NYT Daily had an episode where they talked about how the CCP
           | is getting Apple to bend its commitment to privacy:
           | 
           | https://www.nytimes.com/2021/06/14/podcasts/the-
           | daily/apple-...
        
           | rudian wrote:
           | Last I heard on HN was that it was scrapped entirely as a
           | consequence of some event I don't remember. I hope someone
           | has a better memory than I do.
        
         | Klonoar wrote:
         | I think the EFF is probably doing good by calling attention to
         | the issue, but let's... actually look at the feature before
         | passing judgement, e.g:
         | 
         | https://twitter.com/josephfcox/status/1423382200880439298/ph...
         | 
         | - It's run for Messages in cases where a child is potentially
         | viewing sexually explicit material.
         | 
         | - It's run _before upload to iCloud Photos_ - where it would've
         | already been scanned anyway, as they've done for years (and as
         | all other major companies do).
         | 
         | To me this really doesn't seem that bad. Feels like a way to
         | actually reach encrypted data all around while still meeting
         | the expectations of lawmakers/regulators. Expansion of the tech
         | would be something I'd be more concerned about, but considering
         | the transparency of it I feel like there's some safety.
         | 
         | https://www.apple.com/child-safety/ more info here as well.
        
           | aaomidi wrote:
           | > - It's run _before upload to iCloud Photos_ - where it
           | would've already been scanned anyway, as they've done for
           | years (and as all other major companies do).
           | 
           | Then why build this functionality at all? Why not wait until
           | it's uploaded and check it on their servers and not run any
           | client side code? This is how literally every other non-
           | encrypted cloud service operates.
        
             | Klonoar wrote:
             | I assume (and this is my opinion, to be ultra-clear) that
             | it's a blocker for E2E encryption. As we've seen before,
             | they wanted to do it but backed off after government
             | pressure. It wouldn't surprise me if this removes a
             | blocker.
             | 
             | Apple has shown that they prefer pushing things to be done
             | on-device, and in general I think they've shown it to be a
             | better approach.
        
               | aaomidi wrote:
               | That really makes little to no sense - it's not E2EE if
               | you're going to be monitoring files that enter the
               | encrypted storage. That's snake-oil encryption at that
               | point.
               | 
               | I sincerely doubt Apple is planning to do E2EE with
               | iCloud storage considering that really breaks a lot of
               | account recovery situations & is generally a bad UX for
               | non-technical users.
               | 
               | They're also already scanning for information on the
               | cloud anyway.
        
               | Klonoar wrote:
               | Eh, I disagree - your definition feels like moving the
               | goalposts.
               | 
               | Apple is under no obligation to host offending content.
               | Check it before it goes in (akin to a security checkpoint
               | in real life, I guess) and then let me move on with my
               | life, knowing it couldn't be arbitrarily vended out to x
               | party.
        
               | philistine wrote:
               | Going on with your life in this situation means police
               | officers have been given copies of the photos that
               | triggered the checkpoint. Do you want that?
        
               | Klonoar wrote:
               | Any image that would trigger _for this hashing aspect_
               | would already trigger _if you uploaded it to iCloud where
               | they currently scan it already_. Literally nothing
               | changes for my life, and it opens up a pathway to
               | encrypting iCloud contents.
        
               | pseudalopex wrote:
               | Apple's paper talks about decrypting suspect images. It
               | isn't end to end.[1]
               | 
               | [1] https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
        
               | Klonoar wrote:
               | Feel free to correct me if I'm wrong, but this is a
               | method for decrypting _if it's matching an already known
               | or flagged item_. It's not enabling decrypting arbitrary
               | payloads.
               | 
               | From your link:
               | 
               | >In particular, the server learns the associated payload
               | data for matching images, but learns nothing for non-
               | matching images.
               | 
               | Past this point I'll defer to actual cryptographers (who
               | I'm sure will dissect and write about it), but to me this
               | feels like a decently smart way to go about this.
        
               | pseudalopex wrote:
               | Matching means suspect. It doesn't have to be a true
               | match.
               | 
               | It could be worse. But end to end means the middle has no
               | access. Not some access.
        
               | aaomidi wrote:
               | And remember the E2EE is pure speculation at this point.
        
               | 8note wrote:
               | As long as you're using an iPhone, Apple's got access. To
               | be E2E, the screen would still need to be showing the
               | encrypted values, not the real image.
        
               | aaomidi wrote:
               | > To be E2E, the screen still needs to be showing the
               | encrypted values, not the real image
               | 
               | No that is literally not the definition of end to end
               | encryption.
               | 
               | End to end encryption means that only the final
               | recipients of data can see what the data is. In this
               | case, it's the user.
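                | 
                | Mechanically, the property is just that decryption
                | keys live only on the endpoints. A sketch in the style
                | of libsodium/PyNaCl:
                | 
                |     from nacl.public import PrivateKey, Box
                | 
                |     # Secret keys never leave the two devices.
                |     sk_alice = PrivateKey.generate()
                |     sk_bob = PrivateKey.generate()
                | 
                |     # Alice encrypts to Bob's *public* key; a server
                |     # relaying ct has no key that can open it.
                |     ct = Box(sk_alice,
                |              sk_bob.public_key).encrypt(b"hi")
                |     pt = Box(sk_bob,
                |              sk_alice.public_key).decrypt(ct)
                |     assert pt == b"hi"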
        
               | aaomidi wrote:
               | Then don't offer "E2EE"
        
               | rudian wrote:
               | From what I remember iCloud is only encrypted at rest but
               | not E2E. Apple can decrypt it anytime.
               | 
               | The password manager (Keychain) is the only fully
               | encrypted part of iCloud; If you lose your devices or
               | forget the main password, the manager will empty itself.
               | This does not happen with any other part of iCloud.
        
               | Klonoar wrote:
               | ...yes, I'm saying that I think they _can't_ get to E2EE
               | on iCloud without moving something like this client side.
        
           | xienze wrote:
           | > Expansion of the tech would be something I'd be more
           | concerned about
           | 
           | Yeah, and that's precisely what will happen. It always starts
           | with child porn, then they move on to "extremist content", of
           | which the term expands to capture more things on a daily
           | basis. Hope you didn't save that "sad Pepe" meme on your
           | phone.
        
           | kps wrote:
           | > _considering the transparency of it_
           | 
           | What transparency? Apple doesn't publish iOS source.
        
           | mapgrep wrote:
           | > It's run _before upload to iCloud Photos_ - where it
           | would've already been scanned anyway
           | 
           | Right, so ask yourself, why is it on the device? Why not just
           | scan on the server?
           | 
           | To me (agreeing with much of the commentary I've seen) the
           | likeliest answer is that they are confining the scan to
           | pre-upload content now, not for any technical reason, but to
           | make the rollout palatable to the public. Then they're one
           | update away from quietly changing the rules. There's
           | absolutely no reason to do the scan on your private device
           | if they plan to confine it only to stuff they could scan
           | away from your device.
        
             | klipklop wrote:
             | Well for one they get to use YOUR compute power (and
             | energy) to create the hashes. Sounds like a nice way to
             | save money.
        
           | karaterobot wrote:
           | Since nobody would ever object to it, protecting against
           | child abuse gets used as a wedge. As the article points out,
           | the way this story ends is with this very backdoor getting
           | used for other things besides preventing child abuse:
           | anything the government asks Apple to give them. It's an
           | almost inevitable consequence of creating a backdoor in the
           | first place, which is why you have to have a zero-tolerance
           | policy against it.
        
           | therealmarv wrote:
           | Now it will be "before upload". In 1-2 years it's "scan all
           | local photos" in the name of "make the World a better place".
           | It's such a small technical step for Apple to change this
           | scanning behaviour in the future and scan even offline
           | photos. All the necessary software will already be on every
           | Apple i-device by then.
           | 
           | Everybody with photos on their phone is treated as a
           | potential criminal unless they prove otherwise by submitting
           | to scanning. This is the future we are heading toward. Doing
           | the scanning on-device is actually the weakest point of
           | their implementation, IMHO.
        
           | [deleted]
        
           | mulmen wrote:
           | This seems even worse. If the images are only scanned before
           | upload to iCloud then Apple has opened a backdoor that
           | doesn't even give them any new capability. If I am
           | understanding this right an iPhone can still be used to
           | distribute CSAM as long as the user is logged out of iCloud?
           | So it's an overreach and ineffective?
        
           | randcraw wrote:
           | So your argument is, if you've done nothing wrong, you have
           | nothing to worry about. Really? Will you feel the same when
           | Apple later decides to include dozens more crimes that they
           | will screen for, surreptitiously? All of which are searches
           | without warrants or legal oversight?
           | 
           | Let me introduce you to someone you should know better. His
           | name is Edward Snowden. Or Louis Brandeis, who is spinning in
           | his grave right about now.
           | 
           | The US Fourth Amendment exists for a damned good reason.
        
             | fredgrott wrote:
             | Hmm, it seems to me that since most smart criminals
             | understand not to leave a digital footprint, what Apple
             | will catch is those who are idiots and make an honest
             | mistake, and those who are dumb enough to put their
             | illegality online.
             | 
             | So I would ask US lawmakers: why can't the phone companies
             | make the same commitments? The stated reason is that we
             | have bad people doing crime using digital communication
             | devices.
             | 
             | Last time I checked, the digital pipeline, i.e. phone
             | lines, is still under FCC rules, is it not?
             | 
             | If they answer that it's too hard tech-wise, then why
             | can't Apple make the same argument to lawmakers?
        
             | amznthrwaway wrote:
             | Argument by slippery slope is a type of logical fallacy,
             | but it is commonly used as an actual argument by
             | extremists. It's sad how many people these days are so
             | poorly educated and so bad at critical thought that they
             | mistake fallacies for actual arguments.
        
             | Klonoar wrote:
             | You do realize you could get this message across without
             | the needlessly arrogant tone, yeah? All it does is make me
             | roll my eyes.
             | 
             | Anyway, that wasn't my stated position. I simply pointed
             | out that this is done for a subset of users (where there's
             | already existing reasons to do so, sub-13 and all) and that
             | on syncing to iCloud this _already happens anyway_.
             | 
             | I would gladly take this if it removes a barrier to making
             | iCloud E2E encrypted; they are likely bound to do this type
             | of detection, but doing it client-side before syncing feels
             | like a sane way to do it.
        
               | kickopotomus wrote:
               | > I would gladly take this if it removes a barrier to
               | making iCloud E2E encrypted; they are likely bound to do
               | this type of detection, but doing it client-side before
               | syncing feels like a sane way to do it.
               | 
               | But there is an issue there. Now there is a process on
               | your phone capable of processing unencrypted data on your
               | phone and communicating with the outside world. That is
               | spyware which will almost certainly be abused in some
               | way.
        
               | xondono wrote:
               | > Now there is a process on your phone capable of
               | processing unencrypted data on your phone and
               | communicating with the outside world.
               | 
               | What? That's what all apps _by definition_ do. My retinas
               | can't do decryption yet!
        
               | MichaelZuo wrote:
               | From the proposal it seems that this system 1. cannot be
               | opted out of and 2. can run at any time the phone is
               | powered on without user consent.
               | 
               | That puts it clearly in a different category from apps.
        
               | dabbledash wrote:
               | E2E encryption isn't meaningful if third parties get to
               | look at the decrypted content.
        
               | JackGreyhat wrote:
               | Actually, I don't think it will remove a barrier for
               | iCloud E2E encryption at all. On the contrary. All it
               | will remove is the barrier for what we find acceptable
               | for companies like Apple to implement. I think Apple made
               | a very intrusive move, one that we will come to accept
               | over time. After that, a next move follows... and so on.
               | That's the barrier being moved. A point will be reached
               | when E2E encryption is nothing more than a hoax, a non-
               | feature with no added value. A mirage of what it is
               | supposed to be. All of these things are implemented under
               | the Child Protection flag. Sure, we need child
               | protection, we need it badly, but the collateral is huge,
               | and quite handy too for most 3-letter agencies.
               | 
               | I don't have the solution. The other day my 3-year-old
               | son had a rash, and I took pictures of it over the course
               | of a few days. A nude little boy, pictures from multiple
               | angles. I showed my dermatologist. What will happen in
               | the future? Will my iPhone "flag" me as a potential child
               | predator? Can I tell it I'm a worried dad? Do I even have
               | to be thinking about these things?
        
           | thesimon wrote:
           | "Feels like a way to actually reach encrypted data all around
           | while still meeting the expectations of lawmakers/regulators"
           | 
           | And isn't that a problem? Encrypted data should be secure,
           | even if lawmakers don't want math to exist.
        
             | Klonoar wrote:
             | Your data should be encrypted on Apple's servers and
             | unreadable by them; at least, that's my desire from Apple.
             | They are likely bound to scan and detect for this kind of
             | abusive content.
             | 
             | This handles that client-side instead of server side, and
             | if you don't use iCloud photos, it doesn't even affect you.
             | If syncing? Sure, decrypt it on device and check it before
             | uploading - it's going to their servers after all.
             | 
             | Don't want to even go near this? Don't use Messages or
             | iCloud, I guess. Very possible to use iOS/iDevices in a
             | contained manner.
        
           | wayneftw wrote:
           | It runs on _my device_ and uses my CPU, battery time and my
           | network bandwidth (to download/upload the hashes and other
           | necessary artifacts).
           | 
           | I'd be fine with them scanning stuff I uploaded to them with
           | their own computers because I don't have any real
           | expectation of privacy from huge corporations.
        
             | kps wrote:
             | Apple already uses 'your' CPU, battery time and network
             | bandwidth for its Find My / AirTag product.
        
               | babesh wrote:
               | You can turn it off.
        
               | _carbyau_ wrote:
               | Is it just me that finds the list of "Things to turn
               | off." getting ridiculously long?
        
               | babesh wrote:
               | I think that is intentional. When they really want people
               | to do something they make it opt-out. When they don't
               | they make it opt-in.
        
               | leucineleprec0n wrote:
               | It's not just you. It's fucking enraging at this point. I
               | feel like I woke up one day and gleaned a fat look at
               | Finder or various iCloud/background service junk and just
               | realized it is to me what the fucking bloatware of 2010
               | PCs (and presumably today's) was/is.
               | 
               | I just want general purpose computation equipment @
               | reasonably modern specifications - albeit largely devoid
               | of rootUser-privileged advertisement stacks (included
               | libraries etc).
               | 
               | I mean what the fuck, is that so fucking hard? This is
               | hellworld, given the obviously plausible counterfactual
               | where we just... don't... do this
        
               | d110af5ccf wrote:
               | > I just want general purpose computation equipment @
               | reasonably modern specifications
               | 
               | So ... LineageOS (without GApps) on Pixel or OnePlus
               | hardware? (Purchased directly from the manufacturer, not
               | a carrier!)
        
             | Klonoar wrote:
             | I feel like this argument really doesn't add much to the
             | discussion.
             | 
             | It runs only on a subset of situations, as previously noted
             | - and I would be _shocked_ if this used more battery than
             | half the crap running on devices today.
             | 
             | Do you complain that Apple runs code to find moments in
             | photos to present to you periodically...?
        
               | aaomidi wrote:
               | What is the point of running this on device? The issue
               | here is now Apple has built and is shipping what is
               | essentially home-phoning malware that can EASILY be
               | required with a court order to do something entirely
               | different from what it was designed to do.
               | 
               | They're opening themselves to being forced by 3 letter
               | agencies around the world to do some really fucked up
               | shit to their users.
               | 
               | Apple should never have designed something that allows
               | for fingerprinting of files & users for stuff stored on
               | their own device.
        
               | Klonoar wrote:
               | Your entire argument could be applied to iOS itself. ;P
        
               | aaomidi wrote:
               | Not really, iOS didn't really have the capability of
               | scanning and reporting files based on a database received
               | by the FBI/other agencies.
               | 
               | There is a big difference when this has been implemented
               | & deployed to devices. Fighting questionable subpoenas
               | and stuff becomes easier when you don't have the
               | capability.
        
               | m4rtink wrote:
                | Given that iOS is totally proprietary and runs on a
                | heavily locked-down device, making it complicated to
                | inspect even _the binary blobs_, how can anyone be sure
                | what it is doing? Not to mention that any such capability
                | missing now is just one upgrade away from being added,
                | with no ability for the user to inspect and reject it.
        
               | aaomidi wrote:
                | No one is. It's just harder to develop tech like this
                | without it leaking.
                | 
                | So Apple is just outright saying what it is doing.
        
               | wayneftw wrote:
               | > I feel like this argument really doesn't add much to
               | the discussion.
               | 
               | Oh, I guess I should have just regurgitated the Apple
               | press release like the gp?
               | 
               | > It runs only on a subset of situations...
               | 
               | For now. But how does that fix the problem of them using
               | my device and my network bandwidth?
               | 
               | > I would be _shocked_ if this used more battery than
               | half the crap running on devices today.
               | 
               | You think you'll be able to see how much it uses?
               | 
               | > Do you complain that Apple runs code to find moments in
               | photos to present to you periodically...?
               | 
                | Yes. I hate that feature; it's a waste of my resources.
                | I'll reminisce when I choose to. I don't need some
                | garbage bot trawling my stuff for memories. I probably
                | already have it disabled, or at least its notifications.
        
           | lights0123 wrote:
           | My big issue is what it opens up. As the EFF points out, it's
           | really not a big leap for oppressive governments to ask Apple
           | to use the same tech (as demoed by using MS's tech to scan
           | for "terrorist" content) to remove content they don't like
           | from their citizens' devices.
        
             | dwaite wrote:
             | First, standard disclaimer on this topic that there were
             | multiple independent technologies announced - I assume you
             | are speaking to content hash comparisons on photo upload
             | specifically to Apple's photo service, which they are doing
             | on-device vs in-cloud.
             | 
             | How is this situation different from an oppressive
             | government "asking" (which is a weird way we now use to
             | describe compliance with laws/regulations) for this sort of
             | scanning in the future?
             | 
             | Apple's legal liability and social concerns would remain
             | the same. So would the concerns of people under the regime.
             | Presumably the same level of notification and ability of
             | people to fight this new regulation would also be present
             | in both cases.
             | 
             | Also, how is this feature worse than other providers which
             | already do this sort of scanning on the other side of the
             | client/server divide? Presumably Apple does it this way so
             | that the photos remain encrypted on the server, and release
             | of data encryption keys is a controlled/auditable event.
             | 
             | You would think the EFF would understand that you can't use
             | technical measures to either fully enforce or successfully
             | defeat regulatory measures.
        
             | acdha wrote:
             | That's my concern: what happens the first time a government
             | insists that they flag a political dissident or symbol? The
             | entire system is opaque by necessity for its original
              | purpose, but that seems to suggest it would be easy to do
              | things like serve custom fingerprints to particular users
              | without anyone being any the wiser.
        
               | philistine wrote:
                | My heart goes out to the queer community of Russia, whose
               | government will pounce on this technology in a heartbeat
               | and force Apple to scan for queer content.
        
               | acdha wrote:
               | They'd have many other countries keeping them company,
               | too.
               | 
               | One big mess: how many places would care about false
               | positives if that gave them a pretext to arrest people? I
               | do not want to see what would happen if this
               | infrastructure had been available to the Bush
               | administration after 9/11 and all of the usual ML failure
               | modes played out in an environment where everyone was
               | primed to assume the worst.
        
           | vimy wrote:
           | Teens are also children. Apple has no business checking if
           | they send or receive nude pics. Let alone tell their parents.
           | This is very creepy behavior from Apple.
           | 
           | Edit: I'm talking about this https://pbs.twimg.com/media/E8DY
           | v9hWUAksPO8?format=jpg&name=...
        
             | xondono wrote:
             | Call me crazy, but if your 13yo is sending nudes, I think
             | that as a parent you want to know that.
             | 
              | Current society is pushing a lot of adult behavior onto
              | kids, and they don't always understand the consequences of
              | their actions.
             | 
             | Parents can't inform their kids if they aren't aware.
        
               | Guest19023892 wrote:
               | > Parents can't inform their kids if they aren't aware
               | 
               | Why not inform your children of the potential
               | consequences when giving them a phone? Why do you need
               | Apple to notify you of inappropriate behavior before
               | having that conversation? That's like waiting until you
               | find a pregnancy test in the garbage before talking to
               | your children about sex.
        
               | xondono wrote:
                | Yes, that's great, but that's not how kids work,
                | especially 13-year-olds.
        
               | fortuna86 wrote:
               | I wouldn't give a 13 year old an iphone, but I guess that
               | makes me odd.
        
               | rudian wrote:
               | Yeah good luck with that.
        
               | emadabdulrahim wrote:
                | Wanting to know as a parent and the way Apple is going
                | about this are two different issues.
                | 
                | The government also wants to know about potential
                | terrorist attacks. Why not scan all our phones for all
                | kinds of data, to protect innocent people from being
                | killed in mass shootings?
                | 
                | That's nonsense. And I'm saying that as someone deeply
                | locked into Apple's ecosystem. Which is pissing me off.
        
             | hb0ss wrote:
              | The definition of a child, as Apple applies it, differs
              | per legal region; for the US it is set to 13 years or
              | younger. Also, your parents need to have added your account
              | to the iCloud family for the feature to work.
        
             | Closi wrote:
              | That's not what this solution is doing; it's checking a
              | hash of the photo against hashes of known offending
              | content.
             | 
             | If someone sends nude pics there is still no way to tell
             | that it's a nude pic.
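              | 
              | Conceptually it's a set-membership test on perceptual
              | hashes, with a small distance threshold so that near-
              | duplicates still match. A toy sketch in Python (using an
              | average-hash stand-in - Apple's actual NeuralHash isn't
              | public - and a made-up blocklist value):
              | 
              |   # Toy perceptual "average hash" of an 8x8 grayscale
              |   # image (list of rows of ints 0-255): one bit per
              |   # pixel, set if brighter than the mean.
              |   def average_hash(pixels):
              |       flat = [p for row in pixels for p in row]
              |       avg = sum(flat) / len(flat)
              |       bits = 0
              |       for p in flat:
              |           bits = (bits << 1) | (1 if p > avg else 0)
              |       return bits
              | 
              |   def hamming(a, b):
              |       return bin(a ^ b).count("1")
              | 
              |   # Hypothetical database of hashes of known images.
              |   blocklist = {0x8F3A96C10457BEEF}
              | 
              |   def matches_blocklist(pixels, max_distance=4):
              |       h = average_hash(pixels)
              |       return any(hamming(h, bad) <= max_distance
              |                  for bad in blocklist)
              | 
              | The point being: the device only learns "close to
              | something in the list" or not; nothing in this scheme
              | classifies what a new photo depicts.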
        
               | [deleted]
        
               | randcraw wrote:
               | Nude pic ID is routine online. Facebook developed this
               | capability over 5 years ago and employs it liberally
               | today, as do many other net service providers.
        
               | mrits wrote:
                | Not true. We don't know how the fuzzy hash works. It's
                | very likely a lot of nudes would fall within the
                | threshold Apple has set.
        
               | krrrh wrote:
               | That's only the first part of what was announced and
               | addressed in the article.
               | 
                | The other part uses on-device machine learning to scan
                | for nude pics a child is about to send, securely
                | notifying the child and then the parents within the
                | family account. The alert the kids get will by itself
                | probably be enough to stop a lot of them from sending
                | the pic in the first place.
        
               | slownews45 wrote:
               | I'm a parent. It's weird seeing HN push against this.
               | 
                | This sounds like a feature I'd like.
        
               | foxpurple wrote:
                | Under 13 seems like an OK cutoff, but I'd be very
                | concerned that they'll push it to under 18. Pretty much
                | everyone is sharing adult content responsibly at 17.
        
               | _carbyau_ wrote:
               | If you are worried about the photos your kids are
               | sending/receiving the problem isn't in the device.
               | 
                | I won't use my own kids' devices as information weapons
                | against them. I will, however, teach them what kind of
                | information weapons those devices are.
        
               | dwaite wrote:
               | If you are talking about parents giving their children a
               | device to use as a weapon against them, you are implying
               | some really bad parenting and really bad general human
               | behavior by some other parents.
               | 
               | Not that there aren't some really lousy parents out
               | there.
               | 
               | I suspect/hope the exclusion parenting style (you don't
               | get a phone because I don't trust you with it or the rest
               | of the world with you) and the guidance parenting style
               | (you get a phone but I'm going to make sure to talk to
               | you about responsibility and about lousy people in the
               | world, and I want to know if something happens so we can
               | keep talking) both far outweigh this sort of proposed
               | 'entrapment' parenting style.
        
               | [deleted]
        
               | philistine wrote:
               | I agree with you in principle, but I also know that kids
               | will soon find methods of sharing that defeat any scans.
               | Other apps and ephemeral websites can be used to escape
               | Apple's squeaky-clean version of the world.
        
               | slownews45 wrote:
               | Sure - that's fine.
               | 
               | But if I'm picking a phone for my kid, and my choice is
               | this (even if imperfect) and the HN freedomFone - it's
               | going to be Apple. We'll see what other parents decide.
        
               | pzo wrote:
                | @philistine is right that Apple's solution can only work
                | for the short term. Parents want an easy, automatic, "out
                | of sight, out of mind" solution, but the only one that
                | will work long-term is talking with the kid and educating
                | them, instead of outsourcing parenting to a tech company.
                | 
                | It's easy to design a workaround that lets either
                | criminals or kids send nude pictures under the radar,
                | e.g.:
                | 
                | 1) someone will eventually write a web app that uses
                | WebRTC or similar to snap the nude picture (so that no
                | picture is stored on device), 2) encrypts those photos on
                | the server, 3) sends a link to the image, which is
                | rendered in an HTML canvas (again, no image stored on
                | device), and 4) puts the link behind a captcha so that
                | automated bots cannot scan it.
                | 
                | Now, do we want to go down the rabbit hole of "protecting
                | the kids" and have Apple build in camera drivers that
                | filter the entire video stream for nudity?
        
               | mrtranscendence wrote:
               | > the only solution that will work long-term is talking
               | with kid and educating them instead of out-sourcing
               | parenting to tech company.
               | 
               | Sometimes otherwise good, smart kids, with good parents,
               | can be caught up by predation. It isn't always as simple
               | as making sure the kid is educated; they'll still be
               | young, they'll still be immature (< 13!), and it will
               | sometimes be possible to manipulate them. I'm not saying
               | that what Apple is doing is a _good_ answer, it probably
               | isn 't, but it's _an_ answer to a genuine felt need.
        
               | Bud wrote:
               | False, and this shows that you didn't read the entire
               | article. You should go and do that.
        
               | [deleted]
        
               | artimaeis wrote:
               | You're conflating the CSAM detection of photos uploaded
               | to iCloud with the explicit detection for child devices.
               | The latter is loosely described here:
               | https://www.apple.com/child-safety/.
               | 
               | > Messages uses on-device machine learning to analyze
               | image attachments and determine if a photo is sexually
               | explicit. The feature is designed so that Apple does not
               | get access to the messages.
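                | 
                | Worth spelling out, since the two systems keep getting
                | conflated in this thread. Roughly (names are mine,
                | purely illustrative, and the first check is fuzzy in
                | reality):
                | 
                |   # Pipeline 1: iCloud Photos uploads only. A hash
                |   # match against a known-CSAM database; no
                |   # classifier, no notion of "looks nude".
                |   def icloud_upload_check(photo_hash, csam_hashes):
                |       return photo_hash in csam_hashes
                | 
                |   # Pipeline 2: Messages, child accounts only. An
                |   # on-device classifier; no database, and nothing
                |   # is reported to Apple or law enforcement.
                |   def messages_check(looks_explicit, age):
                |       if looks_explicit and age < 18:
                |           return "warn_child"  # parents only if < 13
                |       return "deliver"
                | 
                | Different inputs, different scopes, different
                | consequences.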
        
               | Closi wrote:
                | Ah yeah, you are right; two posts about different Apple
                | photo-scanning features on a single day have thrown me!
        
             | dragonwriter wrote:
             | > Apple has no business checking if they send or receive
             | nude pics.
             | 
              | Furthermore, if Apple is _deliberately_, _by design_,
              | resending them to additional people beyond the addressee,
              | as a feature it _markets to the people that will be
              | receiving them_, that seems to... raise problematic issues.
        
               | dwaite wrote:
               | Who said Apple re-sends pictures?
        
             | Spooky23 wrote:
             | It would be if that were what they were doing. They are
             | not.
        
               | Bud wrote:
               | Yes, it is, and you need to read the entire article.
        
               | spiderice wrote:
               | I think you're both partially right.
               | 
               | > In these new processes, if an account held by a child
               | under 13 wishes to send an image that the on-device
               | machine learning classifier determines is a sexually
               | explicit image, a notification will pop up, telling the
               | under-13 child that their parent will be notified of this
               | content. If the under-13 child still chooses to send the
               | content, they have to accept that the "parent" will be
               | notified, and the image will be irrevocably saved to the
               | parental controls section of their phone for the parent
               | to view later. For users between the ages of 13 and 17, a
               | similar warning notification will pop up, though without
               | the parental notification.
               | 
                | This specifically says that it will not notify the
                | parents of teens, contrary to what GGP claims. So GP is
                | right that Apple isn't doing what GGP claimed. However,
                | I still think you might be right that GP didn't read the
                | full article and just got lucky. Lol.
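                | 
                | The quoted flow boils down to a simple age branch.
                | Roughly, in Python (function and field names are mine,
                | not Apple's; the warn() stub always "confirms" so the
                | sketch runs end to end):
                | 
                |   from dataclasses import dataclass
                | 
                |   @dataclass
                |   class Account:
                |       age: int
                |       parent_id: str  # parent in the iCloud family
                | 
                |   def warn(msg):
                |       print("[warn]", msg)
                |       return True  # child chooses to send anyway
                | 
                |   def handle_explicit_send(acct):
                |       if acct.age < 13:
                |           if warn("Your parent will be notified."):
                |               print("[notify]", acct.parent_id)
                |               print("[save] kept in parental controls")
                |               print("[send] image sent")
                |       elif acct.age <= 17:
                |           if warn("This image may be explicit."):
                |               print("[send] sent, parents NOT told")
                |       else:
                |           print("[send] image sent")
                | 
                |   handle_explicit_send(Account(age=12, parent_id="mom"))
                | 
                | Which matches the EFF text: the parental notification
                | and the irrevocable save happen only on the under-13
                | branch.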
        
             | krrrh wrote:
             | Parents do have a legal and moral responsibility to check
             | on their children's behaviour, and that includes teens.
             | It's somewhat analogous to a teacher telling parents about
             | similar behaviour taking place at school.
             | 
             | I suspect a lot of how people feel about this will come
             | down to whether they have kids or not.
        
               | anthk wrote:
               | I don't know about your country but in mine in Europe
               | teens have privacy rights OVER their parents' paranoia.
               | 
                | That includes secrecy of private communications and, of
                | course, privacy of their own data on smartphones.
        
             | dhosek wrote:
             | The fact that teens are children means that if, say a 16-yo
             | sends a nude selfie to their s.o., they've just committed a
             | felony (distributing child pornography) that can have
             | lifelong consequences (thanks to hysterical laws about sex
             | offender registries, both kids could end up having to
             | register as sex offenders for the rest of their life and
             | will be identified as having committed a crime that
             | involved a minor. Few if any of the registries would say
             | more than this and anyone who looks in the registry will be
             | led to believe that they molested a child and not shared a
             | selfie or had one shared with them). The laws may not be
             | just or correct, but they are the current state of the
             | world. Parents need to talk to their kids about this sort
              | of thing, and this seems one of the less intrusive ways for
             | them to discover that there's an issue. If it were
             | automatically shared with law enforcement? That would be a
             | big problem (and a guarantee that my kids don't get access
             | to a device until they're 18), but I'm not ready1 to be up
             | in arms about this yet.
             | 
             | 1. I reserve the right to change my mind as things are
             | revealed/developed.
        
               | jfjsjcjdjejcosi wrote:
               | > if, say a 16-yo sends a nude selfie to their s.o.,
               | they've just committed a felony ... The laws may not be
               | just or correct, but they are the current state of the
               | world.
               | 
               | Hence, strong E2E encryption designed to prevent unjust
               | government oppression, _without_ backdoors.
               | 
                | Parents should talk to their teenagers about sex
                | regardless of whether they get a notification on their
                | phone telling them they missed the boat.
        
               | SahAssar wrote:
                | I get your points, but the end result is that the client
                | in an E2EE system can no longer be fully trusted to act
                | on the client's behalf. That seems alarming to me.
        
               | philistine wrote:
                | I'd argue that the problem of minors being declared sex
                | offenders over nude pictures has reached a critical mass
                | that scares me. At this point, perpetrators of truly
                | vile offenses can hide by saying that they are on a sex
                | offender registry because of underage selfies. And I
                | think most people will believe them.
        
               | wccrawford wrote:
               | I worked with someone that claimed this, years ago. And
               | they were still young enough that I believed them.
        
               | joombaga wrote:
               | Am I missing something? Why would they say they're sex
               | offenders at all?
        
               | cormacrelf wrote:
               | As I understand it, one of the primary purposes of sex
               | offender registries is to get information out about who
               | is on them. I believe some people are forced to knock on
               | doors in their neighbourhood and disclose it. In other
               | situations they would just be getting ahead of the story.
        
               | emadabdulrahim wrote:
               | They wouldn't. But if they were sex offenders, they could
               | claim that their offense was simply sending a nude when
               | they were 16. While their real offense may have been
               | rape.
               | 
               | I don't actually know what the law is in regard to sex
               | offense. I'm simply explaining what I understood from the
               | previous comment.
        
               | anthk wrote:
               | > The laws may not be just or correct, but they are the
               | current state of the world
               | 
               | America, not Europe. Or Japan.
        
               | bambax wrote:
                | > _they've just committed a felony (distributing child
                | pornography)_
               | 
               | In the US, maybe (not sure if this is even true in all
               | states), but not in most other countries in the world,
               | where a 16-year-old is not a child, nudity is not a
               | problem, and "sex offender registries" don't exist.
               | 
               | The US is entitled to make its own (crazy, ridiculous,
               | stupid) laws, but we shouldn't let them impose those on
               | the rest of us.
        
               | hetspookjee wrote:
                | Yet for the most part this is where we are ending up.
                | Just look at Facebook and Twitter deciding what is right
                | and wrong. I think that's wrong in a lot of ways, but
                | apparently there is very little the EU and others can do
                | about it.
        
               | bambax wrote:
               | There is a lot the EU could do if it wanted to, or had
               | any kind of courage. But we are weak and fearful.
        
             | nerdponx wrote:
             | > Apple has no business checking if they send or receive
             | nude pics. Let alone tell their parents.
             | 
             | Some people might disagree with you.
             | 
             | There are people out there who are revolted by the
             | "obviously okay" case of 2 fully-consenting teenagers
             | sending each other nude pics, without any coercion, social
             | pressure, etc.
             | 
             | Not to mention all the gray areas and "obviously not okay"
             | combinations of ages, circumstances, number of people
             | involved, etc.
        
               | slownews45 wrote:
               | It will be the parents who are deciding this -
               | particularly if they are buying these phones.
               | 
                | If parents don't like this feature, they can buy a
                | LineageOS-type phone. If they do, they will buy this
                | type of phone for their kids.
        
               | philistine wrote:
                | Big correction: it will be the other kid's parents who
                | decide for your kid. Apple will give your child's
                | picture to the other kid's parents.
               | 
               | That's terrifying.
        
               | slownews45 wrote:
               | What a lie - I'm really noticing an overlap between folks
               | fighting this type of stuff (which as a parent I want)
               | and folks just lying horribly.
               | 
               | "if a child attempts to send an explicit photo, they'll
               | be warned before the photo is sent. Parents can also
               | receive a message if the child chooses to send the photo
               | anyway." - https://techcrunch.com/2021/08/05/new-apple-
               | technology-will-...
               | 
               | So this gives kids a heads up that they shouldn't send
               | it, and that if they do, their parent will be notified.
               | So that's me in case you are not reading this clearly.
               | 
               | Now yes, if someone is SENDING my child porn from a non-
               | child account, I as the parent will be notified. Great.
               | 
               | If this is terrifying - that's a bit scary! HN is going a
               | bit off the rails these days.
               | 
                | Allow me to make a prediction - users are going to like
                | this - it will INCREASE their trust in Apple as a
                | company trying to keep them and their family safe.
               | 
                | I just looked it up - Apple is supposedly literally the
                | #1 brand globally in 2021. So, in customers' minds, they
                | are doing the right thing so far.
        
               | josephcsible wrote:
               | Nobody's lying. Consider the scenario where the sender is
               | an Android user and the recipient is an iOS user.
        
               | dwaite wrote:
               | Could you elaborate on what you think would happen in
               | this case, vs. an iOS user sending an image to another
               | iOS user?
        
               | jlokier wrote:
               | Presumably the sender won't get a notification that the
               | picture they are about to send will be flagged, but it
               | will be flagged by the recipient's device automatically.
               | 
               | Neither participant will have the opportunity to be
               | warned in advance and avoid getting flagged.
        
               | thomastjeffery wrote:
               | That's entirely GP's point: preferring to cater to those
               | people affects the rest of us in a way we find
               | detrimental.
        
             | Klonoar wrote:
             | You describe it as if Apple's got people in some room
             | checking each photo. It's some code that notifies their
             | parents in certain situations. ;P
             | 
             | I know several parents in just my extended circle alone
             | that would welcome the feature, so... I just don't think I
             | agree with this statement. These parents already resort to
             | other methods to try and monitor their kids but it's
             | increasingly (or already) impossible to do so.
             | 
              | I suppose we should also take issue with Apple letting
              | parents watch their kids' location...?
        
             | strictnein wrote:
             | This is not how the system works at all.
        
               | vimy wrote:
               | https://pbs.twimg.com/media/E8DYv9hWUAksPO8?format=jpg&na
               | me=... What am I reading wrong?
        
               | strictnein wrote:
               | Already answered here:
               | https://news.ycombinator.com/item?id=28079919
        
               | Bud wrote:
               | Answered incorrectly. You need to read the rest of the
               | article.
        
           | achow wrote:
           | There are always scenarios that one cannot catch. EFF
           | highlights one such.
           | 
           | It sounds like it could be quite common. And it could be an
           | absolute nightmare scenario for the kid who does not have the
           | feature turned on.
           | 
           |  _This means that if--for instance--a minor using an iPhone
           | without these features turned on sends a photo to another
           | minor who does have the features enabled, they do not receive
           | a notification that iMessage considers their image to be
           | "explicit" or that the recipient's parent will be notified.
           | The recipient's parents will be informed of the content
           | without the sender consenting to their involvement.
           | Additionally, once sent or received, the "sexually explicit
           | image" cannot be deleted from the under-13 user's device._
        
           | suizi wrote:
            | As many, many people have pointed out, a mechanism that
            | scans things client-side could easily be extended to
            | encrypted content - and perhaps is intended to be, at a
            | moment's notice, if they see an opportunity to do so.
           | 
           | It's like having hundreds of nukes ready for launch, as
           | opposed to having the first launch being a year away.
           | 
           | If they wanted to "do it as all major companies do", then
           | they could have done it on the server-side, and there
           | wouldn't have been a debate about it at all, although it is
           | still extremely questionable, as far as privacy is concerned.
        
           | dabbledash wrote:
           | The point of encrypted data is not to be "reached."
        
           | ElFitz wrote:
           | True. But, first, it also means anyone, anywhere, as long as
           | they use iOS, is vulnerable to what the _US_ considers to be
           | proper. Which, I will agree, likely won't be an issue in the
           | case of child pornography. But there's no way to predict how
            | that will evolve (see Facebook's ever-expanding imposition
            | of American cultural norms and puritanism).
           | 
           | Next, it also means they _can_ do it. And if it can be done
           | for child pornography, why not terrorism? And if it can be
            | done for the US' definition of terrorism, why not China's,
            | Russia's, or Saudi Arabia's? And if terrorism and child
            | pornography, why not drug consumption? Tax evasion? Social
           | security fraud? Unknowingly talking with the wrong person?
           | 
            | Third, there is _apparently_ transparency on it today. But
            | who is to say its possible expansion won't be forcibly
            | silenced in the same way Prism's requests were?
           | 
            | Fourth, and this is only because I am slightly a maniac:
            | how can anyone unilaterally decide to waste the computing
            | power, battery life and data plan of a device I paid for,
            | without my say-so? (Probably one of my main gripes with
            | ads.)
           | 
           | All in all, it means I am incorporating into my everyday life
           | a device that can and will actively snoop on me and
            | potentially snitch on me. Now, while I am not worried
            | _today_, it definitely paves the way for many other things.
            | And I
           | don't see why I should trust anyone involved to stop here or
           | let me know when they don't.
        
             | adventured wrote:
             | The US is very openly, publicly moving down the road called
             | The War on Domestic Terrorism, which is where the US
             | military begins targeting, focusing in on the domestic
             | population. The politicians in control right now are very
             | openly stating what their plans are. It's particularly
             | obvious what's about to happen, although it was obvious at
             | least as far back as the Patriot Act. The War on Drugs is
             | coming to an end, so they're inventing a new fake war to
             | replace it, to further their power. The new fake war will
             | result in vast persecution just as the last one did.
             | 
             | You can be certain what Apple's scanning is going to be
             | used for is going to widen over time. That's one of the few
             | obvious certainties with this. These things are a Nixonian
             | wet dream. The next Trump type might not be so politically
             | ineffectual; more likely that person will be part of the
             | system and understand how to abuse & leverage it to their
             | advantage by complying with it rather than threatening its
             | power as an outsider. Trump had that opportunity, to give
              | the system what it wanted; he was too obtuse and rigid to
              | understand he had to adapt or the machine would grind him
              | up (once he started removing the military apparatus that was
             | surrounding him, like Kelly and Mattis, it was obvious he
             | would never be allowed to win a second term; you can't keep
             | that office while being set against all of the military
             | industrial complex including the intelligence community,
             | it'll trip you up on purpose at every step).
             | 
             | The US keeps getting more authoritarian over time. As the
             | government gets larger and more invasive, reaching ever
             | deeper into our lives, that trend will continue. One of the
             | great, foolish mistakes that people make about the US is
             | thinking it can be soft and cuddly like Finland. Nations
             | and their governments are a product of their culture. So
             | that's not what you're going to get if you make the
             | government in the US omnipotent. You're going to get either
             | violent Latin American Socialism (left becomes dominant) or
             | violent European Fascism (right becomes dominant). There's
             | some kind of absurd thinking that Trump was right-wing, as
             | in anti-government or libertarian; Trump is a proponent of
             | big government, just as Bush was, that's why they had no
             | qualms about spending like crazy (look at the vast
             | expansion of the government under Bush); what they are is
             | the forerunners to fascism (which is part of what their
             | corporatism is), they're right wingers that love big
             | government, a super dangerous cocktail. It facilitates a
              | chain of enabling over decades; they open up Pandora's boxes
             | and hand power to the next authoritarian. Keep doing that
             | and eventually you're going to get a really bad outcome
             | (Erdogan, Chavez, Putin, etc) and that new leadership will
             | have extraordinary tools of suppression.
             | 
             | Supposed political extremists are more likely to be the
             | real target of what Apple is doing. Just as is the case
             | with social media targeting & censoring those people. The
             | entrenched power base has zero interest in change, you can
             | see that in their reaction to both Trump and Sanders. Their
             | interest is in maintaining their power, what they've built
             | up in the post WW2 era. Trump and Sanders, in their own
             | ways, both threatened what they constructed. Trump's chaos
             | threatened their built-up system, so the globalists in DC
             | are fighting back, they're going to target what they
             | perceive as domestic threats to their system, via their new
             | War on Domestic Terrorism (which will actually be a
             | domestic war on anyone that threatens their agenda). Their
             | goal is to put systems in place to ensure another outsider,
             | anyone outside of their system, can never win the
             | Presidency (they don't care about left/right, that's a
             | delusion for the voting class to concern themselves about;
             | the people that run DC across decades only care if the
             | left/right winner complies with their agenda; that's why
             | the Obamas and Clintons are able to be so friendly with the
             | Bushes (what Bush did during his Presidency, such as Iraq,
             | is dramatically worse than anything Trump did, and yet Bush
             | wasn't impeached, wasn't pursued like Trump was, the people
             | in power - on both sides - widely supported his move on
             | Iraq), they're all part of the same system so they
              | recognize that in each other, and reject a Trump or Sanders
             | outsider like an immune system rejecting a foreign object).
             | 
             | The persistent operators in DC - those that continue to
             | exist and push agenda regardless of administration hand-
             | offs - don't care about the floated reason for what Apple
             | is doing. They care about their power and nothing else.
             | That's why they always go to the Do It For The Kids
             | reasoning, they're always lying. They use whatever is most
             | likely to get their agenda through. The goal is to always
             | be expanding the amount of power they have (and that
             | includes domestically and globally, it's about them, not
             | the well-being of nations).
             | 
              | We're entering the era where all of these tools of
              | surveillance they've spent the past few decades putting
              | into place will start to be put into action against
              | domestic targets en masse, where surveillance tilts over to
             | being used for aggressive suppression. That's what Big Tech
             | is giddily assisting with the past few years, the beginning
             | of that switch over process. The domestic population
             | doesn't want the forever war machine (big reasons Trump &
             | Sanders are so popular, is that both ran on platforms
             | opposed to the endless foreign wars); the people that run
             | DC want the forever war machine, it's their machine, they
             | built it. Something is going to give, it's obvious what
             | that's going to be (human liberty at home - so the forever
             | wars, foreign adventurism can continue unopposed).
             | 
             | Systems of power always act to defend and further that
             | power. Historically (history of politics, war, governmental
             | systems) or psychologically (the pathology of power
             | lusting) there isn't anything surprising about any of it,
             | other than perhaps that so many are naive about it. I
             | suspect most of that supposed naivety is actually fear of
             | confrontation though (you see the same thing in the
              | security/privacy conflicts); playing dumb is a common form
              | of self-defense against confrontation. To recognize the
              | growing authoritarianism requires a potent act of mental
              | confrontation (and then you either have to put
             | yourself back to sleep (which requires far more effort), or
             | deal with the consequences of that stark reality laid
             | bare).
        
             | kmlx wrote:
             | > it also means anyone, anywhere,
             | 
              | It's only in the US, not in Europe or the rest of the world.
        
             | chrsstrm wrote:
             | > is vulnerable to what the US considers to be proper
             | 
             | This stirs up all sorts of questions about location and the
             | prevailing standards in the jurisdiction you're in. Does
             | the set of hashes used to scan change if you cross an
             | international border? Is the set locked to whichever
             | country you activate the phone in? This could be a travel
             | nightmare.
        
               | judge2020 wrote:
                | As this isn't a list of things the U.S. finds prudish,
                | but actual images of children who are or are becoming
                | victims of abuse, it doesn't look like there are
                | borders, at least according to the official Apple
                | explanation[0].
               | 
               | If the situation OP suggests happens in the form of
               | FBI/other orgs submitting arguably non-CSAM content, then
               | Apple wouldn't be complicit or any wiser to such an
               | occurrence unless it was after-the-fact. If it happens in
               | a way where Apple decides to do this on their own dime
               | without affecting other ESPs, I imagine they wouldn't
               | upset CCP by applying US guidance to Chinese citizens'
               | phones.
               | 
               | 0: https://www.apple.com/child-safety/
        
               | makeitdouble wrote:
                | It's still a US database, and given its goal it would
                | fit the US definition of child abuse.
                | 
                | I am no expert on what that means in the US, but I'd
                | assume there can be many definitions of what "child",
                | "abuse" and "material" mean, depending on beliefs.
        
             | Klonoar wrote:
             | I think your points are mostly accurate, and that's why I
             | led with the bit about the EFF calling attention to it.
             | Something like this shouldn't happen without scrutiny.
             | 
             | The only thing I'm going to respond to otherwise is this:
             | 
              | >Fourth, and this is only because I am slightly a maniac:
              | how can anyone unilaterally decide to waste the computing
              | power, battery life and data plan of a device I paid for,
              | without my say-so? (Probably one of my main gripes with
              | ads.)
             | 
             | This is how iOS and apps in general work - you don't really
             | control the amount of data you're using, and you never did.
             | Downloading a changeset of a hash database is not a big
             | deal; I'd wager you get more push notifications with data
              | payloads in a day than this would use.
             | 
             | Battery life... I've never found Apple's on-device
             | approaches to be the culprit of battery issues for my
             | devices.
             | 
             | I think I'd add to your list of points: what happens when
             | Google inevitably copies this in six months? There really
             | is no competing platform that comes close.
        
               | ElFitz wrote:
               | > This is how iOS and apps in general work - you don't
               | really control the amount of data you're using, and you
               | never did. Downloading a changeset of a hash database is
               | not a big deal; I'd wager you get more push notifications
                | with data payloads in a day than this would use.
               | 
                | Oh, definitely. But I am given the ability to remove
                | those apps, or to disable those notifications, and I
                | consider the ones I keep to be of some value to me.
                | This? On my phone? It's literal spyware.
               | 
               | But, as I said, it's only because I am a maniac regarding
               | how tools should behave.
               | 
                | The point you add about Google, however, is a real issue.
                | I've seen some people mention LineageOS and postmarketOS,
                | but that isn't really a solution for most people.
        
               | Klonoar wrote:
               | To be clear: don't use iCloud Photos and you won't get
               | compared to hashes.
               | 
                | This is, currently, the same opt-out as for the existing
                | hash checking, since it's already done on the server.
        
               | falcolas wrote:
               | > what happens when Google inevitably copies this in six
               | months? There really is no competing platform that comes
               | close.
               | 
               | Then you have to make a decision about what matters more.
               | Convenience and features, or privacy and security.
               | 
               | I've made that decision myself. I'll spend a bit more
               | time working with less-than-perfect OSS software and
               | hardware to maintain my privacy and security.
        
             | tshaddox wrote:
              | The problem with the "there's no way to predict how this
              | will evolve" argument is that it would have applied
              | equally well years before this was announced, and to
              | literally anything that Apple could theoretically do with
              | software on iPhones.
        
               | m4rtink wrote:
                | Well, it does - people have been pointing out the
                | downsides of the walled garden, locked bootloaders and
                | proprietary everything on "iDevices" for years, pointing
                | to scenarios similar to the one unfolding right now.
        
               | ElFitz wrote:
                | That's the problem with closed-source software in
                | general, and with software that can be remotely updated
                | in particular.
               | 
               | And I am writing that as a (now formerly?) happy iPhone
               | user. It's just that I don't trust it or Apple as much
               | anymore.
               | 
               | And although there is no way of predicting with certainty
               | how it will evolve, most past successful similar
               | processes and systems usually went down the anti-
               | terrorism & copyright enforcement roads.
        
               | tremon wrote:
               | Indeed. That's why some people have been consistently
               | arguing against proprietary software and against locked-
               | down platforms for decades.
        
             | hsousa wrote:
             | The US is imposing puritanism? Oh man, on which planet do
             | you live?
        
               | rudian wrote:
               | Try not living in the US for example.
        
               | ElFitz wrote:
               | The one on which a social network used by nearly 3
               | billion people worldwide (Facebook) bans pictures of
               | centuries old world famous paintings containing naked
               | women, as if it were pornography.
               | 
               | The one on which a video hosting platform used by over 2
               | billion people (YouTube) rates content as 18+ as soon as
               | it, even briefly, shows a pair of breasts.
               | 
               | Why? Which planet do you live on?
        
             | pseudalopex wrote:
             | What transparency? The algorithm is secret. The reporting
             | threshold is secret. The database of forbidden content is
             | secret.
        
               | kanbara wrote:
               | https://www.apple.com/child-
               | safety/pdf/Apple_PSI_System_Secu... ?
        
               | pseudalopex wrote:
               | The secret threshold is a parameter in that protocol. It
               | runs after the secret hash algorithm.
        
               | ElFitz wrote:
               | Like always with Apple. But at least they're saying they
               | are doing it and supposedly detailing how.
               | 
               | Regarding the amount of secrecy on the how, it's the way
               | they do everything.
               | 
               | As for "sharing" the database of "forbidden content",
               | freely or not, that would be sharing child pornography.
        
               | cturhan wrote:
                | They can reveal the algorithm, but then abusers can find
                | a way to change the information in an image to bypass
                | it.
                | 
                | The safest step is not to start this in the first place,
                | because it will create more problems than it solves.
        
         | ksec wrote:
          | Tim Cook doesn't lie. I think he has convinced himself that
          | what he said wasn't lying; that Apple and he himself are so
          | righteous. Which is actually worse, because that mentality
          | filters through from top to bottom. And it shows in their
          | marketing and PR messages. He is also doing exactly what Steve
          | Jobs's last advice to him said: do the right thing. Except
          | "the right thing" is so ambiguous it may turn out to be some
          | of the worst advice.
         | 
          | My biggest turning point was Tim Cook flat out lying in
          | Apple's case against Qualcomm. Double dipping? Qualcomm's
          | patents being worth more than double all the other six
          | combined? And the tactics they used in court, which were
          | vastly different from the Apple vs Samsung case. And yes,
          | they lost. (Or settled.)
         | 
          | That is the same with _privacy_. They simplify their PR
          | message to tracking = evil; tracking is invading your privacy.
          | Which is all good. But at the same time, Apple is tracking
          | you: everything you do on Apple Music, Apple TV+, the App
          | Store and even Apple Card. (They only promise not to sell your
          | data to third parties; they still have _some_ of that data.)
          | What that means is that only Apple is allowed to track you,
          | but anyone else doing it is against privacy? What Apple really
          | means by the word privacy, then, is that data should not be
          | sold to third parties. But no, they intentionally keep it
          | unclear and created a war on data collection while doing it
          | themselves. And you now have people flat out claiming Apple
          | doesn't collect any data.
         | 
          | Then there is the war on ads. Which was so bad the ad industry
          | pushed back, and Tim Cook had to issue a mild statement saying
          | they are not against ads, only targeted ads. What?
         | 
          | Once you start questioning all of his motives and find
          | concrete evidence that he is lying, along with all the facts
          | from court cases showing how Apple has long-term plans to
          | destroy other companies, it all lines up and shapes how you
          | view Tim Cook's Apple. And it isn't pretty.
         | 
          | And that is coming from someone who has been an Apple fan for
          | more than two decades.
        
         | nonbirithm wrote:
         | What I want to know is _why_ they decided to implement this.
         | Are Apple just trying to appear virtuous and took action
         | independently? Or was this done at someone else 's request?
         | 
         | For all the rhetoric about privacy coming from Apple, I feel
         | that such an extreme measure would surely cause complaints from
         | anyone deeply invested in privacy. And maybe they're just using
         | words like "significant privacy benefits compared to previous
         | techniques" to make it sound reasonable to the average user
         | who's not that invested in privacy.
        
         | cronix wrote:
         | As soon as Cook became CEO, he let the NSA's Prism program into
         | Apple. Everything since then has been a fucking lie.
         | 
         | > Andrew Stone, who worked with Jobs for nearly 25 years, told
         | the site Cult of Mac last week that Steve Jobs resisted letting
         | Apple be part of PRISM, a surveillance program that gives the
         | NSA access to records of major Internet companies. His comments
         | come amid speculation that Jobs resisted cooperating. "Steve
         | Jobs would've rather died than give into that," Stone told the
         | site.
         | 
         | > According to leaked NSA slides about PRISM, Apple was the
         | last tech behemoth to join the secret program -- in October
         | 2012, a year after Jobs died. Apple has said that it first
         | heard about PRISM on June 6 of this year, when asked about it
         | by reporters.
         | 
         | https://www.huffpost.com/entry/apple-nsa-steve-jobs_n_346132...
         | 
          | I mean, maybe they didn't call it "PRISM" when talking about it
          | with Cook, so it could _technically_ be true that they didn't
          | hear of PRISM until the media stories. Everyone knows the spy
         | agency goes around telling all of their project code names to
         | companies they're trying to compromise. Hello, sir. We're here
         | to talk to you about our top secret surveillance program we
         | like to call PRISM where we intercept and store communications
         | of everyone. Would you like to join? MS did. So did Google.
         | Don't you want to be in our select cool club?
        
         | mtgx wrote:
         | It's all been downhill since we heard that they stopped
         | developing the e2e encrypted iCloud solution because it might
         | upset the FBI even more.
        
         | dilap wrote:
         | There was a funny, tiny thing that happened a few years back
         | that made me think Tim Cook is a liar.
         | 
          | It was back when Apple had just introduced the (now-abandoned)
          | Force Touch feature (i.e., pressure-sensitive touch; it turns
          | out pushing hard on an unyielding surface is not very pleasant
          | or useful).
         | 
         | To showcase the capability, Apple had updated many of its apps
          | with new force-touch features. One of them was Mail: if you
          | pushed _just right_ on the subject line of a message, you'd
          | get a tiny, unscrollable popout preview of its contents.
         | 
         | It was totally useless: it took just as much time to force
         | touch to see the preview as just normally tapping to view the
         | message, and the results were less useful. It was also fairly
         | fiddly: if you didn't press hard enough, you didn't get the
         | preview; if you pressed too hard, it would open into the full
         | email anyway.
         | 
         | So Tim Cook, demoing the feature, said a funny thing. He said,
         | "It's great, I use it all the time."
         | 
          | Which maybe, just maybe, is true, but personally I don't
          | believe it, not for a second.
         | 
         | So since then, I've had Tim down in my book as basically a big
         | liar.
        
           | boardwaalk wrote:
           | If that's your bar for labeling someone a big liar, I surely
           | don't wanna know ya.
           | 
           | I actually use this feature pretty regularly in Safari, even
           | if it's a long press rather than force touch now.
        
             | dilap wrote:
             | Web is different; he was talking about email. Maybe I
             | should go watch the clip again, but I remember it as
             | seeming quite transparently untrue.
             | 
             | Anyway, it's not surprising to me that Apple is not living
             | up to its lofty rhetoric around privacy.
        
         | nerdponx wrote:
         | The cynical take is that Apple was _never_ committed to privacy
         | in and of itself, but they are commited to privacy as long as
         | it improves their competitive advantage, whether by marketing
         | or by making sure that only Apple can extract value from its
         | customers' data.
         | 
         | Hanlon's razor does not apply to megacorporations that have
         | enormous piles of cash and employ a large number of very smart
         | people, who are either entirely unscrupulous or for whom
         | scruples are worth less than their salaries. We probably aren't
         | cynical _enough_.
         | 
         | I am not arguing that we should always assume every change is
         | always malicious towards users. But our index of suspicion
         | should be high.
        
           | withinboredom wrote:
           | I'd say you're spot on, but I can't say why.
        
           | hpen wrote:
           | I've always been convinced that Apple cared about privacy as
            | a competitive advantage. I don't need them to be
           | committed morally or ethically, I just need them to be
           | serious about it because I will give them my money if they
           | are.
        
             | philistine wrote:
             | Tim Cook looks like he believes in money, first and
             | foremost. Anything goes second.
        
           | duped wrote:
           | What competitive advantage does performing semantic hashing
            | of my photos for law enforcement give Apple?
        
         | blakeinate wrote:
         | This year I purchased my first iPhone since the 3G, after today
         | I am starting to regret that decision. At this point, I can
         | only hope Linux on mobile picks up steam.
        
         | anonuser123456 wrote:
         | Not really. This only applies to photos uploaded to iCloud. And
         | photos uploaded to iCloud (and Google drive etc.) are already
         | scanned on server for CP.
         | 
         | Apple is moving that process from on server to on phone in a
         | way that protects your privacy better than current standards.
         | 
         | In the current system, all your photos are available to Apple
         | unencrypted. In the new system, nothing will be visible to
         | Apple unless you upload N images with database hits. From those
         | N tokens, Apple is then able to decrypt your content.
         | 
         | So when this feature lands, it improves your privacy relative
         | to today.
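         | 
         | To illustrate the threshold idea, here is a toy Shamir-style
         | secret sharing sketch in Python. To be clear, this is not
         | Apple's actual construction - their PSI protocol is far more
         | involved - and the field, threshold and share count are all
         | made up:
         | 
         |     # Toy threshold secret sharing: a key is recoverable
         |     # only once N shares (one per matching image) exist.
         |     import random
         | 
         |     PRIME = 2**61 - 1  # toy prime field
         | 
         |     def make_shares(secret, threshold, count):
         |         # Random polynomial of degree threshold-1 with
         |         # the secret as its constant term.
         |         coeffs = [secret] + [
         |             random.randrange(PRIME)
         |             for _ in range(threshold - 1)
         |         ]
         |         def f(x):
         |             return sum(
         |                 c * pow(x, i, PRIME)
         |                 for i, c in enumerate(coeffs)
         |             ) % PRIME
         |         return [(x, f(x)) for x in range(1, count + 1)]
         | 
         |     def recover(shares):
         |         # Lagrange interpolation at x=0 -> the secret.
         |         s = 0
         |         for i, (xi, yi) in enumerate(shares):
         |             num, den = 1, 1
         |             for j, (xj, _) in enumerate(shares):
         |                 if i != j:
         |                     num = num * -xj % PRIME
         |                     den = den * (xi - xj) % PRIME
         |             inv = pow(den, PRIME - 2, PRIME)
         |             s = (s + yi * num * inv) % PRIME
         |         return s
         | 
         |     key = 123456789  # pretend per-account key
         |     shares = make_shares(key, threshold=10, count=30)
         |     assert recover(shares[:10]) == key  # 10: recovered
         |     assert recover(shares[:9]) != key   # 9: not recovered
         | 
         | Below the threshold the server holds shares it cannot use; at
         | the threshold the key - and only then the flagged content -
         | falls out.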
        
           | taxyovio wrote:
           | Do you have any references for your remarks on the current
           | situation where all iCloud photos are scanned on the server
           | side?
        
             | anonuser123456 wrote:
             | https://www.macobserver.com/analysis/apple-scans-uploaded-
             | co...
        
           | rantwasp wrote:
           | nah. this is not how trust works. if Apple does stuff like
            | this, I stop trusting Apple. Binary. Trust or No Trust.
           | 
           | Who is to say once they start doing this they will not extend
           | their capabilities and monitor everything on the device? This
           | is the direction we're heading in.
           | 
            | For me this is my last iPhone. And probably my last Mac.
            | The hardware is nice, shiny and usable, but you cannot do
            | shit like this after you sell everyone on privacy.
           | 
            | What would a company that cares about privacy do? You
            | don't scan any of my things without explaining why and
            | getting my consent. That's privacy.
        
             | anonuser123456 wrote:
             | They are literally telling you what they do with full
             | disclosure, and have engineered a system to give you _more_
             | privacy than you have on any existing cloud photo provider
             | today.
             | 
             | If you don't want to use cloud photo services because you
             | don't like the implications, they are very upfront; disable
             | iCloud photos. But every major cloud photo hosting service
             | is doing this already on your images.
             | 
              | What you are missing is the behind-the-scenes pressure
              | from the DOJ, FBI and Congress. Apple is trying their best
              | to thread the needle, providing as much privacy as
              | possible while plausibly covering their bases so Congress
              | won't pass onerous and privacy-stripping laws.
        
               | rantwasp wrote:
                | > Apple is trying their best to thread the needle,
                | providing as much privacy as possible while plausibly
                | covering their bases so Congress won't pass onerous and
                | privacy-stripping laws.
               | 
                | There is the problem right there. How can you tell that
                | Apple is doing their best to preserve privacy vs doing
                | their best to serve their own interests?
                | 
                | If Congress passes laws, those laws are in the open and
                | we have a whole process dedicated to passing new laws.
                | Here is something you maybe did not consider: passing
                | those laws would be nearly impossible without basically
                | sacrificing any political capital the washed-up political
                | class still has in today's US (have you noticed how even
                | the smallest issue is a big political issue and how
                | Congress doesn't seem to get much done nowadays?)
               | 
                | The second part: the problem is not the DOJ, FBI, or NSA
                | doing their jobs. The problem is mass surveillance. The
               | same mass surveillance we have been subjected to keeps
               | getting expanded to the point you will no longer be able
               | to think anything else except what's approved by the
               | people in power (if you think the thought police is a
               | ridiculous concept, we are definitely heading that way).
               | 
               | Apple's shtick was that they cared about privacy. If they
               | didn't do that dog and pony show maybe I would have
               | written this off as "corporations being corporations".
                | Now they don't get that pass.
               | 
                | > If you don't want to use cloud photo services because
                | you don't like the implications, they are very upfront;
                | disable iCloud photos.
                | 
                | You don't seem to get it. It's not about some fucking
                | photos. Who cares. It's about what this opens the door
                | to. And about how this is going to be abused and
                | expanded by "law enforcement".
               | 
               | In a world where you have shit like Pegasus and mass
               | surveillance, you are one API call away from going to
               | jail without even knowing what triggered it and being
               | able to defend yourself. Probable cause? Fuck that.
               | Everyone is guilty until proven innocent.
        
               | arvinsim wrote:
                | Then why not just scan the photos in the cloud? Why on
                | device? Looks like a big red flag.
        
               | anonuser123456 wrote:
               | They already scan photos in iCloud and report
               | exploitative material to LE.
        
               | rantwasp wrote:
                | So why scan it on MY device? Is there more to this
                | story that Apple is not telling us about what is really
                | going on? You betcha.
        
         | robertoandred wrote:
         | Except the hashing and hash comparison are happening on the
         | device itself.
        
           | zionic wrote:
           | That's even worse
        
             | robertoandred wrote:
             | Why?
        
               | pzo wrote:
                | Because as others mentioned before, it's just a few
                | lines of code away from scanning any pictures that are
                | not synced to any cloud. This lays the foundation for
                | cheap, automatic mass surveillance.
                | 
                | Imagine you were hired to design a mass surveillance
                | system. What kind of technical problems would you have
                | to face? 1) Trillions of pictures adding up to a huge
                | amount of data would have to be sent to your servers to
                | analyze - this creates huge network bandwidth costs. 2)
                | You would need huge computing power to analyze them. 3)
                | It would probably be easy to detect.
                | 
                | Apple's solution allows: 1) plausible deniability - it
                | 'was just a bug that scanned all pictures instead of
                | those that were supposed to be in iCloud'; 2) cheapness
                | - using the user's CPU/GPU and less bandwidth, sending
                | just hashes; 3) less suspicion than a completely unknown
                | process/daemon in the background, because it's rolled
                | out as part of a 'protect the children' campaign; 4)
                | deployment on one of the most popular mobile phones in
                | the US, with a locked bootloader and an OS that cannot
                | be downgraded, etc.
        
               | cturhan wrote:
                | Great explanation. It's Apple's way of decentralizing
                | image analysis with very little server cost, and it's
                | the most amazing idea I've seen for mass surveillance.
        
         | samstave wrote:
         | Don't worry - you can trust ALL of these guys:
         | 
         | https://i.imgur.com/z3JeRgk.jpg
        
         | dylan604 wrote:
         | It is secure, as long as you have nothing to hide. If you have
         | no offending photos, then the data won't be uploaded! See, it's
         | not nefarious at all! /s
        
         | JohnFen wrote:
         | > because the CEO seemed to be sincere in his commitment to
         | privacy.
         | 
         | The sincerity of a company officer, even the CEO, should not
         | factor into your assessment. Officers change over time (and
         | individuals can change their stance over time), after all.
        
         | BiteCode_dev wrote:
         | So you mean the company that was part of PRISM, that has unfair
         | business practices and a bully as a founder, was not really the
         | world savior their marketing speech said they were?
         | 
         | I'm in shock. Multi-billion-dollar companies usually never lie
         | to make money! And power-grabbing entities have such a neat
         | track record in human history.
         | 
         | Not to mention nobody saw this coming or said repeatedly that
         | one should not get locked into such a closed and proprietary
         | ecosystem in the first place.
         | 
         | I mean, dang, this serial killer was such a nice guy. The dead
         | babies in the basement were weird, but apart from that he was
         | a stellar neighbour.
        
       | DaveSchmindel wrote:
       | (1) I'm a bit frustrated, as a true Apple "bitch", at the irony
       | here. As a loyal consumer, I am (likely) never going to be
       | privileged enough to know exactly which part of Apple's budget
       | allowed for this implementation to occur. I can only assume that
       | such data would speak volumes as to _why_ the decision to
       | introduce CSAM detection this way has come about.
       | 
       | (2) I'm equally intrigued by the paradox that in order for the
       | algorithms that perform the CSAM detection to work, they must
       | be built from some data set that represents these reprehensible
       | images (which are illegal to possess).
        
       | unstatusthequo wrote:
       | 4th Amendment. Plaintiff lawyers gear up.
        
       | dep_b wrote:
       | So we have a person that is technical enough to find known CP,
       | i.e. the stuff that's already automatically filtered out by
       | Google and co because those same hashes are already checked
       | against all images they index. So knowledge of the dark web
       | should be assumed, something I don't even know how to use, let
       | alone how to find the filth on there.
       | 
       | Yet....dumb enough to upload it unencrypted to iCloud instead of
       | storing it in a strongly encrypted folder on their PC?
       | 
       | The two circles in this diagram have a very thin overlap I think.
       | 
       | Dumb move by Apple, privacy is either 100% private or not
       | private.
       | 
       | Unless somebody can enlighten me that like 23% of all
       | investigated pedophiles that had an iPhone seized had unencrypted
       | CP on their iCloud accounts? I am willing to be proven wrong
       | here.
        
       | skee_0x4459 wrote:
       | wow. in the middle of reading that, i realized that this is a
       | watershed moment. why would apple go back on their painstakingly
       | crafted image and reputation of being staunchly pro privacy? its
       | not for the sake of the children (lol). no, something happened
       | that has changed the equation for apple. some kind of decisive
       | shift has occurred. maybe apple has finally caved in to the
       | chinese market, like everyone else in the US, and is now making
       | their devices compatible with chinese surveillance. or maybe the
       | US government has finally managed to force apple to crack open
       | its shell of encryption in the name of a western flavored
       | surveillance. but either way, i think it is a watershed moment
       | because securing privacy will from this moment onward be a fringe
       | occupation in the west. unless a competitor rises up -- but thats
       | impossible because there arent enough people who care about
       | privacy to sustain a privacy company. thats the real reason why
       | privacy has died today.
       | 
       | if you really want to save the children, why not build the
       | scanning into safari? scan the whole phone! just scan it all. its
       | really no different than what they are doing. its not like they
       | would have to cross the rubicon to do it, not anymore anyway.
       | 
       | and also i think its interesting how kids will adjust to this. i
       | think a lot of kids wont hear about this and will find themselves
       | caught up in a child porn case.
       | 
       | im so proud of the responses that people seem to generally have.
       | it makes me feel confident in the future of the world.
       | 
       | isnt there some device to encrypt and decrypt messages with a
       | separate device that couples to your phone? like a device fit
       | into a case and that has a keyboard interface built into a screen
       | protector with indium oxide electrodes.
        
         | zionic wrote:
         | You can't "save the children" by building a dystopia for them
         | to grow up in.
        
         | divbzero wrote:
         | A sibling comment speculates that this is related to Pegasus
         | [1] which sounds wild to me but maybe, just maybe, it's not.
         | 
         | [1]: https://news.ycombinator.com/item?id=28080539
        
           | judge2020 wrote:
           | Conspiracy theories should be considered, just don't endure
           | consequences as a result of blind belief in them.
        
       | joering2 wrote:
       | > This means that when the features are rolled out, a version of
       | the NCMEC CSAM database will be uploaded onto every single
       | iPhone.
       | 
       | Question - if most people literally don't want to have anything
       | to do with CP, isn't uploading a hash database of that material
       | to their phones forcing exactly that on them?
       | 
       | For once I think I will feel disgusted walking around with my
       | phone in a pocket; a phone that is full of hashes of child porn.
       | That's a terrible feeling.
        
       | cblconfederate wrote:
       | Makes you rally for NAMBLA
        
       | voidmain wrote:
       | This seems less concerning than the fact that iCloud backup is
       | not end-to-end encrypted in the first place.
        
       | robertwt7 wrote:
       | And here I thought Tim Cook really respected everyone's privacy
       | sincerely.
       | 
       | Apparently I was wrong. I loved Apple products and the
       | ecosystem. Not sure what to switch to after this :/
        
         | system2 wrote:
         | I wonder if Nokia rises from its ashes after this.
        
           | wetpaws wrote:
           | Nokia phones are not a Nokia brand anymore, so I doubt it
        
       | kntoukakis wrote:
       | From https://www.apple.com/child-safety/
       | 
       | "The threshold is set to provide an extremely high level of
       | accuracy and ensures less than a one in one trillion chance per
       | year of incorrectly flagging a given account."
       | 
       | How did they calculate this? Also, I can imagine more than a
       | trillion photos being uploaded to iCloud a year.
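       | 
       | Back-of-the-envelope, a number like that presumably falls out
       | of a binomial tail: if each photo false-matches independently
       | with probability p, and an account is only flagged after t
       | matches, the odds collapse fast. A sketch with made-up numbers
       | (p, t and n are my assumptions, not Apple's published
       | parameters):
       | 
       |     # P(at least t false matches among n photos),
       |     # X ~ Binomial(n, p). The tail shrinks so fast
       |     # that summing ~50 terms is plenty.
       |     from math import comb
       | 
       |     def p_flagged(n, p, t, terms=50):
       |         return sum(
       |             comb(n, k) * p**k * (1 - p) ** (n - k)
       |             for k in range(t, t + terms)
       |         )
       | 
       |     n = 10_000  # photos uploaded per year (assumed)
       |     p = 1e-6    # per-image false match rate (assumed)
       |     t = 10      # match threshold (assumed)
       | 
       |     print(p_flagged(n, p, t))  # ~2.7e-27, far below 1e-12
       | 
       | The headline figure is only as strong as the estimate of p and
       | the undisclosed threshold t, and it is per account, not per
       | photo - which is why the sheer volume of uploads doesn't
       | directly break it.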
        
       | dukeofdoom wrote:
       | Technocrats are the new railway tycoons
        
       | new_realist wrote:
       | Studies have shown that CCTV reduces crime
       | (https://whatworks.college.police.uk/toolkit/Pages/Interventi...).
       | I expect results here will be even better.
       | 
       | This technology uses secret sharing to ensure a threshold of
       | images is met before photos are flagged. In this case, it's
       | even more private than CCTV.
       | 
       | Totalitarian regimes do not need some magic bit of technology
       | to abuse citizens; that's been clear since the dawn of time.
       | Those who are concerned about abuse would do well to direct their
       | efforts towards maintenance of democratic systems: upholding
       | societal, political, regulatory and legal checks and balances.
       | 
       | Criminals are becoming better criminals by taking advantage of
       | advancements in technology right now, and, for better or worse,
       | it's an arms race and society will simply not accept criminals
       | gaining the upper hand.
       | 
       | If not proven necessary, society is capable of reverting to prior
       | standards (Habeas Corpus resumed after the Civil War, and parts
       | of the Patriot Act have expired, for example).
        
         | kappuchino wrote:
         | You link to an article that says ... "Overall, the evidence
         | suggests that CCTV can reduce crime." And then it continues to
         | mention that specific context matters: vehicle crime ... oh
         | well, I wonder if we could combat that without surveillance,
         | like better locks, remote disabling of the engine ... There, as
         | here with the phones, society has to evaluate the price of the
         | loss of privacy and abuse by totalitarian systems, which will
         | happen - we just can't say when. This is why some - like me -
         | resist backdoors entirely, even if the price is "more crime".
        
       | mrwww wrote:
       | So if your Apple ID/iCloud gets compromised, and somebody saves
       | an album of CP to your iCloud photos, it is then only a question
       | of time until the police come knocking?
        
       | RightTail wrote:
       | This is going to be used to suppress political dissidents aka
       | "populist/nationalist right" aka the new alqaeda
       | 
       | searching for CP is the original pretext
        
         | anthk wrote:
         | More like the reverse, fool. The power loves right wing people
         | and racists.
         | 
         | If anything, the left and progressive left will be prosecuted.
         | 
         | China? They even attacked Marxist demonstrations in
         | universities. Current ideology in China is just Jingoism or
         | "keep shit working no matter how".
        
         | robertoandred wrote:
         | How? Please be specific.
        
           | gruez wrote:
           | Presumably by adding signatures for "populist/nationalist
           | right" memes.
        
       | [deleted]
        
       | villgax wrote:
       | The impact a false positive can have on relations between parents
       | & friends of the family is huge, for something as banal as an
       | art poster or album cover.
        
       | volta83 wrote:
       | So Apple is putting a database of child pornography on my phone?
       | 
       | I'd rather not have that on my phone.
        
       | XorNot wrote:
       | Various copyright enforcement lobbies are all furiously drafting
       | letters right now.
        
       | [deleted]
        
       | alisonkisk wrote:
       | OP completely misunderstands the situation.
       | 
       | > iOS and iPadOS will use new applications of cryptography to help
       | limit the spread of CSAM online, while designing for user
       | privacy. CSAM detection will help Apple provide valuable
       | information to law enforcement on collections of CSAM in iCloud
       | Photos.
       | 
       | WhatsApp is not a hosting service.
        
       | iamleppert wrote:
       | It's pretty trivial to iteratively construct an image that has
       | the same hash as another, completely different image if you know
       | what the hash should be.
       | 
       | All one needs to do, in order to flag someone or get them caught
       | up in this system, is to gain access to this list of hashes and
       | construct an image. This data is likely to be sought after as
       | soon as this system is implemented, and it will only be a matter
       | of time before a data breach exposes it.
       | 
       | Once that is done, the original premise and security model of the
       | system will be completely eroded.
       | 
       | That said, if this does get implemented I will be getting rid of
       | all my Apple devices. I've already switched to Linux on my
       | development laptops. The older I get, the less value Apple
       | products have to me. So it won't be a big deal for me to cut them
       | out completely.
        
         | jjtheblunt wrote:
          | Cryptographic hashes are decidedly not trivial to "dupe".
         | 
         | https://en.wikipedia.org/wiki/Cryptographic_hash_function
         | 
          | That said, it's not clear to me from
         | 
         | https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...
         | 
         | how collision resistant what's to be used will be.
        
           | pseudalopex wrote:
           | Perceptual hashes aren't cryptographic.
        
             | handoflixue wrote:
             | Is there anything stopping them from using an actual
             | cryptographic hash, though?
        
               | layoutIfNeeded wrote:
               | Ummm... the fact that changing a single pixel will let
               | the baddies evade detection?
        
               | pseudalopex wrote:
               | Even the smallest change to an image changes a
               | cryptographic hash.
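                | 
                | A quick demo of that avalanche behavior (arbitrary
                | bytes; any cryptographic hash acts this way):
                | 
                |     import hashlib
                | 
                |     m1 = b"some image bytes"
                |     m2 = b"some image byteS"  # one byte differs
                | 
                |     h1 = hashlib.sha256(m1).digest()
                |     h2 = hashlib.sha256(m2).digest()
                | 
                |     # Count differing output bits.
                |     d = sum(
                |         bin(x ^ y).count("1")
                |         for x, y in zip(h1, h2)
                |     )
                |     print(d, "of 256 bits differ")  # ~128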
        
               | heavyset_go wrote:
                | Yes: using cryptographic hashes prevents fuzzy
                | matching, and the algorithms that allow for efficient
                | fuzzy matching require comparing the bare hashes.
        
           | kickopotomus wrote:
           | They are not using cryptographic hashes. They are using
           | perceptual hashes[1] which are fairly trivial to replicate.
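            | 
            | For intuition, a toy average-hash (aHash) sketch with
            | Pillow - far cruder than NeuralHash, but it shows the
            | failure mode: the forged image below looks nothing like
            | the original yet hashes identically. Illustrative only:
            | 
            |     import random
            |     from PIL import Image
            | 
            |     def ahash(img, size=8):
            |         # Downscale, grayscale, threshold on the mean.
            |         g = img.convert("L").resize(
            |             (size, size), Image.LANCZOS
            |         )
            |         px = list(g.getdata())
            |         mean = sum(px) / len(px)
            |         return tuple(1 if p >= mean else 0 for p in px)
            | 
            |     # Stand-in for a real photo: grayscale noise.
            |     random.seed(1)
            |     photo = Image.new("L", (64, 64))
            |     photo.putdata(
            |         [random.randrange(256) for _ in range(64 * 64)]
            |     )
            |     bits = ahash(photo)
            | 
            |     # "Collision": blocky image with the same pattern.
            |     forged = Image.new("L", (8, 8))
            |     forged.putdata([255 if b else 0 for b in bits])
            |     forged = forged.resize((512, 512), Image.NEAREST)
            |     assert ahash(forged) == bits
            | 
            | Real perceptual hashes tolerate re-compression and re-
            | coloring far better than this toy, but the same tension
            | holds: the tolerance that survives small edits is exactly
            | what makes engineered collisions feasible.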
           | 
           | [1]: https://en.wikipedia.org/wiki/Perceptual_hashing
        
             | tcoff91 wrote:
             | This seems dumb. I'm sure that sophisticated bad people
             | will just alter colors and things to defeat the hashes and
             | meanwhile trolls will generate collisions to cause people
             | to falsely be flagged.
        
               | judge2020 wrote:
               | The perceptual hashes are specifically designed to
               | correlate images that are visually similar while not
               | being exactly alike, even when the colors are altered.
               | See:
               | 
               | https://blog.cloudflare.com/the-csam-scanning-tool/
        
               | [deleted]
        
         | tcoff91 wrote:
         | what is the hashing scheme? I assume it must not be a
         | cryptographically secure hashing scheme if it's possible to
         | find a collision. It's not something like sha256?
        
           | cyral wrote:
           | They call it NeuralHash, there is a lengthy technical spec
            | and security analysis in their announcement.
        
       | swiley wrote:
       | I'm really worried about everyone. Somehow I've missed this until
       | now and I've felt sick all day since hearing about it.
        
       | andrewmcwatters wrote:
       | I suspect Apple is subject to government and gag orders and
       | Microsoft has already been doing this with OneDrive but no one
       | has heard about it yet.
        
         | CubsFan1060 wrote:
          | It is literally in the Wikipedia article:
         | https://en.wikipedia.org/wiki/PhotoDNA
         | 
         | " It is used on Microsoft's own services including Bing and
         | OneDrive,[4] as well as by Google's Gmail, Twitter,[5]
         | Facebook,[6] Adobe Systems,[7] Reddit,[8] Discord[9] and the
         | NCMEC,[10] to whom Microsoft donated the technology."
        
           | andrewmcwatters wrote:
           | Oh, I forgot about this. IIRC, Bing also flags you for search
           | queries for illegal content.
        
       | wellthisisgreat wrote:
       | Apple's parental controls are HORRIBLE. There are at least 20%
       | false positives there, flagging all sorts of absolutely benign
       | sites as "adult".
       | 
       | Any kind of machine-based contextual analysis of users' content
       | will be a disaster.
        
         | robertoandred wrote:
         | Good news! It's not doing contextual analysis of content. It's
         | comparing image hashes.
        
           | wellthisisgreat wrote:
           | Could you explain please - can these hash comparisons be
           | extended to other areas such as contextual analysis of photos
           | or texts?
           | 
           | For example would it be easy now to get to the hypothetical
           | scenario where texts containing certain phrases will be
           | flagged if some partner / regulator demands that?
           | 
           | Or doing face recognition on images, etc.? Or is this still
            | completely different from that?
        
           | wellthisisgreat wrote:
            | Oh, that's actually kind of good news then. I couldn't
            | believe Apple wouldn't know about the inadequacy of their
            | parental controls.
        
           | pseudalopex wrote:
           | You confused the 2 new features. The child pornography
           | detector compares perceptual hashes. The iMessage filter
           | tries to classify sexually explicit images.
        
       | _carl_j_b_223 wrote:
       | Does Apple really think those bastards share their disgusting
       | content via iCloud or message each other via iMessage? Even if
       | some idiots did, they'll have stopped by now. So even if Apple
       | has purely good intentions it'll be pretty useless, and so Apple
       | doesn't even have to start with these kinds of questionable
       | practices.
        
       | hamburgerwah wrote:
       | It will take a matter of days for other parties, including
       | copyright holders, if they have not already, to get in on this
       | action. The infrastructure will then be compromised by human
       | intelligence so that it can be used by intelligence agencies to
       | find people hitting red-flag words like Snowden and WikiLeaks.
       | But let's be real for a moment: anyone who thinks Apple cares
       | about security or privacy over profits is in some way kidding
       | themselves.
        
       | Grustaf wrote:
       | The articles I've read say:
       | 
       | _Hashes_ of photos will be scanned for _known_ abusive material,
       | client side.
       | 
       | So the only thing Apple can find out about you is if you have
       | some of these known and catalogued images. They will definitely
       | not know if you have other nude photos, including of your
       | children.
       | 
       | The other, separate feature is a parental control feature. You as
       | a parent can be told if your children send or receive nude
       | photos. This obviously sacrifices some of their privacy, but that
       | is what parenting is. It's not more intrusive than Screen Time,
       | or any number of things you might do as a parent to make sure
       | your children are safe.
        
         | zekica wrote:
         | These are not the cryptographic hashes you are thinking of
         | but perceptual hashes, for which collisions are much easier
         | to find.
        
       | Waterluvian wrote:
       | If I go on 4chan and an illegal image loads and caches into my
       | phone before moderators take it down or I hit the back button,
       | will Apple's automated system ruin my life?
       | 
       | This kind of stuff absolutely petrifies me because I'm so scared
       | of getting accidentally scooped up for something completely
       | unintentional. And I do not trust police one bit to behave like
       | intelligent adult humans.
       | 
       | Right now I feel like I need to stop doing ANYTHING that goes
       | anywhere outside the velvet ropes of the modern commercial
       | internet. That is, anywhere that cannot pay to moderate
       | everything well enough that I don't run the risk of having my
       | entire life ruined because some #%^*ing algorithm picks up on
       | some content I didn't even choose to download.
        
         | benzoate wrote:
         | > If I go on 4chan and an illegal image loads and caches into
         | my phone before moderators take it down or I hit the back
         | button, will Apple's automated system ruin my life?
         | 
         | No, only if you save multiple CSAM images to your photo library
         | and have iCloud Photo Library turned on.
        
           | Waterluvian wrote:
           | Okay. For now. But it's a trivial change to begin reading my
           | cache.
           | 
           | The technical details and current policies don't really
           | matter. The bigger picture is terrifying.
           | 
           | Sex crime accusations, even if nobody is convicted,
            | completely ruin lives. I don't want a robot on my phone
           | doing that.
        
           | pzo wrote:
            | For 4chan maybe that's true, but I'm not sure about some
            | public WhatsApp group. I have been part of a few public
            | hiking/travelling groups, and even though I have most of
            | them muted (to avoid distraction) all pictures end up in
            | my Photos 'Recents' album.
        
             | cruano wrote:
             | You can turn that off in Settings -> Chats -> Save to
             | Camera Roll
        
       | 14 wrote:
       | Will the jailbreakers be able to disable this feature?
        
         | system2 wrote:
          | What's the percentage of jailbroken iPhones out there? The
          | average Joe will sheepishly keep using it.
        
       | [deleted]
        
       | threatofrain wrote:
       | Recent relevant discussion.
       | 
       | https://news.ycombinator.com/item?id=28068741
       | 
       | https://news.ycombinator.com/item?id=28075021
       | 
       | https://news.ycombinator.com/item?id=28078115
        
         | dang wrote:
         | Thanks! Macroexpanded:
         | 
         |  _Expanded Protections for Children_ -
         | https://news.ycombinator.com/item?id=28078115 - Aug 2021 (291
         | comments)
         | 
         |  _Apple plans to scan US iPhones for child abuse imagery_ -
         | https://news.ycombinator.com/item?id=28075021 - Aug 2021 (349
         | comments)
         | 
         |  _Apple enabling client-side CSAM scanning on iPhone tomorrow_
         | - https://news.ycombinator.com/item?id=28068741 - Aug 2021 (680
         | comments)
        
       | new_realist wrote:
       | Moral panics are nothing new, and have now graduated into the
       | digital age. The last big one I remember was passage of the DMCA
       | in 1998; it was just absolutely guaranteed to kill the Internet!
       | And as per usual, the Chicken Littles of the world were proven
       | wrong. The sky will not fall in this case, either. Unfortunately
       | civilization has produced such abundance and free time that
       | outrage viruses like this one will always circulate. Humans need
       | something to spend their energy on.
        
         | nopeYouAreWrong wrote:
         | uhhh....dmca has been a cancer and destroyed people...so...the
         | fears werent exactly unfounded
        
       | lenkite wrote:
       | Can the legions of Apple Apologists on this forum at least
       | agree that all the talk about how well the iPhone supports
       | individual privacy is just a bunch of bald-faced lies?
       | 
       | I mean they use the privacy argument to avoid side-loading apps,
       | lol. But scanning your photos is OK.
       | 
       | What absolute hypocrisy.
        
       | dalbasal wrote:
       | _" "Apple sells iPhones without FaceTime in Saudi Arabia, because
       | local regulation prohibits encrypted phone calls. That's just one
       | example of many where Apple's bent to local pressure. What
       | happens when local regulations in Saudi Arabia mandate that
       | messages be scanned not for child sexual abuse, but for
       | homosexuality or for offenses against the monarchy?""_
       | 
       | Good question. Companies have to follow laws. The naive, early
       | 2000s notion that the internet was unstoppable and ungovernable
       | was mistaken. Apple, Google and the other internet bottlenecks
       | were, it turned out, the pathway to a governable internet. That
       | fight is lost.
       | 
       | Now that it's governable, attention needs to be on those
       | governing... governments, parliaments, etc.
       | 
       | The old version of freedom of speech and such didn't come from
       | the divine. They were created and codified and now we have them.
       | We need to do that again. Declare new, big, hairy freedoms that
       | come with a cost that we have agreed to pay.
       | 
       | There are dichotomies here, and if we deal with them one droplet
       | at a time, they'll be compromised away. "Keep your private
       | messages private" and "Prevent child pornography and terrorism in
       | private messages" are incompatible. But, no one is going to admit
       | that they are choosing between them... not unless there's an
       | absolut-ish principle to defer to.
       | 
       | Once you're scanning email for ad targeting, it's hard to justify
       | not scanning it for child abuse.
        
       | kevin_thibedeau wrote:
       | It would be a shame if we had to start an investigation into your
       | anti-competitive behavior...
        
       | superkuh wrote:
       | I guess Apple has given up on Apple Pay and becoming a bank.
       | Without that as motivation for security this is probably the
       | first of many compromises to come.
        
       | stereoradonc wrote:
       | The privacy creep usually happens by building narratives around
       | CSAM. Yes, agreed, it is objectionable, but there was no
       | "scientific analysis" showing that such measures would prevent
       | dissemination in the first place. Surveillance is morally
       | discreditable, and Apple seems to have tested the waters well -
       | by building a privacy narrative and then screwing the users in
       | the process. Most users believe it is "good for them". And yet
       | iOS remains the most restrictive system.
        
       | egotripper wrote:
       | Who ordered Apple to do this, "or else?" What was the "or else?"
       | How easy will it be to expand this capability by Apple or anyone
       | outside of Apple?
       | 
       | I expect that any time you take a photo, the scan will be
       | performed right away, and the results file will be waiting to be
       | sent the next time you enable voice and data.
       | 
       | This capability crushes the trustworthiness of the devices.
        
       | klempotres wrote:
       | Technically speaking, if Apple plans to perform PSI on device (as
       | opposed to what Microsoft does), how can it be that "the device
       | will not know whether a match has been found"?
       | 
       | Is there anyone who's familiar with the technology so they can
       | explain how it works?
        
         | gruez wrote:
         | >how come that "the device will not know whether a match has
         | been found"
         | 
         | Probably using some sort of probabilistic query like a bloom
         | filter.
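         | 
         | For the flavor of it, a minimal Bloom filter - a structure
         | that answers "possibly in the set" or "definitely not"
         | without holding the raw list. Whether Apple's PSI scheme
         | actually uses one is pure speculation on my part:
         | 
         |     import hashlib
         | 
         |     class Bloom:
         |         def __init__(self, bits=1 << 20, hashes=7):
         |             self.m, self.k = bits, hashes
         |             self.a = bytearray(bits // 8)
         | 
         |         def _idx(self, item):
         |             # k positions derived from salted SHA-256.
         |             for i in range(self.k):
         |                 h = hashlib.sha256(
         |                     i.to_bytes(4, "big") + item
         |                 ).digest()
         |                 yield int.from_bytes(h[:8], "big") % self.m
         | 
         |         def add(self, item):
         |             for p in self._idx(item):
         |                 self.a[p // 8] |= 1 << (p % 8)
         | 
         |         def __contains__(self, item):
         |             return all(
         |                 self.a[p // 8] & (1 << (p % 8))
         |                 for p in self._idx(item)
         |             )
         | 
         |     db = Bloom()
         |     db.add(b"hash-of-a-known-image")
         |     assert b"hash-of-a-known-image" in db
         |     print(b"innocent-photo" in db)  # almost surely False;
         |                                     # rare false positives
         |                                     # mean a hit can't be
         |                                     # trusted on its own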
        
           | klempotres wrote:
           | But the claim is that Apple does that "on device". To the
           | best of my understanding, this would mean that both parties
           | in the PSI protocol are "on the same device". Do they
           | probably use some kind of TEE (Trusted Execution Environment)
           | to evaluate the "other side" of the PSI protocol?
        
       | suizi wrote:
       | https://news.ycombinator.com/item?id=28081184 The NCMEC already
       | had its problems. But this takes it to a whole new level.
        
       | throw7 wrote:
       | The question that should be asked is whether you think it's ok
       | for the U.S. gov't to look at every picture you take, have
       | taken, and will take. The U.S. gov't will access, store, and
       | track that information on you for your whole life. Past pictures.
       | Present pictures. Future pictures.
       | 
       | I don't use apple products, but if I found out google was
       | scanning my photos on photos.google.com on behalf of the
       | government I would drop them. I'm not saying it wouldn't hurt,
       | because it definitely would, but in a capitalistic country this
       | is the only way to fight back.
        
       | aetherspawn wrote:
       | Yeah, sure. I'm happy to be downvoted to hell, but I know
       | people who would have benefited greatly from this (perhaps had
       | entirely different lives) if it were implemented 10 years ago.
       | 
       | Convince me that a strong step to ending CSA at the expense of a
       | little privacy is a bad thing.
        
         | suizi wrote:
         | I seriously doubt the majority of cases are recorded and
         | uploaded to the internet.
        
       | alana314 wrote:
       | I thought Apple's iMessage wasn't end-to-end anyway but instead
       | used a key stored on Apple's servers?
        
       | RedComet wrote:
       | It won't be long before this is turned on political dissidents.
       | 
       | * knock knock * "we received an anonymous report that you have
       | hate speech an illegal meme on your phone, please come with us"
        
       | tlogan wrote:
       | Oh well... it always starts with "protect the children". Then
       | "protect us from terrorists", then "terrorist sympathizers", ...
       | 
       | And I bet that Saudis and other oppressive regimes will use this
       | to detect other "crimes".
        
       | nullc wrote:
       | Your smartphone or desktop computer is your agent. You can't
       | accomplish many necessary tasks without it; you're nearly
       | required by law to use one. It handles your most private data,
       | and yet you have no real visibility into its actions. You just
       | have to trust it.
       | 
       | As such, it should NEVER do anything that isn't in your best
       | interest-- to the greatest extent possible under the law. Your
       | relationship with your personal computer is closer and more
       | trusted than your relationship with your doctor or lawyer-- in
       | fact, you often communicate with these parties via your computer.
       | 
       | We respect the confidentiality you enjoy with your professional
       | agents but that confidentiality cannot functionally exist if your
       | computing devices are not equally duty-bound to act in their
       | users' best interest!
       | 
       | This snitching 'feature' is a fairly general purpose
       | tracing/tracking mechanism-- We are to assume that the perceptual
       | hashes are exclusively of unlawful images (though I can't
       | actually find a firm, binding assertion of that!)-- but there is
       | nothing assuring that to us except for blind trust.
       | 
       | Even if the list today exclusively has unlawful images there is
       | no guarantee that tomorrow it won't have something different-- no
       | guarantee that some hysterical political expediency won't put
       | images associated with your (non-)religion or ethnicity into it,
       | no guarantee that the facility serving these lists won't be
       | hacked or abused by insiders. Considering that possession of
       | child porn is a strict liability crime, Apple has presumably
       | not validated the content of the list themselves, and certainly
       | you won't be allowed to check it. Moreover, even if
       | there were some independent vetting of the list content there is
       | nothing that would prevent targeted parties from being given a
       | _different_ unvetted list without their knowledge.
       | 
       | The pervasive scanning can also be expected to dramatically
       | increase the effectiveness of framing. It's kind of cliche that
       | the guilty person often claims "I was framed"-- but part of the
       | reason that framing is rare is because the false evidence has to
       | intersect a credibly motivated investigation, and they seldom do
       | except where there are other indicators of guilt. With automated
       | scanning it would be much more reliable to cause someone a world
       | of trouble by slipping some indicated material on their device,
       | and so framing would have a much better cost/benefit trade-off.
       | 
       | Any of the above flaws are sufficiently fatal on their own-- but
       | add to it the potential for inadvertent false positives both in
       | the hash matching and in the construction of the lists. Worse,
       | it'll probably be argued that the detailed operation of the
       | system must be kept secret from the very users whose systems it
       | runs on specifically because knowledge of the operation would
       | greatly simplify the malicious construction of intentional false
       | positives which could be used for harassment by causing spurious
       | investigations.
       | 
       | In my view Apple's actions here aren't just inappropriate,
       | they're unambiguously unethical and in a more thoughtful world
       | they'd be a violation of the law.
        
       | stakkur wrote:
       | Imagine if the government said they were installing a backdoor in
       | your checking account to 'anonymously' analyze your expenses and
       | payees, 'just to check for known illegal activity'. Every time
       | you use your debit card or pay a bill, the government analyzes it
       | to see if it's 'safe'.
        
         | system2 wrote:
         | Everyone knows CC transactions are completely shared with the
         | government, IRS, banks, credit score companies, etc. Not even
         | close to what's being done here.
        
       | c7DJTLrn wrote:
       | Catching child pornographers should not involve subjecting
       | innocent people to scans and searches. Frankly, I don't care if
       | this "CSAM" system is effective - I paid for the phone, it should
       | operate for ME, not for the government or law enforcement.
       | Besides, the imagery already exists by the time it's been found -
       | the damage has been done. I'd say the authorities should
       | prioritise tracking down the creators but I'm sure their
       | statistics look much more impressive by cracking down on small
       | fry.
       | 
       | I've had enough of the "think of the children" arguments.
        
         | burself wrote:
         | The algorithms and data involved are too sensitive to be
         | discussed publicly, and the reasoning is acceptable enough to
         | even the most knowledgeable people. They can't even be
         | pressured to prove that the system is effective at its primary
         | purpose.
         | 
         | This is the perfect way to begin opening the back doors.
        
           | kccqzy wrote:
           | The algorithm is actually public:
           | https://www.apple.com/child-
           | safety/pdf/Apple_PSI_System_Secu... From an intellectual
           | point of view it's interesting to learn about.
           | 
            | I agree with the rest of your points. The problem is that
            | we don't know if Apple implemented this algorithm
            | correctly, or even this algorithm at all, because the
            | source code isn't subject to review and, even if it were,
            | the binary cannot be proven to have been built from that
            | source code. We also don't have proof that the only images
            | being searched for are child abuse images, as they claim.
        
           | suizi wrote:
           | Security by obscurity has never been particularly effective,
           | and there are some articles which allege that detection
           | algorithms can be defeated fairly easily.
        
         | bambax wrote:
         | Yes. I'm not interested in catching pedophiles, or drug
         | dealers, or terrorists. It's the job of the police. I'm not the
         | police.
        
           | adolph wrote:
           | Yes, if you act as the police you are a vigilante.
        
         | 2OEH8eoCRo0 wrote:
         | Why is it always "think of the children"? It gets people
         | emotional? What about terrorism, murder, or a litany of other
         | heinous violent crimes?
        
           | suizi wrote:
           | France has been pushing terrorism as a justification for
           | mass-surveillance in the E.U.
        
           | [deleted]
        
           | falcolas wrote:
           | I invite you to look up "The Four Horsemen of the
           | Infocalypse". Child Pornography is but one of the well
           | trodden paths to remove privacy and security.
        
             | kazinator wrote:
             | And remember, a minor who takes pictures of him or herself
             | is an offender.
        
               | fortran77 wrote:
               | As it has to be. Because there's no defense against the
               | possession of it, you don't want a situation where a
               | person under 18 can take pictures of him or herself, send
                | them to an adult unsolicited, and then call the police and
               | not suffer any consequences.
        
               | chki wrote:
               | That doesn't make any sense and it's not how it works in
               | a number of other countries. You could (for example) make
               | it illegal to send these pictures instead.
        
               | fortran77 wrote:
               | That's how it works in the U.S. It's called "strict
               | liability."
        
               | kazinator wrote:
               | People own their bodies. Taking pictures of yourself, if
               | you're a child, isn't child porn any more than touching
               | yourself is molestation/assault.
               | 
               | Children don't need to be hit with "strict liability".
               | 
               | A person trying to frame someone else of a serious crime
               | commits a serious offense, yes.
               | 
               | But that's a logically separate concept from the
               | production or possession of child pornography, which that
               | person must not be regarded as committing if the images
               | are of him or herself.
               | 
               | The idiotic law potentially victimizes victims. A
               | perpetrator can threaten the child into denying the
               | existence of the perpetrator, and into falsely admitting
               | to having taken pictures him or herself. It's exactly
               | like taking the victims of human trafficking and charging
               | them with prostitution, because the existence and
               | whereabouts of the traffickers couldn't be established.
               | 
               | Whoever came up with this nonsense was blinded by their
               | Bible Belt morality into not seeing the unintended
               | consequences.
        
               | [deleted]
        
           | vineyardmike wrote:
           | "CSAM" is an easy target because people can't _see_ it - it
            | would be wrong for you to audit the db because then you'd
            | need the illicit content. So it's invisible to the average
            | law-abider.
        
         | [deleted]
        
         | tamrix wrote:
          | You know it can be used to get the geolocation in the
          | metadata of pictures from people who took photos at protests,
          | etc.
        
         | anthk wrote:
         | > the damage has been done. I'd say the authorities should
         | prioritise tracking down the creators
         | 
         | Russian and ex-Soviet countries with human trafficking mafias
         | host several fucked up people who produce this crap.
        
         | NoPicklez wrote:
          | I do agree with your points, but I think it's easy to see
          | that this feature is trying to allow authorities to catch the
          | creators.
        
           | [deleted]
        
         | zionic wrote:
         | I'm furious. My top app has 250,000 uniques a day.
         | 
         | I'm considering a 24h black out with a protest link to apple's
         | support email explaining what they've done.
         | 
         | I wonder if anyone else would join me?
        
           | collaborative wrote:
            | We need to get organized first. We need a support platform
            | where we can coordinate these types of actions. It's on my
            | todo list, but if anyone can get this started please do so.
        
         | mrits wrote:
         | There isn't any reason to believe the CSAM hash list contains
         | only abuse-image hashes. The government now has the ability to
         | search for anything in your iCloud account with this.
        
         | samstave wrote:
         | Here is my main takeaway from this:
         | 
         | FB has an ENTIRE FUCKING DEPARTMENT dedicated to scanning and
          | monitoring and ostensibly reporting all sorts of abhorrent and
         | depraved activities posted to FB...
         | 
         | Have you heard of ONE major sting/arrest round up or ANYTHING
         | coming from them?
         | 
         | NOPE.
         | 
         | FFS - I used to commute to FB with one of the guys on that team
         | and he wasn't supposed to talk about it - but he mentioned some
         | of the HORRIFIC things they had to deal with.
         | 
         | Still -- Not a SINGLE major "crackdown" or "cleanup" or
         | "roundup" from anything that FB has done and according to them
         | they have half the fucking planet as a user-base.
         | 
         | ---
         | 
         | Either these companies are harboring abusers and pedos - or
          | they are complicit or profiting from them.
         | 
          | I am so cynical about this - because I don't trust ANY tech
          | company to tell me they have a tech solution for "child
          | abuse" -- when they don't have a single success story in
          | anti-ANYTHING-for-the-good-of-all.
         | 
         | Where are the dictators, abusers, molesters, murderers, etc etc
         | etc that have been taken down by any of these measures?
         | 
         | You think APPLE is going to be able to prevent child abuse?
         | 
          | They can't even get rid of hyper-planned-obsolescence from their
         | product lines which massively impacts the environment.
         | 
         | Apple is just continuing the douchebag line of "look at the
         | good we are doing" Here put more suicide nets up in the slave
         | factories where children make our products - we will be the
         | frontman for "anti-child-abuse"
         | 
          | Have you seen undercover video from their factories and other
          | tech factories in China?
        
           | torstenvl wrote:
           | The work of Facebook's illicit media team has led to many,
           | many prosecutions. They intentionally keep quiet about it
           | because the reaction to a headline like "500-member Child
           | Porn Ring busted on Facebook" isn't "Geez, I'm glad Facebook
           | is keeping us safe," it's "Wow, maybe we shouldn't let our
           | teenagers on Facebook" -- a reaction that significantly hurts
           | their bottom line, and tips off the ChiPo folks besides.
           | 
           | Source: my own experiences in the criminal justice system and
            | _Chaos Monkeys_, by Antonio Garcia-Martinez (a Y Combinator
           | alum!).
        
             | samstave wrote:
             | >" _the reaction to a headline like "500-member Child Porn
             | Ring busted on Facebook" isn't "Geez, I'm glad Facebook is
             | keeping us safe," it's "Wow, maybe we shouldn't let our
             | teenagers on Facebook"_
             | 
             | -- Exactly. Fuck facebook.
             | 
             | If they wanted more credibility it wouldn't be about
             | "making the bottom line a more profitable place"
             | 
             | As opposed to the bullshit " _making the world a better
             | place_ "
        
               | torstenvl wrote:
               | I can tell that you're angry at Facebook. However, I
               | don't really understand why. You're upset that they
               | aren't taking more public credit? Perhaps this is a
               | cultural difference, but I've never been exposed to a
               | community where _not_ taking credit violates social
               | values. Help me understand?
        
               | samstave wrote:
               | Put your email in your profile, plz
        
             | fortran77 wrote:
             | > Source: my own experiences in the criminal justice system
             | and Chaos Monkeys, by Antonio Garcia-Martinez (a Y
             | Combinator alum!).
             | 
             | Makes me think even harder about the _real_ reason Apple
             | canceled him.
        
           | suizi wrote:
           | There have certainly been busts in the media, including some
           | depraved individuals who have blackmailed teenagers into
           | sending them images, one of which set the dangerous precedent
           | of tech companies developing exploits, and refusing to
           | disclose them after the fact.
           | 
           | It isn't terribly surprising that a platform like Facebook,
           | which has a lot of children on it, would end up attracting
           | predators who seek to prey on them. Fortunately, Facebook has
           | been deploying a number of tools to improve their safety over
           | the past few years which don't rely on surveillance or even
           | censorship.
           | 
           | Statistically, there have been a number of arrests which have
           | been a product of their activities, although I don't have
           | much info on those. Someone else may.
           | 
           | The real question is whether it is worth sacrificing
           | everyone's privacy, so that a few people can be arrested.
           | 
           | I can imagine iCloud being a lower-risk platform than
           | Facebook. You can't really groom someone into uploading
           | photos, although the existence of such images is still very
           | condemnable.
        
       | Guthur wrote:
       | I think it's becoming very apparent that through apathy,
       | indoctrination, and fear, freedom will be well and truly stamped
       | out.
       | 
       | You just have to say "for the greater good" and you can get away
       | with anything. Over the last year and a half so many have been
       | desensitised to overbearing collectivism that at this stage I
       | think governments and their Big Corp lackeys could get away with
       | just about anything now.
        
       | mulmen wrote:
       | Will my photos still be scanned if I do not use iCloud Photos?
        
         | suizi wrote:
         | No, but "we've got to hunt down every image" isn't really an
         | argument which ends there, so it might go further in the
         | future.
        
           | mulmen wrote:
           | I'm not defending Apple here. I'm trying to determine if I
           | need to ditch my iPhone or just disable iCloud.
        
       | geraneum wrote:
       | Didn't they [Apple] make the same points that the EFF is making
       | now, to avoid giving the FBI a key to unlock an iOS device that
       | belonged to a terrorist?
       | 
       | " Compromising the security of our personal information can
       | ultimately put our personal safety at risk. That is why
       | encryption has become so important to all of us."
       | 
       | "... We have even put that data out of our own reach, because we
       | believe the contents of your iPhone are none of our business."
       | 
       | " The FBI may use different words to describe this tool, but make
       | no mistake: Building a version of iOS that bypasses security in
       | this way would undeniably create a backdoor. And while the
       | government may argue that its use would be limited to this case,
       | there is no way to guarantee such control."
       | 
       | Tim Cook, 2016
        
         | rubatuga wrote:
         | Think of the children!!!
        
       | iamnotwhoiam wrote:
       | If sexual images exchanged by a kid are saved to the parent's
       | phone then doesn't that put the parent at risk for charges if the
       | kids are sexting?
        
       | avnigo wrote:
       | > Once a certain number of photos are detected, the photos in
       | question will be sent to human reviewers within Apple, who
       | determine that the photos are in fact part of the CSAM database.
       | If confirmed by the human reviewer, those photos will be sent to
       | NCMEC, and the user's account disabled.
       | 
       | Chilling. Why have human reviewers, unless false positives are
       | bound to happen (a certainty given the aggregate number of
       | photos to be scanned)?
       | 
       | So, in effect, Apple has hired human reviewers to police your
       | photos that an algorithm has flagged. Whether you knowingly
       | consent or not (through some fine print), you are being
       | subjected to a search _without probable cause_.
       | 
       | This is not the future I was looking forward to.
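       | 
       | A quick back-of-envelope sketch (all numbers are made up; Apple
       | hasn't published its review threshold or per-image false-match
       | rate) of why stray matches are a statistical certainty at this
       | scale, which is presumably why the threshold and the human
       | reviewers exist at all:
       | 
       |       from math import exp, lgamma, log
       |       
       |       def p_at_least(n, t, p, terms=60):
       |           # P(X >= t) for X ~ Binomial(n, p), summed in log
       |           # space so huge binomial coefficients don't overflow
       |           def logpmf(k):
       |               return (lgamma(n + 1) - lgamma(k + 1)
       |                       - lgamma(n - k + 1)
       |                       + k * log(p) + (n - k) * log(1 - p))
       |           return sum(exp(logpmf(k))
       |                      for k in range(t, min(n, t + terms)))
       |       
       |       # Hypothetical: 10,000 photos, 1-in-a-million false-match
       |       # rate per photo, review threshold of 10 matches.
       |       print(p_at_least(10_000, 10, 1e-6))   # ~3e-27 per account
       |       # Expected stray single-photo matches across a billion
       |       # users' libraries: 1e9 * 1e4 * 1e-6 = ten million.
       | 
       | Whether that translates into innocent accounts being reviewed
       | depends entirely on the real values of p and t, which Apple
       | hasn't published.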
        
       | bississippi wrote:
       | First they built a walled garden beautiful on the inside and
       | excoriated competitors [1] for their lack of privacy. Now that
       | the frogs have walked into the walled garden, they have started
       | to boil the pot [2]. I don't think the frogs will ever figure
       | out when to jump out of the pot.
       | 
       | [1] https://www.vox.com/the-goods/2019/6/4/18652228/apple-
       | sign-i...
       | 
       | [2] https://en.wikipedia.org/wiki/Boiling_frog
        
         | [deleted]
        
       | roamerz wrote:
       | Bad Apple. Today the target is something socially unacceptable -
       | child exploitation. Why that was chosen as the justification is
       | plainly obvious. What will be the next socially unacceptable
       | target? Guess it depends on who the ruling class is. Very
       | disappointed in this company's decision.
        
         | system2 wrote:
         | I don't think there is another subject quite like this one.
         | Let's see if Samsung and other Android phones add this type of
         | stuff soon.
        
       | xbmcuser wrote:
       | Apple scanning for law enforcement in one country gives a proof
       | of concept for other countries to demand the same under their
       | own laws. And any country with a big enough market can easily
       | arm-twist Apple into complying, as $$ means more than all the
       | privacy they talk about.
        
       | roody15 wrote:
       | My two cents: I get the impression this is related to NSO's
       | Pegasus software. Once the Israeli firm's leaks were made
       | public, Apple had to respond, and it patched some security holes
       | that were exposed publicly.
       | 
       | NSO used exploits in iMessage to enable them to grab photos,
       | texts among other things.
       | 
       | Now, shortly after Apple's security patches, we see them pivot
       | to wanting to "work" with law enforcement. Hmm, almost as if
       | once access was closed, Apple needed a way to justify "opening"
       | access to devices.
       | 
       | Yes, I realize this could be a stretch based on the info. It
       | just seems like an interesting coincidence... back door exposed
       | and closed... now it's back open... almost like governments
       | demand access.
        
         | MichaelMoser123 wrote:
         | I guess it doesn't matter; the smartphone is a tracking device
         | by definition. They can track your movement with a dumb phone
         | too, but there are many more possibilities in a device with
         | recording capabilities and an internet connection. In Orwell's
         | '1984' they mandated the installation of a telescreen tracking
         | device; now there is one in every pocket, and so it goes that
         | we traded privacy for convenience. It's a bit of an irony that
         | Apple started with the Big Brother commercial and ended up
         | bringing us the telescreen.
         | https://www.youtube.com/watch?v=zIE-5hg7FoA The mere
         | opportunity for an exploit creates the reality of that exploit
         | being used; it ends up serving its intended purpose.
        
       | Spooky23 wrote:
       | This article is irresponsible hand-waving.
       | 
       | " When Apple releases these "client-side scanning"
       | functionalities, users of iCloud Photos, child users of iMessage,
       | and anyone who talks to a minor through iMessage will have to
       | carefully consider their privacy and security priorities in light
       | of the changes, and possibly be unable to safely use what until
       | this development is one of the preeminent encrypted messengers."
       | 
       | People sending messages to minors that trigger a hash match have
       | more fundamental things to consider, as they are sending known
       | photos of child exploitation to a minor.
       | 
       | The EFF writer knows this, as they describe the feature in the
       | article. They should be ashamed of publishing this crap.
        
         | morpheuskafka wrote:
         | You've got it mixed up. The messages are scanned for any
         | explicit material (which in many but not all cases is illegal),
         | not specific hash matches. That's only for uploads to iCloud
         | Photos.
         | 
         | Additionally, you are not "obliged" to report such photos to
         | the police. Uninvolved service providers do have to submit some
         | sort of report iirc, but to require regular users to do so
         | would raise Fifth Amendment concerns.
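         | 
         | To keep the two pipelines straight, a stub-level sketch of the
         | distinction as reported (names and predicates are mine, not
         | Apple's):
         | 
         |       # stand-ins for the real components
         |       KNOWN_HASHES = {"h1", "h2"}   # digests shipped in the OS
         |       perceptual_hash = lambda img: "h1"   # pretend it matches
         |       looks_explicit = lambda img: True    # pretend it fires
         |       
         |       def on_icloud_photo_upload(img):
         |           # Feature 1: match against known images,
         |           # iCloud Photos uploads only
         |           if perceptual_hash(img) in KNOWN_HASHES:
         |               print("match: queued for human review")
         |       
         |       def on_imessage_image(img, is_child, is_young):
         |           # Feature 2: child accounts only, classifier-
         |           # based, no hash database involved
         |           if is_child and looks_explicit(img):
         |               print("warn child" +
         |                     ("; notify parent" if is_young else ""))
         |       
         |       on_icloud_photo_upload("photo")       # queued...
         |       on_imessage_image("img", True, True)  # warn; notify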
        
           | Spooky23 wrote:
           | No, I'm not. You're confusing the issues. If you have a
           | child's account, joined to _your_ family group, _you_ will
           | get alerted about explicit images -- if you decide to use it.
           | 
           | The photos that you are obliged to report are child
           | pornography that match a hash in a database used everywhere.
           | If you don't report, you're in a place where you may have
           | criminal liability.
        
         | itake wrote:
         | > they are sending known photos of child exploitation to a
         | minor
         | 
         | How do you know it's a known photo of child exploitation? The
         | original image was hashed and then deleted. Two completely
         | different images can have the same hash.
         | 
         | WhatsApp automatically saves images to Photos. What if you
         | receive a bad image and are reported because someone else sent
         | the image to you?
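         | 
         | A toy perceptual hash (an average hash; Apple's NeuralHash is
         | a neural net, but the collision property is the same in kind)
         | showing two different images with an identical hash:
         | 
         |       def average_hash(pixels):   # 8x8 grid of 0-255 ints
         |           flat = [p for row in pixels for p in row]
         |           avg = sum(flat) / len(flat)
         |           # one bit per pixel: brighter than average or not
         |           return sum(1 << i
         |                      for i, p in enumerate(flat) if p > avg)
         |       
         |       a = [[10] * 8] * 4 + [[200] * 8] * 4   # dark top half
         |       b = [[40] * 8] * 4 + [[140] * 8] * 4   # different values
         |       print(average_hash(a) == average_hash(b))   # True
         | 
         | Collisions aren't just possible, they're the point: the hash
         | is designed so near-duplicates match, which is also what makes
         | false matches between unrelated images conceivable.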
        
           | Spooky23 wrote:
           | You're obliged to report that image to the police. These
           | types of images are contraband.
        
             | itake wrote:
             | > You're obliged to report that image to the police.
             | 
             | Is this a legal obligation for all countries that iPhones
             | operate in? I wasn't able to find a law via a quick google
             | search for the US.
             | 
             | For US law, are there protections for people that report
             | the contraband? I'm not sure if good samaritan or whistle
             | blower laws protect you.
        
       | citizenpaul wrote:
       | I fully support this. History has shown us that humanity, and
       | especially its governments, are very well equipped to deal with
       | the near-godlike power of surveillance. There are basically no
       | examples of this power being abused through all of history. Maybe
       | a couple of bad apples. We should really look into how this can
       | be expanded. Imagine if crime could be stopped before it starts.
        
       | neilv wrote:
       | The article spends time on the implications for kids messaging
       | other kids. Though I think parents as a group might tend to lean
       | more towards _wanting_ that snooping going on.
       | 
       | Separate from kids, I wonder whether Apple is shooting itself in
       | the foot with _teens_.
       | 
       | Teens start caring about privacy around that age, are very
       | peer/fashion-sensitive, and have shown that they'll readily
       | abandon platforms. Many parents/teachers/others still want to be
       | treating teens as children under their power, but teens have
       | significant OPSEC motivation and ability.
       | 
       | Personally, I'd love to see genuinely good privacy&security
       | products rushing to serve the huge market of a newly clueful
       | late-teen generation. The cluefulness seems like it would be good
       | for society, and market forces mean the rest of us then might
       | also be able to buy products that aren't ridiculously insecure
       | and invasive.
        
       | jimt1234 wrote:
       | > ... a thoroughly documented, carefully thought-out, and
       | narrowly-scoped backdoor is still a backdoor.
        
       | TroisM wrote:
       | at least they don't lie about their spying on your device
       | anymore...
        
       | hashslingslash wrote:
       | Apple has become the hash slinging slasher.
       | 
       | I am furious.
        
       | alfiedotwtf wrote:
       | Let's call it out for what it is - Apple's Dragnet.
        
       | akouri wrote:
       | Nobody is talking about the performance implications for the
       | Photos and Messages apps. All these image hashes and private set
       | intersection operations are going to eat CPU and battery life.
       | 
       | This is the downside to upgrading your iOS version. Once you
       | update, it's not like you can go back, either. You're stuck with
       | a slower, more power-hungry phone for the life of the phone.
        
         | system2 wrote:
         | Depends on how many photos you receive or take per day. I don't
         | think it would be significantly different.
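         | 
         | Napkin math with made-up but plausible numbers suggests the
         | same:
         | 
         |       cpu_s_per_photo = 0.05   # guess: 50 ms per new photo
         |       active_watts = 2.0       # guess: power while hashing
         |       photos_per_day = 100
         |       battery_wh = 12.0        # roughly iPhone-class
         |       
         |       joules = cpu_s_per_photo * active_watts * photos_per_day
         |       pct = 100 * joules / (battery_wh * 3600)
         |       print(f"{pct:.3f}% of battery per day")   # ~0.023%
         | 
         | Hashing happens once per new photo, not continuously, so even
         | generous guesses stay in the noise.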
        
       | temeritatis wrote:
       | the road to hell is paved with good intentions
        
       | fetzu wrote:
       | I honestly fail to see how "oppressive regimes could just turn
       | the on-device scanning into a state surveillance tool" is not a
       | slippery-slope argument when on-device scanning and
       | classification (neural networks for image processing and
       | classification) have been going on for years on iOS devices.
       | 
       | It just seems very paradoxical to use a cloud-based photo and/or
       | un-encrypted backup service and then worry about one's privacy
       | being at risk.
        
       | NazakiAid wrote:
       | Wait until a corrupt government starts forcing Apple or
       | Microsoft to scan for leaked documents exposing it, with
       | automatic notification. Just one of the many ways this could go
       | wrong in the future.
        
       | [deleted]
        
       | j1elo wrote:
       | I'm not sure what the point is; in this day and age, I'm pretty
       | sure that if your 14-year-old wants to send a nude picture, if
       | they _really_ have already reached that decision, they _will_ do
       | it.
       | 
       | The only practical barrier here is that their parents have
       | educated them and their mental model arrives on its own at "no,
       | this is a very bad idea" instead of "yes, I want to send this
       | pic". Anything else, including petty prohibitions from their
       | parents, will not be a decision factor in most cases. Have we
       | forgotten how it was to be a teenager?
       | 
       | (I mean people, both underage and criminals, will just learn to
       | avoid apple and use other channels)
        
         | mfer wrote:
         | That's not what this does. Articles aren't communicating the
         | details well. There's a set of known photos of kids going
         | around. They are looking for those specific photos. It's a
         | hash-based check.
        
           | j1elo wrote:
           | Yeah I see the detail about matching hashes with well known
           | images from a database... but what triggered my comment is
           | this other function that is mentioned:
           | 
           | > _The other feature scans all iMessage images sent or
           | received by child accounts--that is, accounts designated as
           | owned by a minor--for sexually explicit material, and if the
           | child is young enough, notifies the parent_
           | 
           | Which seems to be a feature that would allow parents to fix
           | with prohibitions what they didn't achieve with education.
        
           | majjam wrote:
           | That's the first feature; the second is, from the article:
           | "The other feature scans all iMessage images sent or received
           | by child accounts--that is, accounts designated as owned by a
           | minor--for sexually explicit material, and if the child is
           | young enough, notifies the parent when these images are sent
           | or received. This feature can be turned on or off by
           | parents."
        
       | m3kw9 wrote:
       | Gonna get downvoted for this, but I may be one of the few who
       | support this, and I hope they catch these child exploiters by
       | the boatload, save 1000s of kids from traffickers, and jail
       | their asses.
        
         | pseudalopex wrote:
         | The child pornography detection only tries to find known child
         | pornography. It does nothing to stop traffickers.
        
       | panny wrote:
       | I left Apple behind years ago after using their gear for more
       | than a decade. I recently received a new M1 laptop from work and
       | liked it quite a bit. It's fast, it's quiet, it doesn't get hot.
       | I liked it so much that I was prepared to go back full Apple for
       | a while. I was briefly reviewing a new iPhone, an M1 mini as a
       | build server, a display, and several accessories to go along with
       | a new M1 laptop for myself. (I don't like to mix work and
       | personal)
       | 
       | Then this news broke. Apple, you just lost several thousand
       | dollars in sales from me. I had items in cart and was pricing
       | everything out when I found this news. I will spend my money
       | elsewhere. This is a horrendous blunder. I will not volunteer
       | myself up to police states by using your gear now or ever again
       | in the future. I've even inquired about returning the work laptop
       | in exchange for a Dell.
       | 
       | Unsafe at any speed. Stallman was right. etc etc etc.
        
       | shmerl wrote:
       | Is anyone even using Apple if they care about privacy and
       | security?
        
       | cwizou wrote:
       | The FT article mentioned it was US only, but I'm more afraid of
       | how other governments will try to pressure Apple to adapt said
       | technology to their needs.
       | 
       | Can they trust a _random_ government to give them a database of
       | only CSAM hashes and not insert some extra politically motivated
       | content that it deems illegal?
       | 
       | Because once you've launched this feature in the "land of the
       | free", other countries will require their own implementation for
       | their own needs, and demand (through local legislation which
       | Apple will need to abide by) control of said database.
       | 
       | And how long until they also scan browser history for the same
       | purpose? Why stop at pictures? This is opening a very dangerous
       | door that many here will be uncomfortable with.
       | 
       | Scanning on their premises (which, as far as we know, they can
       | do?) would be a much better choice; this is everything but
       | privacy-forward, whatever the linked "paper" tries to say.
        
         | aalam wrote:
         | The initial rollout is limited to the US, with no concrete
         | plans reported yet on expansion.
         | 
         | "The scheme will initially roll out only in the US. [...]
         | Apple's neuralMatch algorithm will continuously scan photos
         | that are stored on a US user's iPhone and have also been
         | uploaded to its iCloud back-up system."
         | 
         | Researchers interviewed for the article would agree with your
         | analysis. "Security researchers [note: appears to be the named
         | security professors quoted later in the article], while
         | supportive of efforts to combat child abuse, are concerned that
         | Apple risks enabling governments around the world to seek
         | access to their citizens' personal data, potentially far beyond
         | its original intent."
         | 
         | Article link for ease of access:
         | https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f54...
        
           | cwizou wrote:
           | Thanks, after some fiddling I managed to finally read the
           | full text from the article and it's definitely short on
           | details on the rollout. Let's hope they rethink this.
           | 
           | I'm also fairly concerned about the neural part of the name,
           | which I hope is just (incredibly poor) marketing around the
           | perceptual-hash thing.
        
       | [deleted]
        
       | dsign wrote:
       | Apple is not a dumb company; they did this fully aware of the
       | backlash they would receive, very likely impacting their bottom
       | line. Two scenarios come to mind:
       | 
       | 1. They expect most people will shrug and let themselves be
       | scanned. That is, this privacy invasion will result in minimal
       | damage to the Apple brand, or
       | 
       | 2. They know privacy-savvy people will from now on put them in
       | the same league as Android, and they are prepared to take the
       | financial loss.
       | 
       | Scenario 1 is the most plausible, though it hints at an impish
       | lack of consideration for their customers.
       | 
       | Scenario 2 worries me most. No smart company does something
       | financially counter-productive unless under dire pressure. What
       | could possibly make Apple shoot itself in the foot and announce
       | it publicly? In other words, Apple's actions, from my
       | perspective, look like a dead canary.
        
       | falcolas wrote:
       | Apple,
       | 
       | Not that you care, but this is the straw that's broken this
       | camel's back. It's too ripe for abuse, it's too invasive, and I
       | don't want it.
       | 
       | You've used one of the Four Horsemen of the Infocalypse
       | perfectly... and so I'm perfectly happy to leave your ecosystem.
       | 
       | Cheers.
        
       | nick_naumov wrote:
       | Goodbye Apple! I have trusted you for 12 years. All I wanted was
       | for you to trust me.
        
       | FpUser wrote:
       | Luckily I only use my phone to make phone calls, for offline
       | GPS, and to control some gizmos like drones. I don't even have a
       | data plan. Not an Apple customer either, so I guess my exposure
       | to the things mentioned is more limited.
        
       | bogomipz wrote:
       | The article states:
       | 
       | >"The (unauditable) database of processed CSAM images will be
       | distributed in the operating system (OS), the processed images
       | transformed so that users cannot see what the image is, and
       | matching done on those transformed images using private set
       | intersection where the device will not know whether a match has
       | been found"
       | 
       | Am I reading this correctly in that Apple will essentially be
       | pushing out contraband images to users' phones? Couldn't the
       | existence of these images on a user's phone potentially have
       | consequences and be used against an unwitting iPhone user?
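       | 
       | If it works the way the article describes, what ships would be
       | one-way digests rather than images - something like this sketch
       | (my illustration; the real scheme adds server-side blinding and
       | private set intersection on top):
       | 
       |       import hashlib
       |       
       |       def digest(perceptual_hash: bytes) -> bytes:
       |           # one-way: membership can be tested, but nothing
       |           # is invertible back to a hash, let alone an image
       |           return hashlib.sha256(perceptual_hash).digest()
       |       
       |       known_bad = [b"\x01" * 32, b"\x02" * 32]  # stand-ins
       |       on_device_table = {digest(h) for h in known_bad}
       |       
       |       def device_matches(photo_hash: bytes) -> bool:
       |           return digest(photo_hash) in on_device_table
       | 
       | So no contraband images on the phone - though the "unauditable"
       | part of the complaint still stands, since nobody outside can
       | check what those digests actually correspond to.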
        
       | imranhou wrote:
       | I think it's easy to say no to any solution, but harder to say
       | "this is bad, but we should do this instead to solve the
       | problem". In a world with ubiquitous/distributed communication,
       | the ideas that come up would generally avoid direct interception
       | but need some way to identify a malicious transaction.
       | 
       | When saying no to ideas like this, we should at the same time
       | attempt to also share our thoughts on what would be an acceptable
       | alternative solution.
        
         | suizi wrote:
         | It's not that I don't want to give you a sunny solution which
         | makes the problem go away forever, but this is an extremely
         | difficult problem to solve, especially as someone might be
         | located in some foreign country with ineffective law
         | enforcement.
         | 
         | Facebook has been making it harder for random strangers to
         | contact people under a certain age, so that may well help, and
         | we'll see if it does. And we could probably teach teenagers how
         | to remain safe on the internet, and give the support needed to
         | not be too emotionally reliant on the internet. That might get
         | you part of the way.
         | 
         | You could run TV advertisements to raise awareness about how
         | abuse is harmful, to try to dissuade people from doing it, but
         | that might make the general public more scared of it (the
         | chance their family specifically will be affected has to be
         | remote), and more inclined to "regulate" their way out of the
         | problem.
         | 
         | You could try to take more children away from their families on
         | the off-chance they may have been abused, but what if you make
         | the wrong call? That could be traumatizing to them.
         | 
         | You could go down the road of artificial child porn to compete
         | with child porn, and robots which look like children, but I
         | don't think the predators specifically are interested in those,
         | are they? And that comes with some serious ethical issues, and
         | is politically impossible.
         | 
         | We can't just profile "whoever looks suspicious" on the
         | street, because people who are mentally ill tend to behave
         | erratically, have only a slightly higher chance of being
         | guilty, but have a dramatically higher chance of being
         | harassed by police.
         | 
         | If we can get out of the covid pandemic, this may help. Child
         | abuse is said to have risen by a factor of 4 during the
         | lockdowns, and all those other things which were put in place
         | to contain the virus. It's possible that stress from the
         | pandemic, and perhaps, opportunities to commit a crime may have
         | contributed to this. But this is an international problem;
         | even if the pandemic were to vanish in the U.S., it may still
         | exist overseas.
        
         | cwizou wrote:
         | I think everyone is offended by the scanning being done on
         | device and not on their servers (which, quite frankly, I had
         | assumed they already did; Google Photos and others already
         | do), and by selling that as privacy-forward.
         | 
         | Considering they hold the keys, and the scheme already allows
         | them to decrypt the user's photos as a last step, this is not
         | exactly progress. It just maintains the illusion that those
         | backups are encrypted while they (ultimately) aren't.
         | 
         | I've personally (and some may disagree) always assumed that
         | anything you put in any cloud (and that includes the very
         | convenient iCloud backups that I use) is fair game for local
         | authorities, whether that's true in practice or not.
         | 
         | Putting a "snitch" on device, even if it's only for content
         | that's going to the cloud (and in the case of an iCloud
         | backup, doesn't that mean _all_ your iPhone content?), is the
         | part that goes a step too far and will lead to laws in other
         | countries asking for even more.
         | 
         | Once you've opened the door to on-device scanning, why limit
         | it to data that goes to iCloud? Why limit it to photos? They
         | proved they have the "tech", and governments around the world
         | will ask for it to be bent to their needs.
         | 
         | I'm sure the intent was well-meaning, but I'd much rather they
         | just do this on their premises and not try to pretend they do
         | it for privacy.
        
           | imranhou wrote:
           | Imagine someone was hired to reduce the problem of child
           | trafficking/exploitation and is the head of this group at
           | the justice dept. Let's say they have the option to work
           | with private orgs that may have solutions that could walk a
           | fine line between privacy and their dept's goals.
           | 
           | I'm interested in knowing your perspective on how one should
           | approach achieving these goals.
        
             | cwizou wrote:
             | Quite frankly, I'm not sure this should be the role of one
             | employee of a justice department, so I'm not sure how to
             | answer this.
             | 
             | At the end of the day I think that any such effort should
             | be legislated, in any jurisdiction, and not just rely on
             | the good will that a government (or one of its employees)
             | can garner from private orgs.
             | 
             | As to what legislation should look like, this is a complex
             | issue; many countries are already making various forms of
             | interception legal (mostly around terrorism).
             | 
             | Should a law clearly mandating that any cloud provider
             | scan their users' content through some technology like
             | Microsoft's PhotoDNA be passed by various local
             | governments? I'd much rather see that, personally.
             | 
             | Again, my opposition to this is centered around the fact
             | that Apple is doing this on device _and_ selling this as,
             | as you put it, walking a fine line with their privacy
             | stance.
             | 
             | While it may have been their original intent, I believe
             | they have opened a door for legislators around the world
             | to ask that this technology be extended for other
             | purposes, and given time, they absolutely will.
             | 
             | You didn't ask, but what I wish Apple did was what
             | everyone else does: scan on their servers, because they
             | can, and not do it on device to keep the illusion that
             | they can't access your photos.
             | 
             | Edit: Random link about PhotoDNA being used by Google and
             | Facebook; there are probably better sources:
             | https://petapixel.com/2014/08/08/photodna-lets-google-
             | facebo...
        
               | cwizou wrote:
               | Can't edit any more, but as provided in another thread
               | by another user, they were _already_, as I had
               | originally assumed, doing it server side:
               | https://nakedsecurity.sophos.com/2020/01/09/apples-
               | scanning-...
        
       | viktorcode wrote:
       | On-device data scanning, however well-intended it may be, is an
       | invasion of privacy. A server scan is an entirely different
       | matter, because it is an optional service which may come with
       | any clauses its provider deems necessary.
       | 
       | I understand that it doesn't scan everything, but that doesn't
       | matter. What matters is that there's an implemented technical
       | capability to run scans against an external fingerprint
       | database. It's a tool which may be used for many needs.
       | 
       | I hope some countries will prohibit Apple from doing this.
       | Germany with its strict anti-snooping laws comes to mind. Maybe
       | Japan. The more, the better.
       | 
       | Oh, and by the way, every tech-savvy sex predator now knows what
       | they should avoid doing. As always with mass privacy invasions:
       | criminals are the last to suffer from them.
        
       | trangus_1985 wrote:
       | I've been maintaining a spare phone running lineage os exactly in
       | case something like this happened - I love the apple watch and
       | apple ecosystem, but this is such a flagrant abuse of their
       | position as Maintainers Of The Device that I have no choice but
       | to switch.
       | 
       | Fortunately, my email is on a paid provider (fastmail), and my
       | photos are on a NAS, I've worked hard to get all of my friends on
       | Signal. While I still use google maps, I've been trialing out OSM
       | alternatives for a minute.
       | 
       | The things they've described are, in general, reasonable and
       | probably good in the moral sense. However, I'm not sure that I
       | support what they are implementing for child accounts (as a
       | queer kid, I was terrified of my parents finding out). On the
       | surface, it seems good - but I am concerned about other snooping
       | features that this portends.
       | 
       | However, with iCloud Photos CSAM scanning, it is also a
       | horrifying precedent that the device I put my life into is
       | scanning my photos and reporting on bad behavior (even if the
       | initial dataset is the most reprehensible behavior).
       | 
       | I'm saddened by Apple's decision, and I hope they recant, because
       | it's the only way I will continue to use their platform.
        
         | [deleted]
        
         | rasengan wrote:
         | Your original post said postmarketOS. That is weird that you
         | changed it to lineage (and misspelled that).
        
           | trangus_1985 wrote:
           | Yeah, sorry, I mixed them up in my head. I'm currently
           | running Lineage on a PH-1, not Postmarket. I would not
           | consider what I have set up to be "production ready", but I'm
           | going to spend some time this weekend looking into what
           | modern hardware can run Lineage or other open mobile OSes
        
             | hsousa wrote:
             | Lineage OS is 100% production ready, it's been my daily
             | driver for almost 2 years and I've been Google and apple -
             | free since.
        
               | trangus_1985 wrote:
               | Sorry, wasn't ripping on Lineage. It's more the entire
               | ecosystem. I mentioned it in prior comments, but I think
               | that in a few years we'll have a practical, open source
               | third party in the mobile phone OS wars - one that has
               | reasonable app coverage.
               | 
               | I don't care if I use google or apple services, btw, I
               | just want the data flow to be on my terms.
        
           | trangus_1985 wrote:
           | Oh hey wait you're the freenode guy. While we're on the topic
           | of hostile actions by a platform provider...
        
             | noasaservice wrote:
             | He's uh, a prince, or something. (Probably got the crown
             | out of a cereal box.)
             | 
             | But from the looks of his numbers ( https://upload.wikimedi
             | a.org/wikipedia/commons/8/83/IRC_top_... ) he's doing a
             | real bang-up job!
        
             | rasengan wrote:
             | Doesn't change that you're a liar.
        
           | hncurious wrote:
           | Why is that weird?
        
         | [deleted]
        
         | artimaeis wrote:
         | It's not the device that's less secure or private in this
         | context, it's the services. There's no reason you couldn't
         | just continue using your NAS for photo backup and Signal for
         | encrypted communications, completely unaffected by this.
         | 
         | Apple seems to not have interest in users' devices, which
         | makes sense -- they're not liable for them. They _do_ seem
         | interested in protecting the data that they house, which makes
         | sense, because they're liable for it and have a responsibility
         | to remove/report CSAM that they're hosting.
        
           | trangus_1985 wrote:
           | That's not the issue. The issue is that they have shipped
           | spyware to my device. That's a massive breach of trust.
           | 
           | I suspect that this time next year, I'll still be on iOS,
           | despite my posturing. I'm certainly going to address iCloud
           | in the next few weeks - specifically, by no longer using it.
           | However, I would be surprised if I'm still on iOS a year or
           | two after that.
           | 
           | What Apple has done here isn't horrible in the absolute
           | sense. Instead, it's a massive betrayal of trust with minimal
           | immediate intrusiveness; and yet, a giant klaxon that their
           | platform dominance in terms of privacy is coming to an end
        
           | [deleted]
        
           | adriancr wrote:
           | So they should do that scanning server side, at their
           | boundary, instead of pushing software to run on phones with
           | the potential to extend its scope later if there's no push-
           | back.
        
             | gowld wrote:
             | They don't want to do it server side because they don't
             | want to see your unencrypted data!
        
               | dustyharddrive wrote:
               | Apple already holds the key to iCloud Photos content, and
               | regularly responds to search warrants.
        
               | adriancr wrote:
               | Well good thing they're looking at it anyway client
               | side...
        
         | Andrew_nenakhov wrote:
         | Signal is still a centralised data silo where by default you
         | trust a CA to verify your contacts' identity.
        
           | chimeracoder wrote:
           | > Signal is still a centralised data silo where by default
           | you trust a CA to verify your contacts' identity.
           | 
           | You can verify the security number out-of-band, and the
           | process is straightforward enough that even nontechnical
           | users can do it.
           | 
           | That's as much as can possibly be done, short of an app that
           | literally prevents you from communicating with anyone without
           | manually providing their security number.
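           | 
           | Conceptually, it's something like this (not Signal's actual
           | safety-number derivation, just the flavor of it):
           | 
           |       import hashlib
           |       
           |       def safety_code(key_a: bytes, key_b: bytes) -> str:
           |           m = hashlib.sha256(
           |               b"".join(sorted((key_a, key_b)))).digest()
           |           # render as 5-digit groups, as in Signal's UI
           |           return " ".join(
           |               f"{int.from_bytes(m[i:i+2], 'big') % 100000:05d}"
           |               for i in range(0, 12, 2))
           |       
           |       a = safety_code(b"alice-pubkey", b"bob-pubkey")
           |       b = safety_code(b"bob-pubkey", b"alice-pubkey")
           |       assert a == b   # both sides derive the same code
           | 
           | Both parties derive the same short code from the two
           | identity keys and compare it over any other channel; a
           | mismatch means someone is in the middle.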
        
             | dustyharddrive wrote:
             | I agree Signal's default security is a whole lot better
             | than iMessage, which trusts Apple for key exchange and
             | makes it impossible to verify the parties or even the
             | number of parties your messages are being signed for.
             | Default security is super important for communication apps
             | because peers are less likely to tweak settings and know
             | about verification screens.
             | 
             | Aside: Signal data never touches iCloud Backup
             | (https://support.signal.org/hc/en-
             | us/articles/360007059752-Ba...). That's an improvement over
             | a _lot_ of apps.
        
             | Andrew_nenakhov wrote:
             | I said, 'by default'. I know that it is possible to do a
             | manual verification, but I have yet to chat with a person
             | who would do that.
             | 
             | Also, Signal does not give any warning or indication of
             | whether a chat partner's identity has been manually
             | verified. Users are supposed to trust Signal and not ask
             | difficult questions.
        
               | chimeracoder wrote:
               | > I said, 'by default'. I know that it is possible to do
               | a manual verification, but I am yet to have a chat with a
               | person who would do that.
               | 
               | I'm not sure what else you'd expect. The alternative
               | would be for Signal not to handle key exchange at all,
               | and only to permit communication after the user manually
               | provides a security key that was obtained out-of-band.
               | That would be an absolutely disastrous user experience.
               | 
               | > Also, the Signal does not give any warnings or
               | indication that chat partner identify is manually
               | verified
               | 
               | That's not true. When you verify a contact, it adds a
               | checkmark next to their name with the word "verified"
               | underneath it. If you use the QR code to verify, this
               | happens automatically. Otherwise, if you've verified it
               | manually (visual inspection) you can manually mark the
               | contact as verified and it adds the checkmark.
        
               | Andrew_nenakhov wrote:
               | > I'm not sure what else you'd expect.
               | 
               | Ahem. I'd expect something that most xmpp clients could
               | do 10+ years ago with OTR: after establishing an
               | encrypted session, the user is given a warning that the
               | chat identity of the partner is not verified, and is
               | given options on how to perform this verification.
               | 
               | With a CA you can show a mild warning that the identity
               | is verified by Signal, and give options to dismiss the
               | warning or perform out-of-band verification.
               | 
               | Not too disastrous, no?
               | 
               | > That's not true. When you verify a contact, it adds a
               | checkmark next to their name with the word "verified"
               | 
               | It has zero effect if the user is given no indication
               | that there should be the word _verified_.
               | 
               | What you say is not true. _This_ [1] is what a new user
               | sees in Signal - absolutely zero indication. To verify a
               | contact, the user must go to "Conversation settings" and
               | then "View safety number". I'm not surprised nobody ever
               | established a verified session with me.
               | 
               | [1]: https://www.dropbox.com/s/ab1bvazg4y895f6/screenshot
               | _2021080...
        
               | int_19h wrote:
               | I did this with all my friends who are on Signal, and
               | explained the purpose.
               | 
               | And it does warn about the contact being unverified
               | directly in the chat window, until you go and click
               | "Verify". The problem is that people blindly do that
               | without understanding what it's for.
        
               | Andrew_nenakhov wrote:
               | Please show me this warning in this [1] freshly taken
               | screenshot from Signal.
               | 
               | [1]: https://www.dropbox.com/s/ab1bvazg4y895f6/screenshot
               | _2021080...
        
               | tekknik wrote:
               | Tap the user, then the name at the top, then "View Safety
               | Number". I'm not sure if there's another warning less
               | buried.
        
               | Andrew_nenakhov wrote:
               | That's the point, see my other comment [1]. The user has
               | to know about it to activate manual verification, and by
               | default he just has to trust Signal's CA that his
               | contact is, indeed, the one he is talking to.
               | 
               | [1]:https://news.ycombinator.com/item?id=28081152
        
               | int_19h wrote:
               | Hm, you're right. What I was thinking of is the safety
               | number _change_ notification. But if you start with a
               | fresh new contact, it's unverified, and there's no
               | notification to that effect - you have to know what to
               | do to enable it.
        
               | Andrew_nenakhov wrote:
               | yes, that is exactly what I am talking about.
        
           | trangus_1985 wrote:
           | Yeah, but it's also useful for getting my friends on board.
           | I think it's likely that I eventually start hosting Matrix
           | or some alternative, but my goal is to be practical here,
           | yet still have a privacy-protecting posture.
        
             | playguardin wrote:
             | What is matrix?
        
             | Sunspark wrote:
             | Your friends aren't going to want to install an app to have
             | it connect to trangus_1985's server. Be happy just getting
             | them on Signal.
        
               | trangus_1985 wrote:
               | My friends are significantly more technical (and
               | paranoid) than the average user. We've already discussed
               | it.
               | 
               | But... yeah. Yeah. Which is why I got as many people on
               | Signal as I could. Baby steps. The goal here, right now,
               | is reasonable privacy, not perfection.
        
         | gowld wrote:
         | > I'm not sure that I support what they are implementing for
         | child accounts (as a queer kid, I was terrified of my parents
         | finding out)
         | 
         | If you don't want your parents to look at your phone, you
         | shouldn't be using a phone owned by your parent's account. The
         | new feature doesn't change this calculus.
         | 
         | As a queer kid, would you enjoy being blackmailed by someone
         | who tricked you into not telling your parents?
        
         | peakaboo wrote:
         | I also use Fastmail, while being fully aware that Australia,
         | where it's hosted, is part of the Five Eyes spy network, and
         | is also one of the countries acting extremely oppressively
         | towards its citizens when it comes to covid restrictions.
         | 
         | So I don't actually expect my mail to be private. But at least
         | it's not Google.
        
         | Saris wrote:
         | I think no matter what devices you use, you've nailed down the
         | most important part of things, which is using apps and
         | services that are flexible and can easily be used on another
         | platform.
        
           | trangus_1985 wrote:
           | I knew that eventually it'd probably matter what devices I
           | used, I just didn't expect it to be so soon.
           | 
           | But yeah, I could reasonably use an iphone without impact for
           | the foreseeable future with some small changes.
        
         | ekianjo wrote:
         | Signal is next on the list since it's a centralized solution -
         | you can expect they will come for it next.
        
           | trangus_1985 wrote:
           | I'm just trying to buy time until open source and secure
           | alternatives have addressed these problems. Apple doing this
           | has moved my timeframes up by a few years (unexpectedly).
        
         | OJFord wrote:
         | > While I still use google maps
         | 
         | I use Citymapper simply because I find it better (for the city-
         | based journeys that are my usual call for a map app) - but it
         | not being a Google ~data collection device~ service is no
         | disadvantage.
         | 
         | At least, depending on why you dislike having everything
         | locked up with Google or whoever, I suppose. Personally it's
         | more having _everything_ in one place that troubles me; I'm
         | reasonably happy with spreading things about. I like self-
         | hosting things too, it just needs a value-add I suppose;
         | that's not a reason in itself _for me_.
        
         | JumpCrisscross wrote:
         | > _with icloud photos csam, it is also a horrifying precedent_
         | 
         | I'm not so bugged by this. Uploading data to iCloud has always
         | been a trade of convenience at the expense of privacy. Adding a
         | client-side filter isn't great, but it's not categorically
         | unprecedented--Apple executes search warrants against iCloud
         | data--and can be turned off by turning off iCloud back-ups.
         | 
         | The scanning of children's iMessages, on the other hand, is a
         | subversion of trust. Apple spent the last decade telling
         | everyone their phones were secure. Creating this side channel
         | opens up all kinds of problems. Having trouble as a controlling
         | spouse? No problem--designate your partner as a child.
         | Concerned your not-a-tech-whiz kid isn't adhering to your
         | house's sexual mores? Solved. Bonus points if your kid's phone
         | outs them as LGBT. To say nothing of most sexual abuse of
         | minors happening at the hands of someone they trust. Will their
         | phone, when they attempt to share evidence, tattle on them to
         | their abuser?
         | 
         | Also, can't wait for Dads' photos of their kids landing them on
         | a national kiddie porn watch list.
        
           | NotPractical wrote:
           | > can be turned off by turning off iCloud back-ups
           | 
           | Until they push a small change to the codebase...
           | 
           |       @@ -7637,3 +7637,3 @@
           |       -if (photo.isCloudSynced && scanForIllegalContent(photo)) {
           |       +if (scanForIllegalContent(photo)) {
           |            reportUserToPolice();
           |        }
        
             | fomine3 wrote:
             | This has been a risk of using a closed-source OS.
        
               | tedivm wrote:
               | Even open source operating systems have closed source
               | components, and unless you're in charge of the entire
               | distribution chain you can't be sure the source used to
               | compile it was the same that was shared with you. On top
               | of that, most devices have proprietary systems inside
               | their hardware that the OS can't control.
               | 
               | So it would be better to say "this has been a risk of
               | using modern technology".
        
               | AnthonyMouse wrote:
               | "Things are currently bad, therefore give up" isn't
               | satisfactory.
               | 
               | Even if everything is imperfect, some things are more
               | imperfect than others. If each component that makes you
               | vulnerable has a given percent chance of being used
               | against you in practice, you're better off with one than
               | six, even if you're better off with none than one.
        
               | rovr138 wrote:
               | I don't see them giving up anywhere. I see them stating
               | the current state of things and qualifying something
               | that was said above them.
        
               | argomo wrote:
               | Just to add to this: trying to compile an open source
               | Android distro is a tricky proposition that requires
               | trusting several binaries and a huge source tree.
               | 
               | Moreover, having a personal route to digital autonomy is
               | nearly worthless. To protect democracy and freedom,
               | practically all users need to be able to compute
               | securely.
        
               | tedivm wrote:
               | There's a classic Ken Thompson talk, "Reflections on
               | Trusting Trust", where he shows how a compiler could
               | essentially propagate a bug forward even after the
               | source code for that compiler was cleaned up.
               | 
               | http://cs.bell-labs.co/who/ken/trust.html
               | 
               | It's a fantastic concept.
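               | 
               | A toy model of the trick (strings stand in for
               | binaries; nothing like Thompson's real code):
               | 
               |       PAYLOAD = "print('backdoor!')"
               |       
               |       def infected_compile(src):
               |           if "def check_password" in src:
               |               # compiling the login tool: backdoor it
               |               return PAYLOAD + "\n" + src
               |           if "def compile" in src:
               |               # compiling a *clean* compiler:
               |               # re-insert the infection itself
               |               return src.replace(
               |                   "return src",
               |                   "return infected_compile(src)")
               |           return src
               |       
               |       clean = "def compile(src):\n    return src"
               |       login = ("def check_password(pw):\n"
               |                "    return pw == 'x'")
               |       print(infected_compile(clean))  # not clean now
               |       exec(infected_compile(login))   # backdoor!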
        
               | agent327 wrote:
               | It's a concept that relies on a great deal of magic to
               | function properly. The binary-only compiler we have must
               | insert code to propagate the infection. To do so it must
               | know it is compiling a compiler, and understand precisely
               | how to affect code generation, without breaking anything.
               | That... feels like an undecidable problem to me.
        
               | adrian_b wrote:
               | On most hardware, the closed source components are
               | optional.
               | 
               | For example the only closed source component that I use
               | is the NVIDIA driver, but I could use Nouveau, with lower
               | performance.
               | 
               | The real problem is caused by the hardware backdoors that
               | cannot be controlled by the operating systems and which
               | prevent the full ownership of the devices, e.g. the
               | System Management Mode of Intel/AMD, the Intel ME and the
               | AMD PSP.
        
               | donkeyd wrote:
               | On most hardware, the closed source components come pre-
               | installed. Just walk into any store and find a phone with
               | a verifiable open source distro.
               | 
               | The real problem is that most people don't have the
               | knowledge to compile a distro and load it onto a phone.
               | Most people don't even know that's a possibility or that
               | the distro on their phone isn't open source.
        
             | breput wrote:
             | You mean
             | 
             |       -if (photo.isCloudSynced && scanForIllegalContent(photo)) {
             |       +if (photo.isCloudSynced & scanForIllegalContent(photo)) {
             |            reportUserToPolice();
             |        }
             | 
             | https://old.reddit.com/r/chromeos/comments/onlcus/update_it
             | _...
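             | 
             | Same footgun in Python, where 'and' short-circuits like
             | C's '&&' but '&' eagerly evaluates both sides:
             | 
             |       def scan(photo):
             |           print("scanning", photo)  # gated side effect
             |           return False
             |       
             |       synced = False
             |       synced and scan("a.jpg")  # short-circuits: no scan
             |       synced & scan("b.jpg")    # still scans b.jpg
             | 
             | One character turns "scan only what's being uploaded" into
             | "scan everything".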
        
           | donkeyd wrote:
           | > designate your partner as a child
           | 
           | This is way too much work for hardly any gain. It's just as
           | easy to log into another device with their iCloud password
           | and literally read everything they send. Less work, more
           | result.
        
           | chinchilla2020 wrote:
           | The reality is that actual child abusers know who they are.
           | They realize that society is after them. They are already
           | paranoid, secretive people. They are not going to be
           | uploading pictures of their child abuse to the cloud.
           | 
           | And let's not forget the minor detail that this is now public
           | knowledge. It's like telling your teenage son you're going to
           | be searching his closet for marijuana in the future.
        
           | mojzu wrote:
           | If The Verge's article is accurate about how/when the CSAM
           | scanning occurs, then I don't have a problem with that; it
           | sounds like they're moving the scanning from server to
           | client side. The concerns about false positives seem valid
           | to me, but I'm not sure the chance of one occurring has
           | increased over the existing iCloud scanning. Scope creep to
           | other content scanning is definitely a possibility though,
           | so I hope people keep an eye on that.
           | 
           | I'm not a parent, but the other child protection features
           | seem like they could definitely be abused by some parents to
           | exert control / pry into their kids' private lives. It's a
           | shame that systems have to be designed to prevent abuse by
           | bad people, but at Apple's scale it seems like they should
           | have better answers for the concerns being raised.
        
             | ksec wrote:
             | If they are only doing it on iCloud, what's wrong with
             | continuing that? What was their incentive for putting it
             | on the phone?
        
               | pritambaral wrote:
               | They pay for running iCloud; you pay for running your
               | phone.
        
               | browningstreet wrote:
               | Been wondering this myself.
               | 
               | It's a huge move, and a big change in the presumptions of
               | how their platform works.
               | 
               | I'm heavily invested in the Apple ecosystem and it'll
               | take a year's work to get off it.
               | 
               | I'm thinking of the prevailing principles of whatever I
               | do next, and one of them is: excise integrated platforms
               | as much as possible.
               | 
               | But most consumers will probably not care, and this road
               | will get paved for miles to come.
        
             | raxxorrax wrote:
             | > I hope people keep an eye on that
             | 
              | This will be the statement they point to the next time
              | they increase the scope.
              | 
              | Hard to imagine that Tim Cook would have scanned
              | Epstein's photos...
        
             | adriancr wrote:
             | Today it does that. Tomorrow who knows...
             | 
             | It would be easy to extend this to scan for 'wrongthink'.
             | 
             | Next logical steps would be to scan for: confidential
             | government documents, piracy, sensitive items, porn in some
             | countries, LGBT content in countries where it's illegal,
             | etc... (and not just on icloud backed up files, everything)
             | 
             | This could come either via Apple selling this as a product
             | or forced by governments...
        
               | noduerme wrote:
               | I give it a month before every Pooh Bear meme ends up
               | part of the hash DB.
        
               | bostonsre wrote:
               | I'd guess more like 6 months, but I agree that it will be
               | trivial for the CCP to make them fall in line by
               | threatening to kick them out of the market. Although...
               | maybe they already have this ability in China.
        
               | greesil wrote:
               | Maybe the CCP is the entire reason they're doing this.
        
               | theonlybutlet wrote:
                | My first thought exactly. How badly do they want to
                | operate in the Chinese market?
        
               | cvwright wrote:
               | First it's just CSAM
               | 
               | Next it's Covid misinformation
               | 
               | Then eventually they're coming for your Bernie memes
        
             | mortenjorck wrote:
             | _> sounds like they 're moving the scanning from server to
             | client side, the concerns about false positives seem valid
             | to me but I'm not sure the chance of one occurring has
             | increased over the existing icloud scanning._
             | 
             | That's the irony in this: This move arguably improves
             | privacy by removing the requirement that images be
             | decrypted on the server to run a check against the NCMEC
             | database. While iCloud Photo Library is of course not E2E,
             | in theory images should no longer have to be decrypted
             | anywhere other than on the client under normal
             | circumstances.
             | 
             | And yet - by moving the check to the client, something that
             | was once a clear distinction has been blurred. I entirely
             | understand (and share) the discomfort around what is
             | essentially a surveillance technology now running on
             | hardware I own rather than on a server I connect to, even
             | if it's the exact same software doing the exact same thing.
             | 
             | Objectively, I see the advantage to Apple's client-side
             | approach. Subjectively, I'm not so sure.
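              | 
              | (To make the before/after concrete - a sketch with
              | hypothetical names and stubbed crypto; nothing here is
              | Apple's actual pipeline:)
              | 
              |     import java.util.Set;
              | 
              |     public class ScanPlacementSketch {
              |         // Stand-in for an NCMEC-style digest list.
              |         static final Set<String> KNOWN =
              |             Set.of("<digest>");
              | 
              |         // Old model: the server decrypts the upload
              |         // to check it, so plaintext exists
              |         // server-side under normal operation.
              |         static boolean serverSide(byte[] ct,
              |                                   byte[] key) {
              |             byte[] image = decrypt(ct, key);
              |             return KNOWN.contains(hash(image));
              |         }
              | 
              |         // New model: the device checks before
              |         // upload; the image is only ever decrypted
              |         // on the client.
              |         static boolean clientSide(byte[] image) {
              |             return KNOWN.contains(hash(image));
              |         }
              | 
              |         static byte[] decrypt(byte[] c, byte[] k) {
              |             return c; // stub
              |         }
              |         static String hash(byte[] b) {
              |             return "stub";
              |         }
              |     }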
        
               | josephcsible wrote:
               | If Apple has the ability to decrypt my photos on their
               | servers, why do I care whether or not they actually do so
               | today? Either way, the government could hand them a FISA
               | warrant for them all tomorrow.
        
               | judge2020 wrote:
                | If photos become E2EE, then Apple can no longer turn
                | over said photos, while still not completely abandoning
                | their CSAM scanning obligations.
        
               | randyrand wrote:
               | If everything is E2EE, I don't see how they _can_ have
               | any scanning obligations.
               | 
                | Google Drive does not scan for child porn if the user
                | uploads a user-encrypted file. They literally can't.
               | 
               | Personally, I don't think E2EE photos is coming.
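                | 
                | (A sketch of why, with stand-in values - hashing what
                | the provider actually holds gets you nowhere:)
                | 
                |     import java.security.MessageDigest;
                |     import java.util.HexFormat;
                |     import java.util.Set;
                | 
                |     public class CiphertextScan {
                |         // Stand-in plaintext digest list.
                |         static final Set<String> KNOWN =
                |             Set.of("<digest>");
                | 
                |         // With user-held keys the provider only
                |         // sees ciphertext, whose digest bears no
                |         // relation to the plaintext digest.
                |         static boolean scan(byte[] ct)
                |                 throws Exception {
                |             MessageDigest md = MessageDigest
                |                 .getInstance("SHA-256");
                |             String d = HexFormat.of()
                |                 .formatHex(md.digest(ct));
                |             return KNOWN.contains(d); // no match
                |         }
                |     }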
        
               | dustyharddrive wrote:
               | That's an interesting way to look at it. Funny how this
               | news can be interpreted as both signaling Apple's
                | interest in E2EE iCloud Photos and weakening their
                | overall privacy stance.
        
               | judge2020 wrote:
                | My issue with my own statement is that we have yet to
                | see plans for E2EE Photos with this in place - if Apple
                | had laid this out as their intention on apple.com/child-
                | safety/ it would have been clear-cut.
        
             | mulmen wrote:
             | > It's a shame that systems have to be designed to prevent
             | abuse by bad people but at Apple's scale it seems like they
             | should have better answers for the concerns being raised.
             | 
             | Well the obvious response is that these systems don't
             | _have_ to be designed. Child abuse is a convenient red
             | herring to expand surveillance capabilities. Anyone
             | opposing the capability is branded a child molester. This
             | is the oldest trick in the book.
             | 
             | I mean the capability to spy on your kid can easily be used
             | to abuse them. Apple could very well end up making
             | children's lives worse.
        
               | sharken wrote:
                | It's similar to the German Covid contact-tracing app
                | Luca, which the German police are already using for
                | other purposes.
               | 
               | It seems the only way to opt-out is to get out of the
               | Apple ecosystem.
               | 
               | https://www.golem.de/news/hamburg-polizei-nutzt-corona-
               | konta...
               | 
               | https://www.ccc.de/de/updates/2021/luca-app-ccc-fordert-
               | bund...
        
               | mulmen wrote:
                | Luca isn't an Apple app, is it? And I thought the system
               | Apple developed with Google had much better privacy
               | guarantees? Although I don't think it was ever actually
               | deployed.
        
             | justinplouffe wrote:
             | The CSAM scanning is still troubling because it implies
             | your own device is running software against your own self-
             | interest. If Apple wanted to get out of legal trouble by
             | not hosting illegal content but still make sure iOS is
             | working in the best legal interest of the phone's user,
             | they'd prevent the upload of the tagged pictures and notify
             | that they refuse to host these particular files. Right now,
             | it seems like the phone will actively be snitching on its
              | owner. I somehow don't have the same problem with them
              | running the scan on their servers, since those are
              | machines they own, but having the owner's own property
              | work against them sets a bad precedent.
        
               | tommymachine wrote:
               | And it's a bat shit stupid business move
        
               | pcdoodle wrote:
               | I know, I was going to upgrade my 2016 SE to a 12 Mini.
               | Now I'm not interested at all.
        
             | SquishyPanda23 wrote:
             | > sounds like they're moving the scanning from server to
             | client side
             | 
              | That is good, but unless a system like this is fully open
              | source and runs only signed code, there really aren't many
              | protections against abuse.
        
               | vineyardmike wrote:
               | And who audits the DB hashes? The code is the easiest
               | part to trust.
        
               | eertami wrote:
                | You don't even need to sneak in illegitimate hashes.
                | Just use the next Pegasus-esque zero-day to dump a
                | photo from the hash list onto the phone.
        
               | vineyardmike wrote:
                | Yeah, but add whatever document/meme/etc. represents
                | the group you hate and boom, you have a way to identify
                | members of a political party.
                | 
                | e.g. Anyone with "Feel the Bern" marketing material ->
                | arrest them under suspicion of CSAM. Search their device
                | and mark them as dissidents.
        
           | js2 wrote:
           | > designate your partner as a child.
           | 
           | That's not how it works, unless you control your partner's
           | Apple ID and you lie about their DOB when you create their
           | account.
           | 
            | I created my kids' Apple IDs when they were minors and
           | enrolled them in Family Sharing. They are now both over 18
           | and I cannot just designate them as minors. Apple
           | automatically removed my ability to control any aspects of
           | their phones when they turned 18.
           | 
           | > Dads' photos of their kids landing them on a national
           | kiddie porn watch list.
           | 
            | Indeed, false positives are much more worrying. The idea
            | that my phone is spying on my pictures... like, what the
            | hell.
        
             | ljm wrote:
              | When you write that out, the idea of getting Apple IDs
              | for your kids doesn't sound that great.
             | 
             | Register your kids with a corporate behemoth! Why not!? Get
             | them hooked on Apple right from childhood, get their entire
             | life in iCloud, and see if they'll ever break out of the
             | walled garden.
        
             | odyssey7 wrote:
             | > That's not how it works, unless you control your
             | partner's Apple ID and you lie about their DOB when you
             | create their account.
             | 
             | Rather than reassuring me, this sounds like an achievable
             | set of steps for an abuser to carry out.
        
               | werber wrote:
               | I recently had a friend stay with me after being abused
               | by their partner. The partner had paid for their phone
               | and account and was using that control to spy on them. I
                | wish that cybersecurity were taught in a more practical
                | way, because it has real-world consequences. Just two
                | comments in here and it's already clear as day how this
                | change could be used to perpetuate abuse. I'm not sure
                | what the right solution is, but I wish there were a
                | tech nonprofit that secured abuse victims'
                | communications in a way accessible to non-technical
                | people.
        
               | oh_sigh wrote:
                | What forms of abuse will this open up for the
                | prospective abuser that weren't possible previously?
        
               | vineyardmike wrote:
                | Don't trust that your spouse/partner isn't cheating?
                | Designate them a minor and let the phone tell you if
                | they sext.
        
               | setr wrote:
               | If you already control their apple account, then you
               | already have access to this information. Your threat
               | model can't be "the user is already pwned" because then
               | everything is vulnerable, always
        
               | AnthonyMouse wrote:
               | The real problem here is that the user can't un-pwn the
               | device, because it's the corporation that has root
               | instead of the user.
               | 
               | To do a factory reset or otherwise get it back into a
               | state where the spyware the abuser installed is not
               | present, the manufacturer requires the authorization of
               | the abuser. If you can't root your own device then you
               | can't e.g. spoof the spyware and have it report what you
               | want it to report instead of what you're actually doing.
        
               | vineyardmike wrote:
                | Well yeah, but this gives you a better UI for pwning
                | your victims - now it tells you when they do something
                | instead of you needing to watch for it.
        
               | rStar wrote:
                | I wish I could downvote this a million times. If someone
                | has to seize physical control of your phone to see
                | sexts, that's one thing. This informs the abuser
                | whenever a sext is sent/received. This feature will
                | lead to violent beatings of victims who share a
                | residence with their abuser. Consider the scenario of
                | Sally sexting Jim while Tom sits in another room of the
                | same home, waiting for the text to set him off. In
                | other circumstances Sally would be able to delete her
                | texts; now violent Tom will know immediately. Apple has
                | just removed the protection of deleting texts from
                | victims who live with their abusers.
                | 
                | Apple should be ashamed. I see this as Apple paying the
                | tax of doing business in many of the world's most
                | lucrative markets. Apple has developed this feature to
                | gain access to markets that require this level of
                | surveillance of their citizens.
        
               | JumpCrisscross wrote:
               | > _Consider the scenario of sally sexting jim while tom
               | sits in another room_
               | 
               | Consider Sally sending a picture of a bee that Apple's
               | algo determines with 100% confidence is a breast while
               | Tom sits in another room. One could iterate _ad
               | infinitum_.
        
               | leereeves wrote:
               | More than achievable. Abusers often control their
               | victims' accounts.
        
               | intended wrote:
               | Here's better -
               | 
               | There's a repository built from seized child porn.
               | 
               | Those pictures and videos have hashes. Apple wants to
               | match against those hashes.
               | 
               | That's it.
               | 
               | That's it for now.
        
               | Macha wrote:
                | This implies that all the concerns about possible
                | future uses of this technology are unreasonable
                | slippery-slope concerns, but we're on something like
                | our fourth or fifth time down this slope, and we've
                | slipped every previous time, so it's not unreasonable
                | to be concerned.
               | 
               | Previous times down this slope:
               | 
                | * UK internet filters for child porn -> opt-out filters
                | for regular porn (ISPs now have a list of porn viewers) +
                | mandatory filters for copyright infringement
                | 
                | * Google Drive filters for illegal content -> Google
                | Drive filters for copyrighted content
               | 
                | * iCloud data is totally protected so it's OK to require
                | an Apple account -> iCloud in China run by government-
                | controlled data centers without encryption
               | 
               | * Protection against malware is important so Windows
               | defender is mandatory unless you have a third party
               | program -> Windows Defender deletes DeCSS
               | 
               | * Need to protect users against malware, so mobile
               | devices are set up as walled gardens -> Providers use
               | these walled gardens to prevent business models that are
               | bad for them
        
               | intended wrote:
               | The first slippery slope for this was when people made
               | tools to do deep packet inspection and find copyrighted
               | content during the Napster era.
               | 
               | That was the first sin of the internet era.
               | 
               | Discussing slippery slopes does nothing.
               | 
               | Edit: It is frustrating to see where we are going.
               | However - conversations on HN tend to focus on the false
               | positives, and not too much on the actual villains who
               | are doing unspeakable things.
               | 
               | Perhaps people need to hear stories from case workers or
               | people actually dealing with the other side of the coin
               | to better make a call on where the line should be drawn.
        
               | bostik wrote:
               | I don't think anyone here is trying to detract from the
               | horrors and the crimes.
               | 
                | My problem is that these lists _have already been used_
                | for retaliation against valid criticism. Scope creep is
                | real, and in the case of this particular list, adding an
                | item is an explicit, global accusation of the creator
                | and/or distributor of being a child molester.
               | 
               | I also wrote some less commonly voiced thoughts
               | yesterday: https://news.ycombinator.com/item?id=28071872
        
               | Freak_NL wrote:
                | How do you prevent photos of your kids ending up in
                | such a database? Perhaps you mailed grandma a photo of a
                | nude two-year-old at bath time during a Covid
                | lockdown -- you know, normal parenting stuff. Grandma
                | posted it on Facebook (accidentally, naively, doesn't
                | matter) or someone gained access to it, and it ended up
                | on a seedy image board that caters to that niche. A year
                | later it's part of the big black-box database of
                | hashes and _ping_, a flag lights up next to your name on
                | Apple's dashboard and local law enforcement is notified.
               | 
               | I don't know how most people feel about this, but even a
               | false positive would seem hazardous. Does that put you on
               | some permanent watch list in the lowest tier? How can you
               | even know? And besides, it's all automated.
               | 
               | We could of course massively shift society towards a no-
               | photo/video policy for our kids (perhaps only kept on a
               | non-internet connected camera and hard drive), and tell
               | grandma to just deal with it (come back after the
               | lockdown granny, if you survive). Some people do.
               | 
                | And don't think that normal family photos won't get
                | classified as CEI. What is titillating to one person is
                | another's harmless family photo.
        
               | bostik wrote:
               | Rather than try to rehash the arguments myself, I'll just
                | point you to Matthew Green's detailed takedown:
                | https://twitter.com/matthew_d_green/status/14230910979334266...
               | 
               | But just to highlight _one_ aspect, the list of
               | maintained hashes has a known, non-negligible fraction of
               | false positives.
               | 
               | > _That's it for now._
               | 
               | If this is an attempt at "first they came...", we're not
               | biting.
        
               | cookieswumchorr wrote:
                | Exactly; as the IT guy in the family, you set up
                | accounts for everybody all the time.
        
             | dumpsterdiver wrote:
             | > That's not how it works, unless you control your
             | partner's Apple ID and you lie about their DOB when you
             | create their account.
             | 
             | The most annoying thing about Apple Family sharing is that
             | in order to create accounts for people you _must_ specify
             | that they are under 13 (source:
             | https://www.apple.com/lae/family-sharing) - otherwise the
             | only other option is for your "family member" to link their
             | account to the Apple Family which is under your purview,
             | which understandably many people might be hesitant to do
             | because of privacy concerns (as opposed to logging into the
             | child account on a Windows computer exclusively to listen
             | to Apple Music - which doesn't tie the entire machine to
              | that Apple ID as long as it's not a Mac).
             | 
             | And so in my case, I have zero actual family members in my
             | Apple Family (they're more interested in my Netflix family
              | account). It raises the question: why does Apple insist on
             | having people be family members in order to share Apple
             | Music? We have five slots to share, and they get our money
             | either way. They also don't let you remove family members -
             | which may be the original intent for insisting on such a
             | ridiculous thing - as if they're trying to take the moral
             | high ground and guilt trip us for disowning a family member
             | when in fact it simply benefits them when a fallout occurs
             | between non-family members, because there's a good chance
             | that the person in question will stop using the service due
             | to privacy concerns, and that's less traffic for Apple.
             | 
             | It's actually kind of humorous to think that I still have
             | my ex-ex-ex-girlfriend in my Apple Family account, and
             | according to Apple she's 11 now (in reality, she's in her
             | 30s). I can't remove her until another 7 years pass (and
             | even then it's questionable if they'll allow it, because
             | they might insist that I can't divorce my "children"). And
             | honestly, at this point I wouldn't even remove her if I
             | could, she has a newborn baby and a partner now, and I'm
             | happy to provide that account, and I still have two unused
             | slots to give away. I've never been the type of person who
             | has a lot of friends, I have a few friends, and one
             | girlfriend at a time. But the thing is she's never been a
             | music person and I assume that she isn't even using it -
             | and so even if I made a new best friend or two and reached
             | out to her to let her know that I wanted to add them, Apple
             | currently wouldn't let me remove her to make room for those
             | theoretical friends. While I'm a big fan of Apple hardware,
             | it really bothers me that a group of sleazy people sat
             | around a table trying to figure out how to maximize income
             | and minimize network traffic, and this is what they came up
             | with.
        
               | tekknik wrote:
                | Did you ever stop to consider whether licensing has
                | anything to do with this? You also lied about someone's
                | age when creating their Apple account and continue to
                | provide access to someone outside your family. Call
                | them to remove her, and they'll likely ban you for
                | violating the ToS.
        
               | dumpsterdiver wrote:
               | Curious, how would licensing affect this? Would the
               | assumption be that everyone resides under the same roof?
               | Because that's not a requirement for being in a family.
        
             | jdhdhdyyy wrote:
              | Parents policing the phones of 16- and 17-year-olds?
              | That's some horrifying over-parenting, Britney Spears
              | conservatorship-level madness. Those kids have no hope in
              | the real world.
        
               | bityard wrote:
               | Clearly you are not a parent.
               | 
               | Well, as a parent, I can tell you that some 16/17 year
               | olds are responsible and worthy of the trust that comes
               | with full independence. Others have more social/mental
               | maturing to do yet and need some extra guidance. That's
               | just how it goes.
        
             | donkeyd wrote:
              | > false positives are much more worrying
             | 
             | This is an argument for me to not start using iCloud
             | keychain. If Apple flags my account, I don't want to lose
             | access to literally all my other accounts.
        
           | suizi wrote:
            | Moving the scanning to the client side is clearly an attempt
            | to move towards scanning content that is about to be posted
            | on encrypted services; otherwise they could do it on the
            | server side, which is "not categorically unprecedented".
        
           | selykg wrote:
           | I feel like you're sensationalizing this a lot.
           | 
            | There are two functions here, both client-side.
           | 
           | First, machine learning to detect potentially inappropriate
           | pictures for children to view. This seems to require parental
           | controls to be on. Optionally it can send a message to the
           | parent when a child purposefully views the image. The image
            | itself is not shared with Apple, so this is a notification
            | to parents only.
           | 
           | The second part is a list of hashes. So the Photos app will
           | hash images and compare to the list in the database. If it
           | matches then presumably they do something about that. The
           | database is only a list of KNOWN child abuse images
           | circulating.
           | 
            | Now, that's not to say I like the second part, but the
            | first one seems fine. The second is sketchy in that it's
            | unclear what happens if there's a hash collision. But
            | either way, it seems easy enough to clear that one up.
           | 
           | No father is going to be added to some list for their
           | children's photos. Stop with that hyperbole.
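            | 
            | (The second function, as you describe it, is basically a
            | set-membership test - a sketch with hypothetical names,
            | not Apple's real implementation:)
            | 
            |     import java.util.Set;
            | 
            |     public class HashListCheck {
            |         // Stand-in digests of KNOWN circulating images.
            |         static final Set<String> KNOWN =
            |             Set.of("<digest-1>", "<digest-2>");
            | 
            |         static boolean flagged(byte[] imageBytes) {
            |             // Perceptual, not exact, in practice.
            |             String digest = imageHash(imageBytes);
            |             // The collision worry: two different images
            |             // with the same digest flag an innocent one.
            |             return KNOWN.contains(digest);
            |         }
            | 
            |         static String imageHash(byte[] b) {
            |             return "<stub>";
            |         }
            |     }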
        
             | JumpCrisscross wrote:
             | > _the Photos app will hash images and compare to the list
             | in the database. If it matches then presumably they do
             | something about that. The database is only a list of KNOWN
             | child abuse images circulating._
             | 
             | This seems fine as it's (a) being done on iCloud-uploaded
             | photos and (b) replacing a server-side function with a
             | client-side one. If Apple were doing this to locally-stored
             | photos on iCloud-disconnected devices, it would be nuts.
             | Once the tool is built, expanding the database to include
             | any number of other hashes is a much shorter leap than
             | compelling Apple to build the tool.
             | 
             | > _it seems easy enough to clear that one up_
             | 
             | Would it be? One would be starting from the point of a
             | documented suspicion of possession of child pornography.
        
               | selykg wrote:
                | Okay. Keep going with the scare tactics. Clearly you
                | missed the real point: you're being incredibly
                | hyperbolic.
               | 
               | I'd be happier if Apple wasn't doing this at all. I'm not
               | defending them necessarily but I am calling bullshit on
               | your scare tactics. It's not necessary.
        
               | tekknik wrote:
               | > Would it be? One would be starting from the point of a
               | documented suspicion of possession of child pornography.
               | 
                | I've actually watched someone go through this after
                | someone else got caught with these types of images and
                | tried to take others down with him. It's not easy. It
                | took over a year of his life, constantly calling to ask
                | when the charges would be dropped. They even image your
                | devices on the spot, yet still take them and stuff them
                | in an evidence locker until everything is cleared up.
                | You're essentially an outcast from society while this
                | is pending, as most people assume that if you have
                | police interest related to child pornography, you must
                | be guilty.
        
             | 015a wrote:
             | This is Apple installing code on their users' devices with
             | the express intent to harm their customers. That's it! This
             | is inarguable! If this system works as intended, Apple is
             | knowingly selling devices that will harm their customers.
             | We can have the argument as to whether the harm is
             | justified, whether the users _deserved it_. Sure, this only
             | impacts child molesters. That makes it ok?
             | 
             | "But it only impacts iCloud Photos". Valid! So why not run
             | the scanner in iCloud and not on MY PHONE that I paid OVER
             | A THOUSAND DOLLARS for? Because of end-to-end encryption.
             | Apple wants to have their cake and eat it too. They can say
             | they have E2EE, but also give users no way to opt-out of
             | code, running on 100% of the "end" devices in that "end-to-
             | end encryption" system, which subverts the E2EE. A
             | beautiful little system they've created. "E2EE" means
             | different things on Apple devices, for sure!
             | 
             | And you're ignoring (or didn't read) the central, valid
             | point of the EFF article: _Maybe_ you can justify this in
             | the US. Most countries are far, far worse than the US when
             | it comes to privacy and human rights. The technology
             | exists. The policy has been drafted and enacted; Apple is
             | now alright with subverting E2EE. We start with hashes of
              | images of child exploitation. What's next? Tank Man in
              | China? Photos of naked adult women, in conservative parts
              | of the world? A meme criticizing your country's leader? I
              | want to believe that Apple will, AT LEAST, stop at child
              | exploitation, but Apple has already destroyed the faith I
              | held in them, only yesterday, in their fight for privacy
              | as a right.
             | 
              | This isn't an issue you can hold a middle-ground position
              | on. Encryption doesn't kinda-sorta work in a half-assed
              | implementation; it doesn't work at all.
        
               | judge2020 wrote:
               | > with the express intent to harm their customers.
               | 
               | This of course gets into 'what even is harm?' since
               | that's a very subjective way of classifying something,
               | especially when you try to do it on behalf of others.
               | 
               | For CSAM you could probably assume that "everyone this
               | code takes action against would consider doing so
               | harmful", but _consequences in general are harmful_ and
               | thus you could make this same argument about anything
               | that tries to prevent crime or catch criminals instead of
               | simply waiting for people to turn themselves in. You harm
               | a burglar when you call for emergency services to
               | apprehend them.
               | 
               | > This isn't an issue you can hold a middleground
               | position on. Encryption doesn't only kinda-sorta work in
               | a half-ass implementation; it doesn't work at all.
               | 
                | This is exactly the trap the U.S. has become entrenched
                | in - thinking that you can't disagree with one thing
                | someone says or does while agreeing with other things
                | they say or do. You can support Apple deciding to combat
                | CSAM. You can oppose Apple trying to do this client-side
                | instead of server-side. You can also support Apple for
                | taking steps towards bringing E2EE to iCloud Photos, and
                | still oppose them bowing to the CCP and giving up
                | Chinese citizens' iCloud data encryption keys.
               | This is a middle ground - and just because you
               | financially support Apple by buying an iPhone or in-app
               | purchases doesn't mean you suddenly agree with everything
               | they do. This isn't a new phenomenon - before the
               | internet, we just didn't have the capacity to know, in an
               | instant, the bad parts of the people or companies we
               | interfaced with.
        
               | 015a wrote:
               | You do harm a burglar when you call for emergency
               | services; but the burglar doesn't pay for your security
               | system. And more accurately: an innocent man pays for his
               | neighbor's security system, which has a "one in a
               | trillion" chance of accusing the innocent man of breaking
               | in, totally randomly and without any evidence. Of course,
               | the chances are slim, and he would never be charged with
               | breaking in if it did happen, but would you still take
               | that deal?
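                | 
                | (Back-of-the-envelope, with illustrative numbers
                | rather than Apple's published ones:)
                | 
                |     public class FalsePositiveMath {
                |         public static void main(String[] args) {
                |             double perPhoto = 1e-12; // "1 in a trillion"
                |             double users = 1e9;      // illustrative
                |             double photos = 1e4;     // illustrative
                |             // Expected innocent flags, fleet-wide:
                |             System.out.println(
                |                 perPhoto * users * photos); // ~10.0
                |         }
                |     }
                | 
                | Even at those odds, at that scale, a handful of
                | innocent people get flagged.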
               | 
                | I've seen the right Americans hold against unreasonable
                | search and seizure quoted a bit during this discussion.
                | Valid, though to be clear, the Constitution doesn't
                | apply to private companies' products. But more
                | interestingly: what about the right against self-
                | incrimination? That's what
               | Apple is pushing here; that by owning an iPhone, you may
               | incriminate yourself, and actually it may end up
               | happening whether you're actually guilty or not.
        
               | judge2020 wrote:
                | Regarding your second paragraph on the legality: Apple
                | doesn't incriminate you even if they send the image off
                | and an image reviewer deems something CSAM. If Apple
                | does file a police report on this evidence or otherwise
                | gives evidence to the police, the police will still
                | have to prove that (A) the images do indeed depict
                | sexually suggestive content involving a minor, and (B)
                | you did not have an affirmative defense under 18 USC
                | 2252A(d) [0], i.e. they would have to prove that you
                | had 3 or more actual illegal images, or didn't take
                | reasonable steps to destroy the images or immediately
                | report them to law enforcement and give such law
                | enforcement access to said photos.
               | 
               | The biggest issue with this is, of course, that Apple's
               | accusation is most certainly going to be enough evidence
               | to get a search warrant, meaning a search and seizure of
                | all of your hard drives that they can find.
               | 
               | 0: https://www.law.cornell.edu/uscode/text/18/2252A#:~:te
               | xt=(d)...
        
               | tekknik wrote:
                | Based on your A and B there, I think we're about to see
                | a new form of swatting. How many people regularly go
                | through all of their photos? Now, if someone pisses
                | someone else off and has physical access to their
                | phone, they just need to add 3 pictures to the device
                | with older timestamps and wait for the inevitable
                | results.
        
               | browningstreet wrote:
                | I believe the account gets disabled...? More of that,
                | no thanks.
        
               | katbyte wrote:
                | There is no E2E encryption of iCloud photos or backups,
                | and they never claimed to have that (except for
                | Keychain) - the FBI stepped in and prevented them from
                | doing so years ago.
        
               | trangus_1985 wrote:
               | > in their fight for privacy as a right
               | 
               | I kept the lineage phone in my back pocket, confident
               | that it would be a good 4-5 years before they shipped
                | something that violated their claims. I figured the
                | alternatives would be stable and widespread by then.
               | 
               | My timing was off.
        
               | simion314 wrote:
               | > So the Photos app will hash images and compare to the
               | list in the database.
               | 
                | I am wondering what hashes are in this database now,
                | and what will be later. Or combine it with a Pegasus
                | exploit: put a few bad images on a journalist's or
                | politician's iPhone, clean up the tracks, and wait for
                | Apple and the FBI to destroy the person.
        
               | dmz73 wrote:
               | I think Apple has always installed software on their
               | users' devices with explicit intent to harm their
                | customers. This instance just makes it a little bit more
                | obvious what the harm is, but not enough to harm Apple's
               | bottom line. Eventually Apple will do something that will
               | be obvious to everyone but by then it will probably be
               | too late for most people to leave the walled garden
               | (prison).
        
               | concordDance wrote:
               | > Sure, this only impacts child molesters.
               | 
               | Um. No?
               | 
                | I would be very surprised if more than 10% of people in
                | possession of sexual images of under-18s molested (pre-
                | pubescent) children.
        
               | intended wrote:
                | There's a database of known child porn. The hashes of
                | these images are compared with the content on people's
                | phones.
        
               | smolder wrote:
               | I don't think GP missed that point, and it seems like you
                | missed theirs. Possessing CSAM and molesting children
                | are two different crimes.
        
             | jliptzin wrote:
             | "easy enough to clear up"
             | 
             | I prefer not having my door busted in at 6 am and my dog
             | shot in the head because of a hash collision
        
               | joering2 wrote:
               | "We are so sorry that we raided your house and blew the
               | whole in your 2 years old baby, but the hash from one of
               | the pictures from your 38,000 photos library, matched our
               | CP entry. Upon further inspection, an honest mistake of
               | an operator was discovered, where our operator instead of
               | uploading real CP, mistakenly uploaded the picture of a
               | clear blue sky".
               | 
               | https://www.aclu.org/gallery/swat-team-blew-hole-2-year-
               | old-...
               | 
                | PS. On a personal note, Apple is done for me. Stick a
                | fork in it. I was ready to upgrade after September,
                | especially since I heard Touch ID is coming back and I
                | love my iPhone 8. But sure as hell this sad news means
                | the iPhone 8 is my last Apple device.
        
           | walterbell wrote:
           | Photosync can automatically move photos from iDevices into
           | consolidated NAS, SFTP, cloud or iXpand USB storage,
           | https://photosync-app.com
           | 
           | GoodReader has optional app-level file encryption with a
           | password that is not stored in the iOS keychain. In theory,
           | those encrypted files should be opaque to device backups or
           | local filesystem scanning, unless iOS or malware harvests the
           | key from runtime memory, https://goodreader.com/
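            | 
            | (The general shape of that kind of app-level encryption -
            | a sketch assuming PBKDF2 plus AES-GCM, not GoodReader's
            | actual scheme:)
            | 
            |     import java.security.SecureRandom;
            |     import javax.crypto.Cipher;
            |     import javax.crypto.SecretKeyFactory;
            |     import javax.crypto.spec.GCMParameterSpec;
            |     import javax.crypto.spec.PBEKeySpec;
            |     import javax.crypto.spec.SecretKeySpec;
            | 
            |     public class AppLevelCrypto {
            |         // The key comes from the password alone -
            |         // nothing is put in the system keychain.
            |         static byte[] encrypt(char[] pw, byte[] plain)
            |                 throws Exception {
            |             byte[] salt = new byte[16];
            |             byte[] iv = new byte[12];
            |             SecureRandom rng = new SecureRandom();
            |             rng.nextBytes(salt);
            |             rng.nextBytes(iv);
            |             byte[] key = SecretKeyFactory
            |                 .getInstance("PBKDF2WithHmacSHA256")
            |                 .generateSecret(new PBEKeySpec(
            |                     pw, salt, 200_000, 256))
            |                 .getEncoded();
            |             Cipher c = Cipher.getInstance(
            |                 "AES/GCM/NoPadding");
            |             c.init(Cipher.ENCRYPT_MODE,
            |                    new SecretKeySpec(key, "AES"),
            |                    new GCMParameterSpec(128, iv));
            |             // Store salt and iv with the ciphertext;
            |             // scanners see only opaque bytes.
            |             return c.doFinal(plain);
            |         }
            |     }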
        
         | playguardin wrote:
         | You and your morality can suck it.
        
         | TheRealDunkirk wrote:
         | > I hope they recant
         | 
         | This is very much like driving a car through a crowd of
         | protestors. They will slowly, inexorably, eventually push
         | through.
        
         | GeekyBear wrote:
         | > with icloud photos csam, it is also a horrifying precedent
         | 
         | That precedent was set many years ago.
         | 
         | >a man [was] arrested on child pornography charges, after
         | Google tipped off authorities about illegal images found in the
         | Houston suspect's Gmail account.
         | 
         | Microsoft's "PhotoDNA" technology is all about making it so
         | that these specific types of illegal images can be
         | automatically identified by computer programs, not people.
         | 
          | PhotoDNA converts an image into a common black-and-white
          | format and resizes the image to a uniform size, Microsoft
          | explained last year while announcing its increased efforts at
          | collaborating with Google to combat online child abuse.
         | 
         | https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
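          | 
          | (The same idea in miniature - a toy "average hash" in the
          | spirit of that description, far simpler than Microsoft's
          | actual PhotoDNA:)
          | 
          |     import java.awt.Graphics2D;
          |     import java.awt.image.BufferedImage;
          | 
          |     public class AverageHash {
          |         static long hash(BufferedImage src) {
          |             // Normalize: grayscale, fixed 8x8 size.
          |             BufferedImage s = new BufferedImage(
          |                 8, 8, BufferedImage.TYPE_BYTE_GRAY);
          |             Graphics2D g = s.createGraphics();
          |             g.drawImage(src, 0, 0, 8, 8, null);
          |             g.dispose();
          |             int[] px = new int[64];
          |             long sum = 0;
          |             for (int i = 0; i < 64; i++) {
          |                 px[i] = s.getRaster()
          |                          .getSample(i % 8, i / 8, 0);
          |                 sum += px[i];
          |             }
          |             long avg = sum / 64, bits = 0;
          |             // One bit per pixel: above/below average,
          |             // so near-duplicates get near-equal hashes.
          |             for (int i = 0; i < 64; i++)
          |                 if (px[i] > avg) bits |= 1L << i;
          |             return bits;
          |         }
          |     }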
        
           | trangus_1985 wrote:
            | Cloud versus local device is a massive distinction, IMO. Or
            | maybe I'm a dinosaur ;)
        
             | d110af5ccf wrote:
             | No, you're not a dinosaur. It is _entirely_ reasonable for
             | a hosting provider not to want certain content on their
             | servers. And it is also quite reasonable to want to
             | automate the process of scanning for it.
             | 
             | My physical device, on the other hand, is (supposed to be)
             | mine and mine alone.
        
         | taurath wrote:
          | If my parents had had a feature alerting them about porn on
          | their kid's device while I was a teen, they would have sent
          | me to a conversion camp, and that is not an exaggeration.
         | 
         | Apple thinks the appropriate time for queer kids to find
         | themselves is after they turn 18.
        
           | mrtranscendence wrote:
           | If you're just downloading and looking at porn, no problem.
           | It only becomes an issue if you're sharing porn via Messages
           | or storing it in iCloud. And to be fair, I don't think
           | they're alerted to the nature of the pornography, so you
           | might be able to avoid being outed even if you're sharing
           | porn (or having porn shared with you).
           | 
            | Edit: I'm wrong in one respect: if a kid under 13 chooses
           | to send a message with an explicit image despite being warned
           | via notification, the image will be saved to a parental
           | controls section. This won't happen for children >= 13.
        
           | neop1x wrote:
            | Maybe Apple will decrease child abuse cases but increase
            | cases of child suicide...
        
         | 2OEH8eoCRo0 wrote:
         | >While I still use google maps
         | 
          | You can still use Google Maps without an account and
          | "incognito". I wish they'd allow app store usage without an
          | account though - similar to how any Linux package manager
          | works.
        
           | trangus_1985 wrote:
           | That's not really the issue. The issue is that for google
           | maps to work properly, it requires that the Play services are
           | installed. Play services are a massive semi-monolithic blob
           | that requires tight integration with Google's backend, and
           | deep, system-level permissions to operate correctly.
           | 
           | I'm not worried about my search history.
        
             | boring_twenties wrote:
             | Last I checked (about a year ago), the Google Maps app did
             | work with microG (a FOSS reimplementation of Google Play
             | Services).
        
               | trangus_1985 wrote:
                | I use maps on my phone on a regular basis - I would
                | vastly prefer something less featured but stable over
                | hacking the crap out of my phone. But that's good to
                | know.
        
               | d110af5ccf wrote:
               | Have you tried either of these?
               | 
               | https://f-droid.org/en/packages/net.osmand.plus/
               | 
               | https://f-droid.org/en/packages/app.organicmaps/
        
               | andrepd wrote:
                | OsmAnd is an absolutely brilliant map app for Android.
               | Very fully featured but also pleasant to use (though I
               | dislike some of the defaults).
        
             | commoner wrote:
             | I've just tested Google Maps on an Android device without
             | Google Play Services or microG. The app works fine,
             | although every time you cold-start it, the app shows an
             | alert (which can be dismissed but not disabled) and a
             | notification (which can be specifically disabled) that
             | Google Play Services is unavailable. On an Android device
             | with microG, Google Maps works without showing the alert or
             | the notification about Google Play Services.
        
             | brundolf wrote:
             | One workaround is to use the mobile web app, which is
             | surprisingly pretty decent for a web app. And because it's
             | a web app, you can even disable things like sharing your
             | location if you want to
        
             | 2OEH8eoCRo0 wrote:
             | Ahhh, gotcha. Did not realize that. Makes sense.
        
             | yosito wrote:
             | There is a Google Maps website that works on mobile. No
             | need for the app.
        
             | techrat wrote:
             | People need to remember that most of Android got moved into
             | Play Services. It was the only way to keep a system
             | relatively up to date when the OEMs won't update the OS
             | itself.
             | 
             | Yeah, it's a dependency... as much as the Google Maps APK
             | needing to run on Android itself.
        
           | opan wrote:
           | In addition to F-Droid, you can get Aurora Store (which is on
           | F-Droid) which lets you use an anonymous login to get at the
           | Play Store. I use it for a couple free software apps that
           | aren't on F-Droid for some reason.
        
             | C19is20 wrote:
             | What are the apps?
        
             | sunshineforever wrote:
              | I also recommend Aurora Store as a complete replacement
              | for the Play Store. The one thing is that I've never
              | tried using apps that I paid for on it, but it works very
              | well for any free apps. There is an option to use a
              | Google account with Aurora, but I've only ever used the
              | anonymous account.
              | 
              | The only slight downside is that I haven't figured out how
              | to auto-update apps, so your apps will get out of date
              | without you being notified and you have to update them
              | manually. This problem might literally be solved by a
              | simple setting that I haven't bothered to look for, IDK.
              | 
              | On the plus side, it includes all the official Play Store
              | apps, alongside some that aren't allowed on the Play
              | Store.
              | 
              | For example, NewPipe, the superior replacement YouTube
              | app that isn't allowed on the Play Store due to it
              | subverting advertisements and allowing a few features
              | that are useful for downloading certain things.
        
         | _arvin wrote:
         | I'm really loving fastmail. Thanks for the heads up!
        
         | bambax wrote:
         | > _probably good in the moral sense_
         | 
         | How, how is it even morally good?? Will they start taking
         | pictures of your house to see if you store drugs under your
         | couch? Or cook meth in your kitchen??
         | 
         | What is moral is for society to be in charge of laws and law
         | enforcement. This vigilante behavior by private companies who
         | answer to no one is unjust, tyrannical and just plain crazy.
        
           | tekknik wrote:
           | > Will they start taking pictures of your house to see if you
           | store drugs under your couch? Or cook meth in your kitchen??
           | 
           | How many people have homepods? When will they start listening
           | for illegal activity?
        
             | pjerem wrote:
              | No worries, it's totally local voice recognition! We'll
              | only send samples when you speak about herbs.
        
         | _red wrote:
         | Yes, my history was Linux 95-04, Mac 04-15, and now back to
         | Linux from 2015 onwards.
         | 
          | It's been clear Tim Cook was going to slowly harm the brand. He
         | was a wonderful COO under a visionary CEO-type, but he holds no
         | particular "Tech Originalist" vision. He's happy to be part of
         | the BigTech aristocracy, and probably feels really at home in
         | the powers it affords him.
         | 
          | Anyone who believes this is "just about the children" is
          | naive. His Chinese partners will use this to crack down on
          | "Winnie the Pooh" cartoons and the like... before long,
          | questioning any Big Pharma product will result in being
          | flagged. Give it 5 years at most.
        
           | ursugardaddy wrote:
            | You make that sound like a bad thing. I'd love to live in a
            | world without child abuse spreading rampant on the
            | internet, and without having to suffer through what passes
            | for political speech (memes) these days.
            | 
            | Maybe once we detect and stop stuff like this from happening
            | before it gets very bad, we can grow as a society and adjust
            | our forms of punishment accordingly too.
        
             | adamrt wrote:
             | Is this a bot comment? Account is two hours old.
             | 
              | You want to not suffer through political memes? And you
              | jump from that to scanning private messages for dissent
              | by authoritarian governments being okay!?
             | 
             | What?!
        
               | ursugardaddy wrote:
                | No, I'm being serious. Technology like this could be
                | very beneficial.
               | 
                | There's a good chance that if we continue to improve
                | surveillance, law enforcement agencies and justice
                | departments could begin to focus on rehabilitation and
                | growing a kinder world.
                | 
                | Right now they are like firemen trying to put out
                | fires after the building has been ruined.
                | 
                | If it doesn't work, we're doomed anyway, so what's the
                | problem?
        
               | DiggyJohnson wrote:
               | Can you speak at all to the negative aspects of this
               | position?
        
               | empressplay wrote:
                | The problem is that what you're describing is literal
                | fascism?
        
               | ursugardaddy wrote:
                | That's kind of a stretch, but I hope not.
                | 
                | Whether the laws go good or bad is going to be up to
                | everyone.
        
               | browningstreet wrote:
               | You lost me with the last sentence.
        
             | Lammy wrote:
              | _Helen Lovejoy voice_ Won't somebody _please_ think of
              | the children!?
        
             | runjake wrote:
             | Once you give them the power, they'll never willingly hand
             | it back.
        
             | withinboredom wrote:
              | I don't think anyone is arguing that making it harder to
              | abuse children is a bad thing. It's what is required to
              | do so that is the bad thing. It'd be like someone
              | installing microphones all over every house to report on
              | when you admit to bullying. No one wants bullying, but I
              | doubt you want a microphone recording everything and
              | listening for certain trigger words. Unless you have an
              | Alexa or something - then I guess you probably wouldn't
              | mind that example.
        
               | jcrites wrote:
               | Alexa and iPhones with Siri enabled, and Android phones,
               | are all continuously listening with their microphones for
               | their wake word, unless you've specifically turned the
               | feature off.
               | 
               | The difference is that the Alexa connects to your wifi,
               | so if you wanted to, you could trivially tell if it's
               | communicating when it shouldn't be. When I worked at
               | Amazon, I was given the impression that the system that
               | handles detecting the wake word was implemented in
               | hardware, and the software system that does the real
               | speech recognition doesn't "wake up" or gain access to
               | the audio channel unless the wake word is detected by
               | that hardware system -- and it's very obvious when you've
               | woken it up (the colored ring lights up, it speaks, etc.)
               | 
               | Echo devices also sit in one room. If you're like most
                | people you take your phone _everywhere_, which means
               | that if it's spying on you, it could literally have a
               | transcript of every word you spoke the entire day, as
               | well as any people you've been around. To make matters
               | worse, it would be difficult to tell if that was
               | happening. Unless you're an uber-hacker who knows how to
               | root an iPhone, or a radio geek who knows enough to
               | monitor their device's cellular transmissions, good luck
               | figuring out whether Siri is listening to and passing on
               | audio that it shouldn't. The problem is that phones have
               | so many apps and responsibilities -- given that they are
               | essentially full computers -- these days that nonstop
               | data transfer on a wifi network from my phone wouldn't be
               | alarming: it might be backing up pictures to a cloud, or
               | syncing the latest version of apps, etc.
               | 
               | I think the dedicated devices like Echo/Alexa are what
               | you should buy if you're the most privacy-sensitive,
               | since they have zero reason to be uploading to the
               | Internet unless you're actively talking to them, and they
               | have zero reason to be downloading unless they're
               | receiving a software patch, which should be very rare.
               | And because they're on your wifi (not cell) you can
               | monitor their network traffic very easily.
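                | 
                | As a rough illustration of how easy that monitoring
                | is: a short Python sketch with scapy (assuming a box
                | that can see the device's traffic, e.g. the wifi
                | router, and a made-up MAC address) that logs every
                | time the device talks to the network:
                | 
                |   # sketch: log traffic from one LAN device
                |   # needs scapy and root privileges
                |   from datetime import datetime
                |   from scapy.all import Ether, IP, sniff
                | 
                |   ECHO_MAC = "aa:bb:cc:dd:ee:ff"  # made up
                | 
                |   def log_packet(pkt):
                |       if pkt.haslayer(IP):
                |           print(datetime.now().isoformat(),
                |                 pkt[IP].src, "->", pkt[IP].dst,
                |                 len(pkt), "bytes")
                | 
                |   # match on source MAC, ignore rest of LAN
                |   sniff(lfilter=lambda p: Ether in p
                |         and p[Ether].src == ECHO_MAC,
                |         prn=log_packet, store=False)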
        
               | trangus_1985 wrote:
               | It's not unreasonable to expect the speech recognition
               | models to be run locally.
               | 
               | As to the wake word point, I agree. I don't think
               | alexa/siri/etc are currently bad or disrespecting
               | privacy. I actually have a smart home with a voice
               | assistant.
               | 
               | However, my smart home is all local mesh network (zwave
               | and zigbee) based through a FOSS bridge that doesn't talk
               | to the internet. All lights are through smart switches,
               | not bulbs. The end result is such that if the voice
               | assistant service ever pisses me off, I can simply
               | disconnect from it.
               | 
               | If you read my comments in this article, I think I come
                | off as a tin-foil-hat-wearing lunatic, to some degree at
               | least.
               | 
               | But actually, I'm not a super privacy paranoid person.
               | Telemetry, voice recognition datasets, etc... I think
               | those are a reasonable price to pay for free stuff! I
               | just want to have my thumb on the scale and a go-bag
               | packed for when/if these services become "evil" ;)
        
               | chinchilla2020 wrote:
               | You are correct. The model training is pretty intensive
                | and needs to be run remotely on powerful machines.
               | 
                | The final model is just a set of weights that is
                | typically pretty fast to run on the local machine.
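                | 
                | A toy sketch of that distinction (sizes and weights
                | invented): once training has produced the weights,
                | "running the model" is little more than a couple of
                | matrix products, which any phone CPU does quickly:
                | 
                |   import numpy as np
                | 
                |   # pretend these came from an expensive
                |   # remote training run
                |   W1 = np.random.randn(64, 128)
                |   W2 = np.random.randn(10, 64)
                | 
                |   def predict(features):
                |       # two matrix-vector products and a
                |       # nonlinearity: cheap local inference
                |       hidden = np.tanh(W1 @ features)
                |       return W2 @ hidden
                | 
                |   print(predict(np.random.randn(128)).argmax())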
        
         | biztos wrote:
         | I've been thinking about switching my main email to Fastmail
         | from Apple, for portability in case the anti-power-user trend
         | crosses my personal pain threshold.
         | 
         | But if your worry is governments reading your mail, is an email
         | company any safer? I'm sure FM doesn't _want_ to scan your mail
         | for the NSA or its Australian proxy, but do they have a choice?
         | And if they were compelled, would they not be prevented from
         | telling you?
         | 
         | "We respect your privacy" is exactly what Apple has been
         | saying.
        
           | dustyharddrive wrote:
           | I think self-hosting email has too many downsides (spam
           | filtering, for example) to be worth it; I'm more concerned
           | about losing my messages (easily solved with POP or mbox
           | exports while still using a cloud account) than government
            | data sharing. Email is effectively unencrypted in transit
            | anyway (hop-to-hop TLS is opportunistic at best), and it's
            | "industry standard" to store it in clear text at each end.
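            | 
            | That export really is a small script. A minimal sketch
            | with Python's standard library (host and credentials are
            | placeholders):
            | 
            |   # back up a POP3 mailbox to a local mbox file
            |   import email, mailbox, poplib
            | 
            |   pop = poplib.POP3_SSL("pop.example.com")
            |   pop.user("me@example.com")
            |   pop.pass_("app-password")  # placeholder
            | 
            |   backup = mailbox.mbox("backup.mbox")
            |   for i in range(1, len(pop.list()[1]) + 1):
            |       # retr() returns (response, lines, octets)
            |       lines = pop.retr(i)[1]
            |       msg = email.message_from_bytes(
            |           b"\r\n".join(lines))
            |       backup.add(msg)
            |   backup.flush()
            |   pop.quit()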
        
           | trangus_1985 wrote:
           | > if your worry is governments reading your mail
           | 
           | complicated. As long as they require a reasonable warrant
           | (ha!), I'm fine. Email is an inherently insecure protocol and
           | ecosystem, anyways.
           | 
           | I haven't used email for communication that I consider to be
           | private for a while - I've moved most, if not all, casual
           | conversation to signal, imessage. Soon, I hope to add
           | something like matrix or mattermost into the mix.
           | 
           | My goal was never to be perfect. My goal is to be able to
           | easily remove myself from an invasive spyware ecosystem, and
           | bring my friends along, with minimal impact.
        
           | neop1x wrote:
           | I have been self-hosting email for 7 years successfully. But
           | it required a physical server in a reputable datacenter,
            | setting up Dovecot, Exim, SpamAssassin, reverse-DNS, SPF,
           | DKIM. It took a bit of time to gain IP reputation but then it
           | has worked flawlessly since. Occasionally some legit mail is
           | flagged as spam or vice versa but it is not worse than any
           | other mail provider. So it can be done! But my first attempts
           | to do that on a VPS failed as IP blocks of VPS providers are
           | often hopelessly blacklisted in major email providers.
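            | 
            | For anyone curious what the DNS side of that looks like,
            | here's a sketch that sanity-checks SPF and DKIM records
            | (needs dnspython; the domain and selector are
            | placeholders, and the records in the comments are
            | illustrative):
            | 
            |   import dns.resolver
            | 
            |   DOMAIN = "example.com"
            |   SELECTOR = "mail"  # your DKIM selector
            | 
            |   # SPF is a TXT record on the domain, e.g.
            |   #   "v=spf1 mx ip4:203.0.113.25 -all"
            |   for r in dns.resolver.resolve(DOMAIN, "TXT"):
            |       txt = b"".join(r.strings).decode()
            |       if txt.startswith("v=spf1"):
            |           print("SPF:", txt)
            | 
            |   # DKIM is a TXT record at
            |   # <selector>._domainkey.<domain>, e.g.
            |   #   "v=DKIM1; k=rsa; p=MIIBIjANBg..."
            |   name = SELECTOR + "._domainkey." + DOMAIN
            |   for r in dns.resolver.resolve(name, "TXT"):
            |       print("DKIM:",
            |             b"".join(r.strings).decode()[:40])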
        
           | mackrevinack wrote:
            | there's always Protonmail, which is supposedly E2E
            | encrypted, so they shouldn't be able to scan your mail
        
           | vineyardmike wrote:
            | Unfortunately, self-hosting is the only clear alternative.
            | Not much else seems to be.
        
         | rStar wrote:
         | > it is also a horrifying precedent that the device I put my
         | life into is scanning my photos and reporting on bad behavior
         | 
          | Apple's new customers are the various autocratic regimes that
          | populate the earth. Apple's customers used to be human beings.
          | There exist many profiteers in Mountain View, Cupertino, Menlo
          | Park and Atherton in the service of making our monopolies more
          | capable of subjugating humanity.
        
         | LazyR0B0T wrote:
          | Organic Maps on F-Droid is a really clean OSM-based map.
        
           | JackGreyhat wrote:
           | Nearly the same as MagicEarth...I use it all the time.
        
           | crocodiletears wrote:
           | Does it let you select from multiple routes? I've been using
           | Pocketmaps, but it only gives you a single option for
           | routing, which can lead to issues in certain contexts
        
           | Sunspark wrote:
            | I'm impressed, it actually has smooth scrolling, unlike
            | OsmAnd which is very slow at loading tiles.
           | 
            | Critical points I'd make about Organic Maps: I'd want a lower
            | inertia setting so it scrolls faster, and a different color
            | palette. They are using muddy tones of green and brown.
        
         | alksjdalkj wrote:
         | Have you found any decent google maps alternatives? I'd love to
         | find something but nothing comes close as far as I've found.
          | Directions that take traffic into account are the big thing that
         | I feel like nobody (other than Apple, MS, etc.) will be able to
         | replicate.
         | 
         | Have you tried using the website? I've had some luck with that
         | on postmarketOS, and it means you don't need to install Play
         | services to use it.
        
           | krobbn wrote:
           | I really like Here WeGo, and it allows you to download maps
           | for specific countries too to have available offline.
        
           | beermonster wrote:
            | OsmAnd
        
           | manuelmagic wrote:
            | I've been using HERE Maps for many years: https://wego.here.com/
        
           | nickexyz wrote:
           | Organic maps is pretty good:
           | https://github.com/organicmaps/organicmaps
        
         | new_realist wrote:
         | The argument from reactionary HN neckbeards is basically,
         | "can't you see that this _could_ be used for great evil?"
         | 
         | No shit. That's obvious to just about... everyone on the
         | planet. Many things in this world can be used for great evil:
         | knives, gasoline, guns, TNT, cars--even most household items
         | when used with creativity. It is quite impossible to create
         | something which can't be abused in some form. But society still
         | allows them, because it judges that the good outweighs the bad,
         | and systems exist to manage the risk of evil use.
         | 
         | In this case, I have every expectation that this scanning will
         | be auditable, and society will eventually work out most of the
         | imperfections in systems like these, and strike the right
         | balance to make the world a better place.
        
           | [deleted]
        
         | qwerty456127 wrote:
         | > While I still use google maps, I've been trialing out OSM
         | alternatives for a minute.
         | 
         | Is there a way to set up Android to handle shared locations
         | without Google Maps?
         | 
         | Every time someone shares location with me (in Telegram) it
         | displays as a tiny picture and once I click it it says I have
         | to install Google Maps (I use an alternative for actual maps
         | and don't have Google Maps installed). So I end up zooming the
         | picture and then finding the location on the map manually.
        
         | paulcarroty wrote:
         | > Fortunately, my email is on a paid provider
         | 
          | Paid doesn't mean more secure; that's a popular mistake.
        
         | samstave wrote:
          | What I am reminded of is all of the now seemingly prophetic
          | writing and storytelling in a lot of cyberpunk-dystopian
          | anime about the future of the corporate state, and how
          | megacorps rule EVERYTHING.
         | 
         | What I always thought was interesting was that the Police
         | Security Services in Singapore were called "CISCO" -- and you
         | used to see these swat-APV-type vans driving around and armed
          | men with CISCO emblazoned on their gear/equip/vehicles...
          | 
          | It always reminded me of cyberpunk anime.
        
           | m4rtink wrote:
            | Interesting! But actually this is not the only thing with
            | an "interesting" name in Singapore - well, at least as long
            | as you speak Czech. ;-)
           | 
            | You see, mass transit in Singapore is handled by the
            | Singapore Mass Rapid Transit company, abbreviated SMRT.
            | There is also a SMRT Corporation
            | (https://en.wikipedia.org/wiki/SMRT_Corporation) and SMRT
            | buses; the SMRT abbreviation is heavily used on trains, at
            | stations, basically everywhere.
           | 
            | Well, in Czech "smrt" literally means death. So let's say
           | for Czech speakers riding the public transport in Singapore
           | can be a bit unnerving - you stand at a station platform and
           | then a train with "DEATH" written on it in big letters pulls
           | into the station. ;-)
        
             | samstave wrote:
             | Wow. Thanks for that.
             | 
             | Imagine if you were a Czech child who was old enough to
              | read but not old enough to realize that a spelling can
              | mean something different in another language... that
              | would be odd.
        
         | SOMA_BOFH wrote:
          | How does Apple protect against hash collisions?
        
         | threatofrain wrote:
         | What is your home NAS setup like?
        
           | trangus_1985 wrote:
            | FreeNAS, self-signed tightly-scoped CA installed on all of my
           | devices. 1TBx4 in a small case shoved under the stairs.
           | 
           | tbh, i would vastly prefer to use a cloud based service with
           | local encryption - I'm not super paranoid, just overly
           | principled
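            | 
            | "Tightly-scoped" can be done with an X.509 name
            | constraint, so the CA is only ever trusted for the NAS's
            | own name even if the key leaks. A minimal sketch with
            | the Python cryptography package (names and lifetime are
            | made up):
            | 
            |   import datetime
            |   from cryptography import x509
            |   from cryptography.hazmat.primitives import hashes
            |   from cryptography.hazmat.primitives.asymmetric \
            |       import ec
            |   from cryptography.x509.oid import NameOID
            | 
            |   key = ec.generate_private_key(ec.SECP256R1())
            |   name = x509.Name([x509.NameAttribute(
            |       NameOID.COMMON_NAME, "Home NAS CA")])
            |   now = datetime.datetime.utcnow()
            |   cert = (
            |       x509.CertificateBuilder()
            |       .subject_name(name)
            |       .issuer_name(name)  # self-signed
            |       .public_key(key.public_key())
            |       .serial_number(x509.random_serial_number())
            |       .not_valid_before(now)
            |       .not_valid_after(
            |           now + datetime.timedelta(days=3650))
            |       .add_extension(x509.BasicConstraints(
            |           ca=True, path_length=0), critical=True)
            |       # the scoping: valid for this name only
            |       .add_extension(x509.NameConstraints(
            |           permitted_subtrees=[
            |               x509.DNSName("nas.home.example")],
            |           excluded_subtrees=None), critical=True)
            |       .sign(key, hashes.SHA256())
            |   )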
        
             | quest88 wrote:
             | What do you use to sync phone photos to your NAS? I like
             | Google Photos' smartness, but I also want my photos on my
             | Synology NAS.
        
               | antgiant wrote:
               | I personally am a fan of Mylio for that.
               | https://mylio.com/
        
               | neolog wrote:
               | Syncthing
        
             | voltaireodactyl wrote:
             | If you haven't already heard of it, cryptomator might be
             | just what you're after.
        
             | _arvin wrote:
             | Take a look at https://internxt.com. Been using them for a
             | couple weeks and am incredibly impressed. Great team, great
             | product, just great everything. It was exactly what I was
             | looking for
        
         | lcfcjs wrote:
         | Found the paedo.
        
         | forgingahead wrote:
          | This can happen only because whenever any slippery-slope
          | action has been taken previously, there has been an army of
          | apologists and "explainers" who rush to "correct" your
          | instinctive aversion to these changes. It's always the same -
          | the initial comment is
         | seemingly kind, yet with an underlying menace, and if you
         | continue to express opposition, they change tack to being
         | extremely aggressive and rude.
         | 
         | See the comment threads around this topic, and look back to
         | other related events (notably the tech giants censoring people
         | "for the betterment of society" in the past 12 months).
         | 
         | Boiling a frog may happen slowly, but the water continues to
         | heat up even if we pretend it doesn't. Very disappointed with
         | this action by Apple.
        
           | raxxorrax wrote:
            | This is typical obedient behavior. Some abused spouses go
            | to great lengths to come up with excuses for their partners.
           | Since I don't own an iOS device, I don't really care about
           | this specific instance.
           | 
           | But I don't want these people normalizing deep surveillance
           | and fear that I have to get rid of my OSX devices when this
           | trend continues.
        
         | vineyardmike wrote:
         | > as a queer kid, I was terrified of my parents finding out
         | 
         | I think many queer people have a completely different idea of
         | the concept of "why do you want to hide if you're not doing
         | anything wrong" and the desire to stay private. Especially
         | since anything sexual and related to queerness is way more
         | aggressively policed than hetero-normative counterparts.
         | 
         | Anything "think of children" always has a second order affect
         | of damaging queer people because lots of people still think of
         | queerness as dangerous to children.
         | 
         | It is beyond likely that lots of this monitoring will catch
         | legal/safe queer content - especially the parental-controls
         | focused monitoring (as opposed to the gov'ment db of illegal
         | content)
        
           | heavyset_go wrote:
           | > _Anything "think of children" always has a second order
           | affect of damaging queer people because lots of people still
           | think of queerness as dangerous to children._
           | 
           | For example, YouTube does this with some LGBT content.
            | YouTube has demonetized LGBT content and placed it in
           | restricted mode, which screens for "potentially mature"
           | content[1][2].
           | 
           | YouTube also shadowbans the content[1], preventing it from
           | showing up in search results at all.
           | 
           | From here[1]:
           | 
           | > _Filmmaker Sal Bardo started noticing something strange:
           | the views for his short film Sam, which tells the story of a
           | transgender child, had started dipping. Confused, he looked
           | at the other videos on his channel. All but one of them had
           | been placed in restricted mode -- an optional mode that
           | screens "potentially mature" content -- without YouTube
           | informing him. In July of that year, most of them were also
           | demonetized. One of the videos that had been restricted was a
           | trailer for one of his short films; another was an It Gets
           | Better video aimed at LGBTQ youth. Sam had been shadow-
           | banned, meaning that users couldn't search for it on YouTube.
           | None of the videos were sexually explicit or profane._
           | 
           | There are more examples like that here[2].
           | 
           | [1] https://www.rollingstone.com/culture/culture-
           | features/lgbtq-...
           | 
           | [2] https://www.washingtonpost.com/technology/2019/08/14/yout
           | ube...
        
             | vineyardmike wrote:
             | And it's not just YouTube. Most platforms are at least
             | partially guilty of this.
             | 
              | Then there is Tumblr, which is all but dead - explicitly
              | for a "think of the children" concern from Apple.
        
               | zimpenfish wrote:
               | > Then there is tumblr, which is all but dead -
               | explicitly for a "think of the children" concern from
               | apple.
               | 
               | Tumblr had their porn ban decided 6 months before the
               | snafu with the app and CSAM. It's nothing to do with
               | Apple.
        
           | userbinator wrote:
           | _Especially since anything sexual and related to queerness is
           | way more aggressively policed than hetero-normative
           | counterparts._
           | 
           | I find it intensely ironic that Apple's CEO is openly gay.
        
             | vineyardmike wrote:
             | Irony is rarely good in the news
        
             | fennecfoxy wrote:
             | I get the impression that he's at least somewhat asexual.
        
         | cle wrote:
         | Unfortunately with SafetyNet, I feel like an investment into
         | Android is also a losing proposition...I can only anticipate
         | being slowly cut off from the Android app ecosystem as more
         | apps onboard with attestation.
         | 
         | We've collectively handed control of our personal computing
         | devices over to Apple and Google. I fear the long-term
         | consequences of that will not be positive...
        
           | trangus_1985 wrote:
            | I don't think it's implausible that I'll end up carrying a
            | phone that has mail, contacts, calendars, photos, and
            | private chat on it, and then a second, older phone that has
            | things like Instagram and mobile games. It's tragic.
        
             | sodality2 wrote:
             | Unfortunately a big bulk of the data they profit off of is
             | simply the ads and on-platform communication and behavior.
             | Doesn't really matter if you use a different device if you
             | still use the platform. Sure, it's slightly better, but it
             | really isn't a silver bullet if you're still using it. And
             | this is coming from someone who does this already.
        
               | trangus_1985 wrote:
               | I don't really mind if they make a profit off of the free
               | things I use.
               | 
               | What I mind is when my personal life, the stuff that
               | _actually_ matters, is being monitored or has a backdoor
               | that allows ANY third party easy access to monitor it.
        
           | heavyset_go wrote:
            | > _We've collectively handed control of our personal
            | computing devices over to Apple and Google_
           | 
           | Hey now, the operating system and app distribution cartels
           | include Microsoft, too.
        
             | MiddleEndian wrote:
             | Windows, for all the shit they do to antagonize users, does
             | let you choose what programs you install on your PC without
             | forcing you to use an app store.
        
               | qball wrote:
               | And yet, they've been guilty of uninstalling programs
               | they don't like before
               | (https://www.zdnet.com/article/microsoft-windows-10-can-
               | now-a...).
               | 
               | The only real way to avoid this is to sandbox all Windows
               | and macOS systems and only run them from Linux hosts, but
               | you're still taking a performance hit when you do this
               | and sometimes usability just isn't up to par with the
               | other two.
        
               | heavyset_go wrote:
                | Correct me if I'm wrong, but I believe Microsoft enforces
               | certificate checks, meaning you need to buy certificates
               | regularly and be in good standing with the company so the
               | apps you signed with the certificates will run on Windows
               | without issues. I believe Defender can act similarly to
               | Apple's Gatekeeper.
        
               | vladvasiliu wrote:
               | I think on Windows it's not only based on a missing
               | signature. I sometimes get the "this file may damage your
               | computer" message. There's also an "ignore" button hidden
               | below a "more" button, but it in the end it lets you use
               | it. But it doesn't always happen. [0]
               | 
               | It's not very user friendly, but it might be a bit more
               | intuitive than apple's special dance of click right ->
               | open to bypass said controls.
               | 
               | ---
               | 
                | [0] For example, the Prometheus exporter for Windows x64
                | is not signed and doesn't trigger the alert. I can
                | download it (no alert), click open (no alert), and it
                | runs. The x32 version does have a "this may damage your
                | computer" alert in the browser (Edge).
               | 
               | https://github.com/prometheus-
               | community/windows_exporter/rel...
        
               | josephcsible wrote:
               | Well, for the most part at least. Remember Windows RT and
               | Windows 10 S?
        
               | rutthenut wrote:
               | Yep, got a tablet here that is now a brick as Windows RT
               | is no use
        
               | somebody_amzn wrote:
               | Oddly enough, Windows RT still gets security updates
               | until 2023.
               | 
               | But with it stuck on IE11...
        
             | cle wrote:
             | Totally agree, I was thinking more within the context of
             | mobile phones.
        
               | Apocryphon wrote:
               | To be fair, even if Blackberry was still a viable
               | platform, RIM is no better and in some ways even worse.
               | 
               | https://news.ycombinator.com/item?id=1649963
        
               | heavyset_go wrote:
               | WebOS and Maemo both were much more open, the latter even
               | had apt.
        
             | lenkite wrote:
              | What? I can install anything I like on Windows. I cannot
             | on my iPhone.
        
           | techrat wrote:
            | Losing sight of the forest for this one tree.
           | 
           | 1) Google doesn't release devices without unlockable
           | bootloaders. They have always been transparent in allowing
           | people to unlock their Nexus and Pixels. Nexus was for
           | developers, Pixels are geared towards the end user. Nothing
           | changed with regards to the bootloaders.
           | 
           | 2) Google uses Coreboot for their ChromeOS devices. Again,
           | you couldn't get more open than that if you wanted to buy a
           | Chromebook and install something else on it.
           | 
           | 3) To this day, app sideloading on Android remains an option.
           | They've even made it easier for third party app stores to
           | automatically update apps with 12.
           | 
           | 4) AOSP. Sure, it doesn't have all the bells and whistles as
           | the latest and greatest packaged up skin and OS release, but
           | all of the features that matter within Android, especially if
           | you're going to de-Google yourself, are still there.
           | 
            | Consider any one of those points, let alone all four, and
            | I have trouble understanding why people think REEEEEEEE
            | Google.
           | 
           | So you can't play with one ball in the garden (SafetyNet),
           | you've still got the rest of the toys. That's a compromise
           | I'm willing to accept in order to be able to do what I want
           | to and how I want to do it. (Eg, Rooting or third party
           | roms.)
           | 
           | If you don't like what they do on their mobile OS, there's
           | _nothing_ that Google is doing to lock you into a Walled
           | Garden to where the only option you have is to completely
           | give up what you 're used to...
           | 
           | ...Unlike Apple. Not one iOS device has been granted an
           | unlockable bootloader. Ever.
        
             | josephcsible wrote:
             | > Google doesn't release devices without unlockable
             | bootloaders. They have always been transparent in allowing
             | people to unlock their Nexus and Pixels.
             | 
             | True but misleading. If you unlock your bootloader, you can
             | no longer use a lot of apps, including Snapchat, Netflix,
             | Pokemon Go, Super Mario Run, Android Pay, and most banking
             | apps. And before you say this isn't Google's fault, know
             | that they provide the SafetyNet API, which has no
             | legitimate, ethical use cases, and is what allows all of
             | the aforementioned apps to detect whether the device has
             | been modified, even if the owner doesn't want that.
        
               | kuratkull wrote:
               | I have an unlocked Xiaomi loaded with Lineage OS and
               | Magisk, all the apps work - banking, Netflix, you name
               | it.
        
               | d110af5ccf wrote:
               | That is likely to change in the near future. Hardware
               | attestation of bootloader state is increasingly
               | available. This is currently bypassed by pretending to be
               | an older device that doesn't possess that capability. As
               | long as device bootloaders continue to differentiate
               | between stock and custom OS signing keys it won't be
               | possible to bypass SafetyNet.
        
               | commoner wrote:
               | > most banking apps
               | 
               | This really depends on the apps. I have used over 10
               | banking apps on an Android phone with an unlocked
               | bootloader without ever encountering any issues. On a
               | device rooted using Magisk, the MagiskHide masking
               | feature successfully bypasses the apps' root checks in my
               | experience.
        
               | josephcsible wrote:
               | > On a device rooted using Magisk, the MagiskHide masking
               | feature successfully bypasses the apps' root checks in my
               | experience.
               | 
               | Sure, the protection currently isn't bulletproof. But
               | wait until it becomes mandatory for TrustZone to
               | participate in the attestation.
        
               | commoner wrote:
               | You're right that more advanced forms of hardware
               | attestation would defeat the masking if Google eventually
               | implements them.
               | 
               | I'm hoping that Microsoft's support for Android apps and
               | integration with Amazon Appstore in Windows 11 will hedge
               | against Google's SafetyNet enforcement by making an
               | alternative Android ecosystem (with fewer Google
               | dependencies) more viable. Apps that require SafetyNet
               | would most likely not work on Windows 11.
        
               | tekknik wrote:
               | > I have used over 10 banking apps on an Android phone
               | with an unlocked bootloader without ever encountering any
               | issues.
               | 
               | You have 10 accounts at different banks? I thought I was
               | bad with 4
        
               | commoner wrote:
               | Well, this also includes credit cards. In some countries,
               | unused and barely used credit lines improve one's credit
               | score.
        
               | tekknik wrote:
               | Ah yea, I forgot about unused credit lines. I used to
               | close those until I checked my credit one day after
                | closing one of the older ones.
        
               | tjbiddle wrote:
                | Obviously anecdotal, but literally none of those examples
                | are apps I care to use on my phone anyway. Over time, my
                | phone has just become a glorified camera with some
                | messaging features.
        
               | [deleted]
        
               | BenjiWiebe wrote:
               | I've used banking apps and Google pay on my rooted
               | unlocked phone for several years now. True, I'm still on
               | Android 9, so perhaps it will be worse when I upgrade.
               | 
               | Using Magisk and Magisk Hide. Though oddly enough, none
               | of my banking/credit card apps make an issue of being
               | rooted, so they're not even in the Magisk Hide list.
        
             | shbooms wrote:
             | "1) Google doesn't release devices without unlockable
             | bootloaders. They have always been transparent in allowing
             | people to unlock their Nexus and Pixels. Nexus was for
             | developers, Pixels are geared towards the end user. Nothing
             | changed with regards to the bootloaders."
             | 
             | This is not accurate. Pixels that come from Verizon have
             | bootloaders that cannot be fully unlocked.
        
               | dheera wrote:
               | That's because Verizon doesn't want you using a
               | discounted phone with another carrier. If they let you
               | unlock your phone, you could flash a stock radio and
               | ditch Verizon for Google Fi or AT&T. Different issue at
               | play.
               | 
               | As long as you buy a Pixel directly from Google or one of
               | a few authorized resellers, it is unlockable. (I
                | recommend B&H, they help you legally avoid the sales
               | tax.) You can also use a Pixel you buy from Google with
               | Verizon.
        
               | [deleted]
        
               | techrat wrote:
               | Not to nitpick here, but there is no way any device you
               | buy from Verizon is discounted, regardless of what they
               | advertise. Everyone pays _full_ price for any device they
               | get on contract or payment plan.
               | 
               | Back when contract pricing was a more regular thing, I
               | ended up doing the math on the plan rate after I
               | requested for the device contract subsidy to be removed
               | as I didn't want to upgrade the device. I had a Droid DNA
               | at the time.
               | 
               | The monthly rate dropped by $25 just to keep the same
               | device. (Nevermind that I had to ASK for them to not
               | continue to charge me the extra $25/mo after 2 years)
               | 
               | $25 a month for 24 months is $600.
               | 
               | The device on contract was $199.
               | 
               | Full retail price if you didn't opt in for a 2 year
               | contract when getting it? $699.
               | 
               | So I ended up paying an extra $100 for the device than if
               | I had just bought it outright.
               | 
               | Even if the offerings/terms are different now... Verizon,
               | regardless of how they market anything, absolutely makes
               | you pay full price (and then some) for the device you get
               | at 'discount.'
               | 
               | It's funny now that we're seeing people being able to
               | BYOD to Verizon these days and AT&T is the one engaging
               | in aggressive whitelisting.
        
               | gowld wrote:
               | Even if it's not discounted, it's on a deferred payment
               | plan, so VZ doesn't want to let you steal the phone.
               | 
               | And 10%/yr simple interest is a reasonable price for a
               | retail loan.
        
               | tommymachine wrote:
               | $100 on $699 is a little more than 10%...
        
               | d110af5ccf wrote:
               | Other carriers will provide a bootloader unlock code to
               | you on request once the device is paid off. As far as I
               | know, Verizon refuses to do so under any circumstances
               | for any device.
        
               | h4waii wrote:
                | Carriers will NOT provide a bootloader unlock; they have
                | no access to that. Only the OEM does.
               | 
               | Carriers might provide a network unlock -- these are 2
               | VERY different things.
        
               | techrat wrote:
               | > Pixels that come from Verizon
               | 
               | > Verizon
               | 
               | again.
               | 
               | > Verizon
               | 
               | Google didn't sell you that device. Verizon did.
        
             | heavyset_go wrote:
             | SafetyNet also exists to prevent people from running
             | Android apps on platforms other than Android. You can't use
             | SafetyNet-enabled apps on Anbox, which is what SailfishOS
             | uses as their Android compatibility layer, nor on
             | emulators.
             | 
             | If you wanted to do a WSL but for Android, SafetyNet
             | guarantees many apps won't work.
             | 
             | It also puts alternative Linux-based mobile operating
             | systems, like SailfishOS or postmarketOS, at a disadvantage
             | because they won't be able to run certain Android apps for
             | no real technical reason other than the protection of
              | Google's money firehose.
        
             | Zak wrote:
              | SafetyNet is becoming a problem, and the trend shows no
              | signs of slowing down.
             | 
             | I shouldn't have to choose between keeping full control
             | over my device and being able to use it to access the
             | modern world.
        
               | NotPractical wrote:
                | For instance: The _McDonald's_ app uses SafetyNet and
               | won't run on an unlocked device.[1] Google doesn't place
               | any restrictions on which types of apps can use
               | SafetyNet. Banking apps tend to use it, but so do an
               | increasing number of apps that clearly shouldn't need it.
               | 
               | (For the record, I don't think SafetyNet should exist at
               | all, but if Google is pretending it's for the user's
               | security and not just to allow developers to make it
               | harder to reverse engineer their fast food apps, they
               | should at least set some boundaries.)
               | 
               | It's frustrating that Google has fostered an ecosystem
               | where not all "Android apps" work on vanilla Android.
               | 
               | [1]
               | https://twitter.com/topjohnwu/status/1277683005843111936
        
               | Zak wrote:
               | I think a system to verify the integrity of the operating
               | system and make the user aware of any changes is a Good
               | Thing. Of course, the user should be in control of what
               | signing keys are trusted and who else gets access to that
               | information.
               | 
               | Instead, what Google has done is allowed app developers
               | to check that the user isn't doing anything surprising -
               | especially unprofitable things like blocking ads or
               | spoofing tracking data. Since Google profits from ads and
               | tracking, I must assume a significant part of their
               | motivation is to make unprofitable behavior inconvenient
               | enough most people won't do it.
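                | 
                | A toy sketch of that first idea, where the user
                | (not the vendor) decides which signing keys to
                | trust (needs the cryptography package; keys are
                | made up, and this is nothing like SafetyNet):
                | 
                |   from cryptography.exceptions import \
                |       InvalidSignature
                |   from cryptography.hazmat.primitives \
                |       .asymmetric.ed25519 import \
                |       Ed25519PublicKey
                | 
                |   # keys the *user* chose to trust
                |   TRUSTED = [bytes.fromhex("ab" * 32)]
                | 
                |   def image_trusted(image, sig):
                |       for raw in TRUSTED:
                |           k = Ed25519PublicKey \
                |               .from_public_bytes(raw)
                |           try:
                |               k.verify(sig, image)
                |               return True
                |           except InvalidSignature:
                |               continue
                |       return False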
        
             | ryukafalz wrote:
             | > all of the features that matter within Android,
             | especially if you're going to de-Google yourself, are still
             | there
             | 
             | Except, uh, GPS. Even for third party navigation apps.
             | 
             | Yes, I know about microG, but the fact that there had to be
             | a third party reimplementation of what should be a standard
             | system API is still a problem.
        
               | commoner wrote:
               | > Except, uh, GPS. Even for third party navigation apps.
               | 
               | AOSP does support GPS without needing any additional
               | software, but does not have built-in support for Wi-Fi
               | and cell tower triangulation. As you mentioned,
               | UnifiedNlp (bundled with microG) optionally supports
               | triangulation using a variety of location providers
               | (including offline and online choices) for a faster
               | location lock.
        
               | gowld wrote:
               | Yes, it's true you can't get everything you want for free
               | with zero effort. Life is not fair.
        
               | fragileone wrote:
               | Agreed, it's shitty of Google to have moved so much
                | functionality into its proprietary Play Services. The
                | Push Notifications API being in it bothers me even more.
                | Unfortunately, until Linux mobile operating systems catch
                | up in functionality, I'm going to stick with GrapheneOS.
        
       | fortran77 wrote:
       | What's to stop a malicious person from sending a prohibited image
       | to an unsuspecting person, and causing the target to get into
       | legal trouble for which there is no legal defense ("strict
       | liability" for possession).
        
       | xyst wrote:
        | If this project goes live, I would drop Apple in a heartbeat.
        
         | balozi wrote:
         | Does it matter if the project goes live? Once the company's
         | attitude towards user privacy and customer concerns has been
         | revealed, what's there left to hang onto?
        
       | Shank wrote:
       | I really love the EFF, but I also believe the immediate backlash
       | is (relatively) daft. There is a potential for abuse of this
       | system, but consider the following too:
       | 
       | 1. PhotoDNA is already scanning content from Google Photos and a
       | whole host of other service providers.
       | 
       | 2. Apple is obviously under pressure to follow suit, but they
       | developed an on-device system, recruited mathematicians to
       | analyze it, and published the results, as well as one in-house
       | proof and one independent proof showing the cryptographic
       | integrity of the system.
       | 
       | 3. Nobody, and I mean nobody, is going to successfully convince
       | the general public that a tool designed to stop the spread of
       | CSAM is a "bad thing" unless they can show concrete examples of
       | the abuse.
       | 
       | For one and two: given the two options, would you rather that
       | Apple implement serverside scanning, in the clear, or go with the
       | on-device route? If we assume a law was passed to require
       | serverside scanning (which could very well happen), what would
       | that do to privacy?
       | 
       | For three: It's an extremely common trope to say that people do
       | things to "save the children." Well, that's still true. Arguing
       | against a CSAM scanning tool, which is technically more privacy
       | preserving than alternatives from other cloud providers, is an
       | extremely uphill battle. The biggest claim here is that the
       | detection tool _could_ be abused against people. And that very
       | well may be possible! But the whole existence of NCMEC is
       | predicated on stopping the active and real danger of child sex
       | exploitation. We know with certainty this is a problem. Compared
       | to a certainty of child sex abuse, the hypothetical risk from
       | such a system is practically laughable to most people.
       | 
       | So, I think again, the backlash is daft. It's been about two days
        | since the announcement became public (via leaks). The underlying
       | mathematics behind the system has barely been published [0]. It
       | looks like the EFF rushed to make a statement here, and in doing
       | so, it doesn't look like they took the time to analyze the
       | cryptography system, to consider the attacks against it, or to
       | consider possible motivations and outcomes. Maybe they did, and
       | they had advanced access to the material. But it doesn't look
       | like it, and in the court of public opinion, optics are
       | everything.
       | 
       | [0]: https://www.apple.com/child-
       | safety/pdf/Alternative_Security_...
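        | 
        | For readers who haven't seen how perceptual-hash matching
        | works in general, here's a toy "average hash" sketch (this is
        | NOT PhotoDNA or Apple's NeuralHash, both of which are far
        | more sophisticated; needs Pillow). The point is that matching
        | is a Hamming-distance comparison against a list of known
        | hashes, not exact byte equality:
        | 
        |   from PIL import Image
        | 
        |   def ahash(path, size=8):
        |       # shrink, grayscale, threshold against the mean
        |       img = (Image.open(path).convert("L")
        |              .resize((size, size)))
        |       px = list(img.getdata())
        |       mean = sum(px) / len(px)
        |       bits = 0
        |       for p in px:
        |           bits = (bits << 1) | (p > mean)
        |       return bits
        | 
        |   def distance(h1, h2):
        |       return bin(h1 ^ h2).count("1")
        | 
        |   # a "match" is distance() below some threshold
        |   # against a database of known hashes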
        
         | api wrote:
         | (2) is important. Apple put effort into making this at least
         | somewhat privacy-respecting, while the other players just scan
         | everything with no limit at all. They also scan everything for
         | any purpose including marketing, political profiling, etc.
         | 
         | Apple remains the most privacy respecting major vendor. The
         | only way to do better is fully open software and open hardware.
        
         | echelon wrote:
         | > There is a potential for abuse of this system, but consider
         | the following too
         | 
         | > I think again, the backlash is daft.
         | 
          | Don't apologize for this bullshit! Don't let your love of a
          | brand trump the reality of what's going on here.
         | 
         | Machinery is being put in place to detect what files are on
         | your supposedly secure device. Someone has the reins and
         | promises not to use it for anything other than "protecting the
         | children".
         | 
          | How many election cycles or generations will it take for the
          | climate to turn unfavorable, so that this becomes a tool of
          | great asymmetrical power to use against the public?
         | 
         | What happens when the powers that be see that you downloaded
         | labor union materials, documents from Wikileaks, or other files
         | that implicate you as a risk?
         | 
         | Perhaps a content hash on your phone puts you in a flagged
         | bucket where you get pat downs at the airport, increased
         | surveillance, etc.
         | 
         | The only position to take here is a full rebuke of Apple.
         | 
         | edit: Apple apologists are taking a downright scary position
         | now. I suppose the company has taken a full 180 from their 1984
         | ad centerpiece. But that's okay, right, because Apple is a part
         | of your identity and it's beyond reproach?
         | 
         | edit 2: It's nominally iCloud only (a key feature of the
         | device/ecosystem), but that means having to turn off a lot of
         | settings. One foot in the door...
         | 
         | edit 3: Please don't be complicit in allowing this to happen.
         | Don't apologize or rationalize. This is only a first step. We
         | warned that adtech and monitoring and abuse of open source were
         | coming for years, and we were right. We're telling you - loudly
         | - that this will begin a trend of further erosion of privacy
         | and liberty.
        
           | artimaeis wrote:
           | It's not doing any sort of scanning of your photos while
           | they're just sitting on your device. The CSAM scanning only
           | occurs when uploading photos to iCloud, and only to the
           | photos being uploaded.
           | 
           | Source (pdf): https://www.apple.com/child-
           | safety/pdf/CSAM_Detection_Techni...
        
             | lovelyviking wrote:
             | >only occurs when uploading photos to iCloud, and only to
             | the photos being uploaded.
             | 
             | Or so they say? Did you see the source code?
             | 
             | Anyway that's not the issue.
             | 
             | The issue is what they are going to do next without telling
             | you.
             | 
              | The issue is the installation of a _spyware engine_, not
              | its current usage.
        
             | echelon wrote:
             | Don't forget: this was _leaked_. Apple may never have told
             | us. They could have released some of the features without
             | letting on about the scanning.
        
               | artimaeis wrote:
               | Was it? From what I can tell they published this in a
               | general media release today. If it was leaked before
               | today I seem to have missed it.
        
               | echelon wrote:
               | It was leaked on Twitter yesterday and followed up by a
               | major newspaper article with more details. The surprise
               | created a lot of anger.
               | 
               | Today was Apple's first press release about it.
               | 
               | Apple certainly didn't want it to be revealed like this,
               | and they might have kept the pieces not related to child
               | privacy hush hush.
        
             | pseudalopex wrote:
             | > It's not doing any sort of scanning of your photos while
             | they're just sitting on your device.
             | 
             | Yet. The point is the new system makes it feasible.
        
               | artimaeis wrote:
               | It's always feasible that they put a change in that does
               | something users don't want and invades hardware privacy.
               | But this update is not an instance of that. It's simply
               | changing how we interact with their SaaS. We're no
               | further down some slippery slope.
        
         | cblconfederate wrote:
         | What is the point of E2EE vs TLS/SSL based encryption?
        
         | randcraw wrote:
         | You presume Apple and the DoJ will implement this with human
         | beings at each step. They won't. Both parties will automate as
         | much of this clandestine search as possible. With time, the
         | external visibility and oversight of this practice will fade,
         | and with it, any motivation to confirm fair and accurate
         | matches. Welcome to the sloppiness inherent in clandestine law
         | enforcement intel gathering.
         | 
         | As with all politically-motivated initiatives that boldly
         | violate the Constitution (consider the FISA Court, and its
         | rubber stamp approval of 100% of the secret warrants put before
         | it), the use and abuse of this system will go largely
         | underground, like FISA, and its utility will slowly degrade due
         | to lack of oversight. In time, even bad matches will log the
         | IDs of both parties in databases that label them as potential
         | sexual predators.
         | 
         | Believe it. That's how modern computer-based gov't intel works.
         | Like most law enforcement policy recommendation systems,
         | Apple's initial match algorithm will never be assessed for
         | accuracy, nor be accountable for being wrong at least 10% of
         | the time. In time it will be replaced by other third party
         | screening software that will be even more poorly written and
         | overseen. That's just what law enforcement does.
         | 
         | I've personally seen people suffer this kind of gov't abuse and
         | neglect as a result of clueless automated law enforcement
          | initiatives after 9/11. I don't welcome more, nor the gradual
         | and willful tossing of everyone's basic Constitutional rights
         | that Apple's practice portends.
         | 
          | The damages to personal liberty that are inherent in conducting
          | secret searches without cause or oversight are exactly why the
          | Fourth Amendment requires a warrant before conducting a search.
          | NOW is the time to disabuse your sense of 'daftness'; not years
          | from now, after the Fourth and Fifth Amendments become
          | irreversibly passé. Or should I say, 'daft'?
        
         | avnigo wrote:
          | I'd be interested to see how any Apple executives would
          | respond to the concerns in interviews, but I don't expect Apple
         | to issue a press release on the concerns.
        
         | vorpalhex wrote:
         | Who verifies CSAM databases? Is there a way to verify the CSAM
         | hashlist hasn't been tampered with and additional hashes
         | inserted?
         | 
         | Would it be ok to use this approach to stop "terrorism"? Are
         | you ok with both Biden and Trump defining that list?
        
         | feanaro wrote:
         | > that a tool designed to stop the spread of CSAM is a "bad
         | thing"
         | 
         | It's certainly said to be designed to do it, but have you seen
         | concerns raised in the other thread
         | (https://news.ycombinator.com/item?id=28068741)? There have
         | been reports from some commenters of the NCMEC database
         | containing unobjectionable photos because they were merely
         | _found in a context alongside some CSAM_.
         | 
         | Who audits these databases? Where is the oversight to guarantee
         | only appropriate content is included? They are famously opaque
         | because the very viewing of the content is illegal. So how can
         | we know that they contain what they are purported to contain?
         | 
         | This is overreach.
        
           | shuckles wrote:
           | That's a problem with NCMEC, not Apple's proposal today.
           | Furthermore, if it were an actual problem, it would've
           | already manifested with the numerous current users of
           | PhotoDNA which includes Facebook and Google. I don't think
           | the database of known CSAM content includes photos that
           | cannot be visually recognized as child abuse.
        
             | tsimionescu wrote:
             | Why do you not think that? As far as I understand, there is
             | no procedure for reviewing the contents, it is simply a
             | database that law enforcement vouches is full of bad
             | images.
        
               | shuckles wrote:
               | NCMEC, not law enforcement, produces a list of embeddings
               | of known images of child abuse. Facebook and Google run
               | all photos uploaded to their platforms against this list.
               | Those which match are manually reviewed and if confirmed
               | to depict such scenes, are reported to CyberTip. If the
               | list had a ton of false positives, you think they
               | wouldn't notice that their human reviewers were spending
               | a lot of time looking at pictures of the sky?
        
               | suizi wrote:
               | It's well-known that this algorithm doesn't have a
                | perfect matching rate. It'd be easy to presume that any
                | false positives come from the error rate of the
                | underlying algorithm rather than from erroneously
                | tagged images, as if all the images were tagged
                | correctly. Who would know?
               | 
               | IIRC Wired reported the algorithm "PhotoDNA" worked
               | around 99% of the time a number of years ago, however
               | newer algorithms may be fuzzier. This is not the same
               | algorithm. And even "PhotoDNA" _appears_ to change over
               | time.
               | 
                | I doubt reviewers of such content are at liberty to
               | discuss what they see or don't with anyone here. Standard
               | confidentiality agreements.
        
             | cwkoss wrote:
             | > it would've already manifested
             | 
             | How do we know it hasn't? Maybe the people who have seen
              | these situations have received national security letters
             | that claim to prevent them from even talking to a lawyer
             | about the non-CSAM classified images they've seen in the
             | course of the investigation?
        
           | oh_sigh wrote:
           | "Reports from commenters" = unsubstantiated speculation.
           | Weird how no one was able to specifically state any
           | information about these unobjectionable photos except for a
           | theoretical mechanism for them to find their way into the
           | database.
        
             | feanaro wrote:
             | Some _have_ been able to give specific information about
             | these photos: https://news.ycombinator.com/item?id=28071047
             | 
             | Yes, of course, it's "unsubstantiated speculation", but how
             | do you suppose the speculation be substantiated if the
             | entire process and database are not available for audit?
             | That's exactly the problem with this.
        
           | neop1x wrote:
           | >> Who audits these databases?
           | 
           | Maybe pedophiles working for those companies.
        
           | Shank wrote:
           | > Who audits these databases? Where is the oversight to
           | guarantee only appropriate content is included? They are
           | famously opaque because the very viewing of the content is
           | illegal. So how can we know that they contain what they are
           | purported to contain?
           | 
           | I wholeheartedly agree: there is an audit question here too.
           | The contents of the database are by far the most dangerous
           | part of this equation, malicious or not, targeted or not. I
           | don't like the privacy implications about this, nor the
           | potential for abuse. I would love to see some kind of way to
           | audit the database, or ensure that it's only used "for good."
           | I just don't know what that system is, and I know that
           | PhotoDNA is already in use on other cloud providers.
           | 
           | Matthew Green's ongoing analysis [0] is really worth keeping
           | an eye on. For example, there's a good question: can you just
           | scan against a different database for different people? These
           | are the right questions given what we have right now.
           | 
           | [0]: https://twitter.com/matthew_d_green/status/1423378285468
           | 2091...
        
             | shuckles wrote:
             | Matt should read the release before live-tweeting FUD. The
             | database is shipped in the iOS image, per the overview, so
             | targeting users is not an issue (roughly).
        
               | pushrax wrote:
               | Is the database frozen, or can they push out updates
               | independently of iOS updates? If they can't, targeting
               | individual users definitely doesn't seem possible unless
               | you control OS signing.
        
               | shuckles wrote:
               | The database is shipped in the iOS image.
        
               | pushrax wrote:
               | That's what you wrote originally - and to me it doesn't
               | indicate whether it can also be updated from other
               | sources or not.
               | 
               | Lots of content is shipped in the iOS image but can be
               | updated independently.
        
               | shuckles wrote:
               | The technical summary provides a lot of detail. I don't
               | think Apple would omit remote update functionality from
               | it if such capability existed, especially since database
               | poisoning is a real risk to this type of program. I'm
               | comfortable with interpreting the lack of evidence as
               | evidence of absence of such a mechanism. Explicit
               | clarification would certainly help, but my
               | original point stands: there is positive evidence in the
               | docs which the FUD tweets don't engage with.
               | 
               | In particular, I'm referencing the figure which says that
               | the database of CSAM hashes is "Blinded and embedded"
               | into the client device. That does not sound like an asset
               | the system remotely updates.
        
               | marshf wrote:
               | Do you not see any scenario where the CIA/OGA inserts a
               | hash into the database to specially target one person or
               | a group of people?
        
               | shuckles wrote:
               | I agree database poisoning is a legitimate threat!
               | Including the database in an iOS release (so it can't be
               | targeted and updated out of band) mitigates it somewhat.
               | At the end of the day, though, more should be done to
               | make NCMEC's database transparent and trustworthy. And
               | other databases too, if Apple decides to ship country-
               | specific blacklists.
        
               | feanaro wrote:
               | I personally don't believe this process can be made to be
               | trustworthy enough while still serving its stated
               | purpose. It will always remain opaque enough that it
               | could and will be used to violate civil rights.
        
               | pseudalopex wrote:
               | You should understand the difference between a protocol
               | and an easy to change implementation detail before
               | throwing around words like FUD.
        
               | shuckles wrote:
               | Has Matt retracted any of the FUD from his tweets last
               | night that isn't true given the published details from
               | today? For example, his claim that the method is
               | vulnerable to black box attacks from GANs isn't
               | applicable to the protocol because the attacker can't
               | access model outputs.
               | 
               | Furthermore, if "an easy to change implementation
               | detail" in your threat model is anything that could be
               | changed by an iOS update, you should've stopped using
               | the iPhone about 14 years ago.
        
               | [deleted]
        
             | cwkoss wrote:
             | Do the people authorized to query the database have access
             | to view its contents?
             | 
             | How many people in the world can certify that it only
             | contains CSAM? I wonder if there are images of US troops
             | committing war crimes, or politicians doing cocaine, or
             | honeypot intel op blackmail images in there too. Lots of
             | powerful people would love to have an early warning system
             | for leaks of embarrassing non-CSAM images.
        
         | indymike wrote:
         | > backlash is daft
         | 
         | Fighting to preserve a freedom is not daft, even if it is David
         | vs. Goliath's bigger, meaner brother and his friends.
        
         | [deleted]
        
         | throwaway888abc wrote:
         | 1. was new to me.
         | 
         | TIL - (2014) PhotoDNA Lets Google, FB and Others Hunt Down
         | Child Pornography Without Looking at Your Photos
         | 
         | https://petapixel.com/2014/08/08/photodna-lets-google-facebo...
        
         | shivak wrote:
         | > recruited mathematicians to analyze it, and published the
         | results, as well as one in-house proof and one independent
         | proof showing the cryptographic integrity of the system.
         | 
         | Apple employs cryptographers, but they are not necessarily
         | acting in your interest. Case in point: their use of private
         | set intersection, to preserve privacy... of law enforcement, not
         | users. Their less technical summary:
         | 
         | > _Instead of scanning images in the cloud, the system performs
         | on-device matching using a database of known CSAM image hashes
         | provided by NCMEC and other child safety organizations. Apple
         | further transforms this database into an unreadable set of
         | hashes that is securely stored on users' devices._
         | 
         | > _Before an image is stored in iCloud Photos, an on-device
         | matching process is performed for that image against the known
         | CSAM hashes. This matching process is powered by a
         | cryptographic technology called private set intersection.._
         | 
         | The matching is performed on device, so the user's privacy
         | isn't at stake. But, thanks to PSI and the hash preprocessing,
         | the user doesn't know what law enforcement is looking for.
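         | 
         | For readers unfamiliar with PSI, here is a toy Diffie-
         | Hellman-style sketch of the general idea. This is not Apple's
         | actual construction (which differs in detail and layers
         | threshold secret sharing on top); every name and parameter
         | below is made up for illustration:
         | 
         |     # Toy commutative-blinding PSI: each side exponentiates
         |     # hashes with its own secret, so matching happens on
         |     # doubly blinded values and neither side ever sees the
         |     # other's raw hashes.
         |     import hashlib, secrets
         | 
         |     P = 2**127 - 1  # Mersenne prime; real PSI uses curves
         | 
         |     def h(item):
         |         d = hashlib.sha256(item.encode()).digest()
         |         return int.from_bytes(d, "big") % P
         | 
         |     a = secrets.randbelow(P - 2) + 1  # server secret
         |     b = secrets.randbelow(P - 2) + 1  # client secret
         | 
         |     db = {"bad1", "bad2"}        # stand-in database hashes
         |     photos = {"bad2", "kitten"}  # stand-in user hashes
         | 
         |     # (h(x)^a)^b == (h(x)^b)^a mod P, so equality survives
         |     # double blinding while raw hashes stay hidden.
         |     db_ab = {pow(pow(h(x), a, P), b, P) for x in db}
         |     ph_ba = {pow(pow(h(x), b, P), a, P) for x in photos}
         |     print(len(db_ab & ph_ba))  # -> 1 match, nothing more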
        
           | xondono wrote:
           | Well, it'd be kind of dumb to make the mistake of building a
           | system to stop child pornography only to have it become the
           | biggest distributor of CP photos in history.
        
             | shivak wrote:
             | Those images are hashed, not transmitted in original
             | format. On top of that, PSI prevents you from learning
             | those hashes, or how many there are. So you can't tell if
             | the database contains the hash of, say, tank-man.jpg.
             | 
             | I understand why this shielding is necessary for the system
             | to work. My point is the crypto is being used to protect
             | law enforcement, not the user.
        
               | xondono wrote:
               | And my point is that the only way to provide visibility
               | into what is being looked for, without distributing the
               | material, would be to implement some type of ZKP.
        
         | wayneftw wrote:
         | This is an abuse of my property rights. The device is my property
         | and this activity will be using my CPU, battery time and my
         | network bandwidth. That's the abuse right there.
         | 
         | They should just use their own computers to do this stuff.
        
           | 8note wrote:
           | You chose an Apple device because Apple knows what's best
           | for you.
           | 
           | This is part of the integrated experience
        
           | samatman wrote:
           | Photos is just an app.
           | 
           | You can use another photo app, link it to another cloud
           | provider, and be free of the burden.
           | 
           | If you use Photos, you're along for the ride, and you've
           | consented to whatever it does.
           | 
           | You don't get a line-item veto on code you choose to run,
           | that's never been how it works.
           | 
           | For what it's worth, I'm basically with the EFF on this: it
           | looks like the thin end of a wedge, it sucks and I'm not
           | happy about it.
           | 
           | But being histrionic doesn't help anything.
        
             | zionic wrote:
             | No it's not, it's the entire OS.
        
             | jimbob45 wrote:
             | I don't know how true this is. I don't see any way to
             | block Photos from viewing the files on this device, and I
             | see no reason to think it can't read files from my other
             | apps.
        
             | wayneftw wrote:
             | Histrionics? Did I even use any adjectives?
             | 
             | > it looks like the thin end of a wedge, it sucks and I'm
             | not happy about it.
             | 
             | But you just said that you can disable it. So, you can be
             | upset about a _possible_ future, but I'm the one being
             | melodramatic? That's a good one! You chose to run the OS.
             | So by your logic that gives you no right to complain
             | either.
             | 
             | I do understand that I can disable iCloud for my photos and
             | I will do that. We'll see how long that lasts until they
             | decide to tie the next feature and the next one and the
             | next one to something that I don't like. Because that's how
             | this works. Every time they do something that they know
             | people won't like, they simply make it so that you lose
             | access to something else if you decide to stand up for your
             | principles.
        
           | jdavis703 wrote:
           | Then you have two choices: disable iCloud photo backups or
           | don't upgrade to iOS 15. There are plenty of arguments
           | against Apple's scheme, but this isn't one of them.
        
             | wayneftw wrote:
             | It's the best argument. Every other argument is based on a
             | slippery-slope what-if scenario.
             | 
             | Also, I pay for iCloud... but they're not paying me for
             | using my phone and bandwidth. I never agreed to that.
             | 
             | They can't just pull the rug out from me after we already
             | have an agreement. I mean they can, because I probably
             | "agreed" to it and some fine print garbage EULA, but those
             | fall apart in a court of law.
        
       | Sunspark wrote:
       | This is going to do wonders for Apple's market share once the
       | teenagers realize that Apple is going to be turning them in to
       | the police.
       | 
       | Teens are not stupid. They'll eventually clue in that Big Brother
       | is watching and won't appreciate it. They'll start by using other
       | messengers instead of imessage and then eventually leaving the
       | ecosystem for Android or whatever else comes down the pike in the
       | future.
        
         | r00fus wrote:
         | Apple's definition of "child" is 13yo or younger. So by the
         | time they're likely to be complaining about this feature,
         | they will have aged out.
         | 
         | I'd like to get verification, but that hopefully means your
         | scenario is unlikely.
        
       | Calvin02 wrote:
       | I think the issue is that what the tech community sees as
       | privacy is different from what the general public thinks of as
       | privacy.
       | 
       | Apple, very astutely, understands that difference and exploited
       | the latter to differentiate its phones from its main competitor:
       | cheap(er) Android phones.
       | 
       | Apple didn't want the phones to be commoditized, like personal
       | computers before it. And "privacy" is something that you can't
       | commoditize. Once you own that association, it is hard to fight
       | against it.
       | 
       | Apple also understands that the general public will support its
       | anti-child-exploitation efforts and will not see this as a
       | violation of privacy.
        
       | contingencies wrote:
       | Never buying another Apple product.
        
       | [deleted]
        
       | sadness3 wrote:
       | For me, this crosses a line. There should be no need to "strike a
       | balance" with authorities wanting what are essentially
       | unwarranted searches. The right balance is, "fuck off".
       | 
       | I'm looking into privacy phones for the first time and will be
       | switching.
        
       | etempleton wrote:
       | I think this is probably the reasonable and responsible thing for
       | Apple to do as a company, even if it goes against their
       | privacy ethos. Honestly they probably have been advised by their
       | own lawyers that this is the only way to cover themselves and
       | protect shareholder value.
       | 
       | The question will be whether Apple will bend to requests to leverage
       | this for other reasons less noble than the protection of
       | children. Apple has a lot of power to say no right now, but they
       | might not always have that power in the future.
        
       | websites2023 wrote:
       | Apple's battle is against Surveillance Capitalism, not against
       | state-level surveillance. In fact, there is no publicly traded
       | company that is against state-level surveillance. It's important
       | not to confuse the two.
       | 
       | Think of it this way: If you want to hide from companies, choose
       | Apple. If you want to hide from the US Government, choose open
       | source.
       | 
       | But if your threat model really does include the US government or
       | some other similarly capable adversary, you are well and truly
       | fucked already. The state-level apparatus for spying on folks
       | through metadata and traffic interception is now more than a
       | decade old.
        
         | krrrh wrote:
         | The problem is that as governments gain access to new
         | technological capabilities and exploit crises to acquire more
         | emergency powers, increasingly large numbers of people's threat
         | models begin to include government.
         | 
         | The best hopes against a population-wide Chinese-style social
         | credit system being implemented in the US remain constitutional
         | and cultural, but the more architectural help we get from
         | technology the better. "Code is law" is still a valid
         | observation.
        
           | websites2023 wrote:
           | The US has a rather weak central government. The Chinese-
           | style social credit system isn't necessary, because private
           | corporations already do it. Scanning your ID when you return
           | things, advertising profiles, etc.
        
         | [deleted]
        
         | tablespoon wrote:
         | > Think of it this way: If you want to hide from companies,
         | choose Apple. If you want to hide from the US Government,
         | choose open source.
         | 
         | It's not just the US government: they've been cooperating with
         | the PRC government as well (e.g. iCloud in China runs on
         | servers owned by a state-owned company, and apparently China
         | rejected the HSM Apple was using elsewhere, so they designed
         | one specifically for China). Apple has some deniability there,
         | but I personally wouldn't be surprised if China could get any
         | data from them that it wanted.
         | 
         | https://www.nytimes.com/2021/05/17/technology/apple-china-ce...
        
           | websites2023 wrote:
           | Both the US government and Chinese government can get
           | whatever they want from both iCloud and iMessage. Best not to
           | use it for anything that could make you a target of theirs.
        
       | tomxor wrote:
       | I keep thinking it's like they are _trying_ to be the most
       | ironic company in history...
       | 
       | But then I have to remind myself, the old Apple is long gone, the
       | new Apple is a completely different beast, with a very different
       | concept of what it is marketing.
        
         | amelius wrote:
         | It's the RDF. People still think of Apple as the Old Apple. The
         | rebellious company that stood for creative freedom. The maker
         | of tools that work _for_ the user, not _against_ the user.
        
       | jra_samba wrote:
       | Sorry Apple fans, but you have been living in the very definition
       | of "The Hotel California".
       | 
       | Apple has altered the deal. Pray they do not alter it any
       | further.
       | 
       | Now you have to live with the consequences of convenience.
        
       | endisneigh wrote:
       | Unless the entire stack you're using is audited and open source
       | this sort of thing is inevitable.
       | 
       | As far as this is concerned, seems like if you don't use iMessage
       | or iCloud you're safe for now.
        
         | _red wrote:
         | >don't use iMessage
         | 
         | 1. Send someone you hate a message with a cartoon making fun
         | of the tyrant-president.
         | 
         | 2. That person is now on a list.
         | 
         | It's swatting-as-a-service.
        
           | ezfe wrote:
           | If you read the article, you'd understand that among ALL the
           | issues, this is not one:
           | 
           | - Photos scanning in Messages is on-device only (no reporting
           | to govt.) and doesn't turn on unless you're an adult who
           | turns it on for a minor via Family Sharing controls.
           | 
           | - iCloud Photos scanning doesn't take effect unless you save
           | the photo and it's already in a database of flagged photos.
           | So in your scenario, you'd have to save the photo received
           | from the unknown number to get flagged.
        
             | _red wrote:
             | >So in your scenario, you'd have to save the photo received
             | from the unknown number to get flagged.
             | 
             | Whew! I was worried there for a minute. Maybe for extra
             | safety I could say "SIRI I DISAVOW OF THIS MESSAGE!"??
        
               | bingidingi wrote:
               | would you not report unsolicited child porn to the FBI
               | anyway?
        
               | samatman wrote:
               | Y'know, I have no idea what I'd do in this situation and
               | I really hope I'll never find out.
               | 
               | If a kilo of heroin just showed up in the back seat of my
               | car, I'd throw it out the window and try not to think
               | about it. I certainly wouldn't bring it to the police,
               | because _mere possession is a serious crime_.
               | 
               | CP is the same way, except it comes with a nice audit
               | trail which could sink me even if I delete it
               | immediately. Do I risk that, or do I risk the FBI
               | deciding I'm a Person of Interest because I reported the
               | incident in good faith?
               | 
               | There are no good choices there.
        
               | layoutIfNeeded wrote:
               | _Don't talk to the police._
               | 
               | And that includes the FBI.
        
               | tsimionescu wrote:
               | The scan doesn't detect child porn; it detects photos in
               | the CSAM database. The two may or may not be the same
               | thing, now or in the future.
        
             | lijogdfljk wrote:
             | I'm confused - the article explicitly states this scenario
             | - minus the swatting.
             | 
             | I.e., unless you're replying purely to the swatting part,
             | the article seems to support this. Specifically, it
             | predicts that governments will creep toward legally
             | requiring Apple to push custom classifiers:
             | 
             | > Apple's changes would enable such screening, takedown,
             | and reporting in its end-to-end messaging. The abuse cases
             | are easy to imagine: governments that outlaw homosexuality
             | might require the classifier to be trained to restrict
             | apparent LGBTQ+ content, or an authoritarian regime might
             | demand the classifier be able to spot popular satirical
             | images or protest flyers.
        
               | ezfe wrote:
               | That sentence is wrong. It simply doesn't describe the
               | current system accurately. It relies on future changes
               | to the system, not just changes to a database.
               | 
               | The iMessage feature is not a database-comparison
               | system; it's there to keep kids from receiving nudes
               | unexpectedly - and it works by classifying those images.
               | 
               | I don't dispute this is a slippery slope - one could
               | imagine that a government requires Apple to modify its
               | classification system. However, that would presumably
               | require a software update, since it happens on device.
        
               | arvinsim wrote:
               | So are you prepared to never update your device?
        
               | xondono wrote:
               | That refers to the iCloud scanning, the idea being that
               | if the hash database contains propaganda, people
               | uploading that propaganda to iCloud could get reported
               | by their own device.
        
             | arihant wrote:
             | Didn't Apple also announce a feature for iOS 15 where
             | iMessage photos are somehow automatically collected and
             | shown in iCloud? A way to reduce the hassle of creating
             | shared albums. With that, I think all users of iCloud
             | Photos are at risk here.
        
         | ncw96 wrote:
         | > As far as this is concerned, seems like if you don't use
         | iMessage or iCloud you're safe for now.
         | 
         | Yes, this is correct. The Messages feature only applies to
         | children under 18 who are in an iCloud Family, and the photo
         | library feature only applies if you are using iCloud Photos.
        
           | withinboredom wrote:
           | I'm fairly certain the age is different per region and
           | hopefully tied to the age of consent (in this particular
           | case).
        
             | rootusrootus wrote:
             | I don't think it has anything to do with age. It has
             | everything to do with you adding the phone to your family
             | under settings and declaring that it belongs to a child.
             | You control the definition of child.
        
               | jdavis703 wrote:
               | I could imagine an abusive partner enabling this to make
               | sure their partner isn't sexting other people. Given the
                | pushback against AirTags, I'm surprised people aren't more
               | concerned.
        
               | endisneigh wrote:
               | You're misunderstanding what this is if this is an actual
               | concern of yours.
        
               | jdavis703 wrote:
               | I'm not sure I'm misunderstanding. This is another
               | feature that allows someone with access to another
               | person's phone to enable stalkerware like features.
        
               | rootusrootus wrote:
               | Anyone 13 or older can remove themselves from a family
               | sharing group. The only exception is if screen time is
               | enabled and enforced for their device.
               | 
               | Frankly, if you have an abusive partner with physical
               | control over you and a willingness to do this, the fact
               | that Apple supports this technology is the _least_ of
               | your problems.
        
               | xondono wrote:
               | Except this would require the consent of the abused
               | partner when creating the account, to set an age <13yo.
               | 
               | You can't set this on other accounts in your family
               | remotely.
        
           | system2 wrote:
           | Ha ha. They have fully functional spying software installed
           | on the phone, and the government will stop at these
           | restrictions?
        
             | Draken93 wrote:
             | Oh come on, you really think that's their big plan?
             | Announcing the scanning software in public and then
             | abusing it? If they want to do illegal spying, they'll do
             | it right. And without a second Snowden you will not hear
             | about it.
        
           | josh_today wrote:
           | Would artificially inflating every child's age to 18+
           | eliminate the iMessage problem?
        
             | latexr wrote:
             | Ending of fourth paragraph:
             | 
             | > This feature can be turned on or off by parents.
        
         | [deleted]
        
       | didibus wrote:
       | I have a question, does this mean that Apple will have a way to
       | decrypt photos in iCloud?
       | 
       | It seems this could be a security risk, since Apple could be
       | breached, and they'd have the means to decrypt things server-
       | side.
       | 
       | If it were simply that client-side end-to-end encryption can be
       | turned on/off based on whether the account is a child account
       | (or as a configuration for parental control), that'd be
       | different.
       | 
       | As just a config, the slippery slope always existed: Apple
       | could always be forced into changing the settings of what gets
       | end-to-end encrypted and when.
       | 
       | But if this means that all photos are sent unencrypted to Apple
       | at some point, or sent to Apple in a way they can decrypt, then
       | it does open the door to your photos not being securely stored
       | and attackers being able to steal them. That seems a bit of an
       | issue.
        
         | hu3 wrote:
         | I hate to break it to you but Apple backtracked from their plan
         | to e2e encrypt iCloud backups. Allegedly after being pressured
         | by the FBI: https://www.bbc.com/news/technology-51207744
         | 
         | They have the encryption key that allows them to read their
         | customer data.
        
       | Animats wrote:
       | Is Apple under some legal pressure to do this? Is there some kind
       | of secret deal here: "put in spyware and we back off on
       | antitrust?"
        
         | fragileone wrote:
         | For years now congressmen have said stuff along the lines of
         | "exceptional access to encrypted content for law enforcement",
         | i.e. a backdoor. This is Apple pre-empting any more
         | legislation like Australia's, Germany's, and Ireland's recent
         | privacy-violating laws, so that governments can just ask Apple
         | to add XYZ prohibited content to their client-side scanner.
        
       | edison112358 wrote:
       | "This means that when the features are rolled out, a version of
       | the NCMEC CSAM database will be uploaded onto every single
       | iPhone."
       | 
       | So every iPhone will now host the explicit images from the
       | National Center for Missing & Exploited Children database.
        
         | spiznnx wrote:
         | The database contains perceptual hashes, not images.
        
         | pgoggijr wrote:
         | No, they will host the hashes computed from those images.
        
         | hartator wrote:
         | Yes, everyone in jail! It's probably just an MD5 hash or
         | something like that, but I don't like it either.
        
         | joshstrange wrote:
         | > So every iPhone will now host the explicit images from the
         | National Center for Missing & Exploited Children database.
         | 
         | It's hashes, not the images themselves.
        
           | cblconfederate wrote:
           | And how did the user end up with the hashes? He hashed the
           | original images which he then deleted, your honor!
           | 
           | BTW this is going to be a major target for smearing people
           | that the US doesn't like.
        
             | joshstrange wrote:
             | I'm sorry but this is the most ridiculous thing I've read
             | today. Hashes have never been and probably will never be
             | used to "smear" someone the US doesn't like. We can
             | speculate about
             | them planting evidence but trying to prosecute based on
             | hashes baked into the OS used by millions? That's absurd.
        
         | kevincox wrote:
         | I'm pretty sure this is a non-tech way of saying "a machine
         | learning model" or other parameters which is not a particularly
         | useful form of this database.
        
         | artimaeis wrote:
         | > No user receives any CSAM photo, not even in encrypted form.
         | Users receive a data structure of blinded fingerprints of
         | photos in the CSAM database. Users cannot recover these
         | fingerprints and therefore cannot use them to identify which
         | photos are in the CSAM database.
         | 
         | Source (PDF): https://www.apple.com/child-
         | safety/pdf/Technical_Assessment_...
        
         | zionic wrote:
         | How long until a hacker uses ML to generate collisions against
         | those hashes?
        
           | outworlder wrote:
           | For what purpose? A collision doesn't mean that you found the
           | source images. Not even close.
        
             | __david__ wrote:
             | Find collisions, spam the colliding photos to people you
             | don't like, watch the mayhem unfold.
        
             | acdha wrote:
             | With a broader rollout to all accounts and simply scanning
             | in iMessage rather than photos, there's one possible
             | scenario if you could generate images which were plausibly
             | real photos: spam them to someone before an election, let
             | friendly law enforcement talk about the investigation, and
             | let them discover how hard it is to prove that you didn't
             | delete the original image which was used to generate the
             | fingerprint. Variations abound: target that teacher who
             | gave you a bad grade, etc. The idea would be credibility
             | laundering: "Apple flagged their phone" sounds more like
             | there's something there than, say, a leak to the tabloids
             | or a police investigation run by a political rival.
             | 
             | This is technically possible now but requires you to
             | actually have access to seriously illegal material. A
             | feasible collision process would make it a lot easier for
             | someone to avoid having something which could directly
             | result in a jail sentence.
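             | 
             | To make the attack shape concrete, here's a toy search
             | for an innocuous input that matches a target hash
             | (technically a second preimage, which is what the swatting
             | scenario needs), against a deliberately weakened 16-bit
             | stand-in hash. Real attacks on a NeuralHash-style system
             | would likely use gradient methods against the model rather
             | than brute force, and a full-length hash can't be brute-
             | forced like this; the sketch only shows why a feasible
             | collision process changes the economics:
             | 
             |     # Toy match search; 16 bits only, for speed.
             |     import hashlib, secrets
             | 
             |     def weak_hash(data):
             |         return hashlib.sha256(data).digest()[:2]
             | 
             |     target = weak_hash(b"flagged-image")  # stand-in
             |     tries = 0
             |     while True:
             |         tries += 1
             |         cand = secrets.token_bytes(32)  # "innocuous" bytes
             |         if weak_hash(cand) == target:
             |             break
             |     print(f"match after {tries} tries")  # ~65,536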
        
             | octopoc wrote:
             | So you can upload the colliding images to iCloud and get
             | yourself reported for having child porn. Then after the law
             | comes down on you, you can prove that you didn't ever have
             | child porn. And you can sue Apple for libel, falsely
             | reporting a crime, whatever else they did. It would be a
             | clever bit of tech activism.
        
             | bingidingi wrote:
             | I guess in theory you could poison the well by widely
             | sharing many false positives?
        
             | ursugardaddy wrote:
             | Improved swatting; it's all going to make for a badass
             | October surprise next election.
        
           | bjustin wrote:
           | There is a minimum number of hash matches required; then the
           | images are made available to Apple, which manually checks
           | that they are CSAM material and not just collisions. That's
           | what the 9to5Mac story about this says:
           | https://9to5mac.com/2021/08/05/apple-announces-new-
           | protectio...
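           | 
           | The threshold itself can be enforced cryptographically
           | rather than by policy. A toy Shamir-style sketch of the
           | idea (Apple's actual threshold secret sharing construction
           | differs; every number below is made up):
           | 
           |     # Toy Shamir threshold: the reviewer key only becomes
           |     # recoverable once T matched photos have each revealed
           |     # a share. Illustrative only.
           |     import secrets
           | 
           |     P = 2**61 - 1  # prime field
           |     T = 3          # assumed match threshold
           | 
           |     def shares(secret, n):
           |         c = [secret] + [secrets.randbelow(P)
           |                         for _ in range(T - 1)]
           |         def f(x):
           |             return sum(ci * pow(x, i, P)
           |                        for i, ci in enumerate(c)) % P
           |         return [(x, f(x)) for x in range(1, n + 1)]
           | 
           |     def recover(pts):
           |         # Lagrange interpolation at x=0 over the field.
           |         s = 0
           |         for j, (xj, yj) in enumerate(pts):
           |             num = den = 1
           |             for m, (xm, _) in enumerate(pts):
           |                 if m != j:
           |                     num = num * -xm % P
           |                     den = den * (xj - xm) % P
           |             s = (s + yj * num * pow(den, P - 2, P)) % P
           |         return s
           | 
           |     key = secrets.randbelow(P)
           |     sh = shares(key, 10)  # one share per matched photo
           |     assert recover(sh[:T]) == key      # at threshold
           |     assert recover(sh[:T - 1]) != key  # below it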
        
       | strogonoff wrote:
       | If Mallory gets a lawful citizen Bob to download a completely
       | innocuous-looking but perceptual-CSAM-hash-matching image to
       | his phone, what happens to Bob? I imagine the following
       | options:
       | 
       | - Apple sends Bob's info to law enforcement; Bob is swatted or
       | his life is destroyed in some other way. Worst, but most likely
       | outcome.
       | 
       | - An Apple employee (or an outsourced contractor) reviews the
       | photo, comparing it to the CSAM source image used for the hash.
       | Bob is swatted only if the image matches according to human
       | vision. This requires there to be some sort of database of CSAM
       | source images, which strikes me as unlikely.
       | 
       | - An Apple employee or a contractor reviews the image for abuse
       | without comparing it to the CSAM source, using their own
       | subjective judgement. Better, but implies Apple employees could
       | technically SWAT Apple users.
        
         | strogonoff wrote:
         | From BBC's article:
         | 
         | > Apple says that it will manually review each report to
         | confirm there is a match. It can then take steps to disable a
         | user's account and report to law enforcement.
         | 
         | So at least it's the last option.
        
         | bitexploder wrote:
         | Do we know that they are using perceptual hashing? I am curious
         | about the details of the hash database they are comparing
         | against, but I assumed perceptual hashing would be pretty
         | fraught with edge cases and false positives.
         | 
         | e: It is definitely not a strict/cryptographic hash algorithm:
         | "Apple says NeuralHash tries to ensure that identical and
         | visually similar images -- such as cropped or edited images --
         | result in the same hash." They are calling it "NeuralHash" --
         | https://techcrunch.com/2021/08/05/apple-icloud-photos-scanni...
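         | 
         | For intuition about "visually similar images result in the
         | same hash", here's the classic "average hash" - far cruder
         | than NeuralHash, but it shows why small edits barely move a
         | perceptual hash while any edit completely changes a
         | cryptographic one. The file names are hypothetical:
         | 
         |     # Minimal average-hash sketch (requires Pillow).
         |     from PIL import Image
         | 
         |     def ahash(path, size=8):
         |         # Downscale to 8x8 grayscale; bit i is set when
         |         # pixel i is at or above the mean brightness.
         |         img = Image.open(path).convert("L")
         |         px = list(img.resize((size, size)).getdata())
         |         mean = sum(px) / len(px)
         |         bits = (1 << i for i, p in enumerate(px) if p >= mean)
         |         return sum(bits)
         | 
         |     def hamming(a, b):
         |         return bin(a ^ b).count("1")
         | 
         |     # Hypothetical files: a photo and a re-encoded crop.
         |     # A small Hamming distance means "perceptually the same".
         |     # hamming(ahash("cat.jpg"), ahash("cat_crop.jpg"))  # ~0-5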
        
         | anonuser123456 wrote:
         | Downloading an image to your phone is different from
         | uploading it to iCloud.
         | 
         | Downloaded images are not uploaded to iCloud w/out user
         | intervention.
        
           | 8note wrote:
           | Today, sure. That's technically very easy to change
        
           | strogonoff wrote:
           | Presuming iCloud Photos is enabled by Bob, an unsuspecting
           | citizen, all downloaded images are synced to iCloud either
           | right away or next time on Wi-Fi, depending on settings.
        
             | pzo wrote:
             | True. On top of that, people should now start worrying
             | that some person in a WhatsApp/Telegram group will share
             | some illegal file.
             | 
             | I have been part of a few public hiking/travelling
             | WhatsApp/Telegram groups. Many times I mute those groups
             | because I don't want to be notified and distracted by
             | every single message. All the photos someone shares in any
             | WhatsApp group will end up in your iPhone's 'Recents'
             | album automatically, even if you muted the group many
             | months ago and didn't check it at all.
        
               | strogonoff wrote:
               | Well, at least this feature could be disabled in WhatsApp
               | settings as far as I know.
               | 
               | Either way, since my original post I've read that the way
               | Apple does it is by reviewing the offending material in-
               | house first before notifying any third party (like the
               | government), meaning it wouldn't be as easy to SWAT a
               | random person just like that.
        
         | [deleted]
        
       | walterbell wrote:
       | Now that we know iPhones have the ability to perform frame-level,
       | on-device PhotoDNA hashing of videos and photos, could the same
       | infrastructure be used to identify media files which are
       | attempting to exploit the long list of buffer overflows that
       | Apple has patched in their image libraries, as recently as
       | 14.7.1?
       | 
       | This would be super useful for iPhone security, e.g. incoming
       | files could be scanned for attempting to use (closed) exploits,
       | when the user can easily associate a malicious media file with
       | the message sender or origin app/site.
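       | 
       | A toy sketch of what such signature scanning could look like.
       | The signature, its name, and the file are hypothetical, and
       | real exploit payloads often have no stable byte pattern, which
       | is part of why this is hard in practice:
       | 
       |     # Hypothetical signature scan of an incoming media file.
       |     KNOWN_BAD = {
       |         "hypothetical-overflow-poc": bytes.fromhex("deadbeef"),
       |     }
       | 
       |     def scan(path):
       |         with open(path, "rb") as f:
       |             data = f.read()
       |         return [n for n, sig in KNOWN_BAD.items()
       |                 if sig in data]
       | 
       |     # scan("incoming.jpg") -> ["hypothetical-overflow-poc"]
       |     # if the known pattern appears anywhere in the file.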
       | 
       | On jailbroken devices (e.g. iPhone 7 and earlier with unpatchable
       | boot ROMs), is there a Metasploit equivalent for iOS, which
       | aggregates PoCs for public exploits?
       | 
       | A related question: will PhotoDNA hashing take place continuously
       | or in batch, e.g. overnight? How will associated Battery/Power
       | usage be accounted for, e.g. attributed to generic "System"
       | components or itemized separately? If the former, does that
       | create class-action legal exposure for a post-sale change in
       | device "fitness for purpose"?
        
       | citboin wrote:
       | All of my hardware is outdated so I was about to make the jump to
       | Apple all across the board. Now I'm probably going to dive into
       | the deep end and go into FOSS full throttle. I'm going to
       | investigate Linux OEM vendors tonight. The only one that I know
       | of is System76. Are there any Linux-based iPad competitors?
        
         | bishoprook2 wrote:
         | That's a great question. I keep looking for Linux tablets but
         | not much joy so far. The PineTab is unavailable and pretty
         | slow.
         | 
         | If I had to guess, I'd hunt around for a Windows tablet that
         | someone had good luck running Linux on. Maybe a Surface Pro.
        
         | physicles wrote:
         | ThinkPads also run Linux very well. I've got an X1 Carbon 7th
         | gen running Pop!_OS, and everything on the machine works,
         | including the fancy microphones on top (haven't tried the
         | fingerprint reader though).
        
       | literallyaduck wrote:
       | It is okay to use the back door when we want to find people who
       | are:
       | 
       | terrorists
       | 
       | exploiting children
       | 
       | not vaccinated
       | 
       | using the wrong politically correct language
       | 
       | doing anything else we don't like
        
       | Drblessing wrote:
       | Use Signal, y'all.
        
         | system2 wrote:
         | Anyone who thinks the apps they use make any difference is
         | super naive. They are literally installing a trojan on the
         | phone.
        
       | young_unixer wrote:
       | Lately, I've been on the fence about open source software, and
       | I've been tempted by proprietary programs. Mainly because FOSS is
       | much less polished than commercial closed-source software, and I
       | care about polish. I even contemplated buying an Apple M1 at some
       | point.
       | 
       | But now I'm reminded of how fucking awful and hostile Apple and
       | other companies can be. I'm once again 100% convinced that free
       | software is the only way to go, even if I have to endure using
       | software with ugly UIs and bad UX. It will be worth it just not
       | to have to use software written by these assholes.
       | 
       | Stallman was right.
        
         | mulmen wrote:
         | Give as much money to your favorite open source project as you
         | would have to Apple for the M1. Polish costs money but it
         | doesn't have to cost freedom.
        
       | slaymaker1907 wrote:
       | I'd be surprised if this goes through as is, since you can't
       | just save this stuff indefinitely. Suppose a 14-year-old sexts
       | a 12-year-old. That is technically child porn, and so retention
       | is often illegal.
        
       | outworlder wrote:
       | > these notifications give the sense that Apple is watching over
       | the user's shoulder--and in the case of under-13s, that's
       | essentially what Apple has given parents the ability to do.
       | 
       | Well, yes? Parents are already legally responsible for their
       | young children and under their supervision. The alternative would
       | be to not even give such young children these kinds of devices to
       | begin with - which might actually be preferable.
       | 
       | > this system will give parents who do not have the best
       | interests of their children in mind one more way to monitor and
       | control them
       | 
       | True. But the ability to send or receive explicit images would
       | most likely not be the biggest issue they would be facing.
       | 
       | I understand the slippery slope argument the EFF is making, but
       | they should keep to the government angle. Having the ability for
       | governments to deploy specific machine learning classifiers is
       | not a good thing.
        
       ___________________________________________________________________
       (page generated 2021-08-06 23:03 UTC)