[HN Gopher] A Message to Our Customers (2016)
___________________________________________________________________
A Message to Our Customers (2016)
Author : AnotherTechie
Score : 87 points
Date : 2021-08-10 20:15 UTC (2 hours ago)
(HTM) web link (www.apple.com)
(TXT) w3m dump (www.apple.com)
| GeekyBear wrote:
| There is a story with information straight from the horse's
| mouth on how Apple will approach this, as opposed to how Google,
| Microsoft, Facebook, Twitter and the rest already do.
|
| https://techcrunch.com/2021/08/10/interview-apples-head-of-p...
|
| Highlights:
|
| Unlike Google, Microsoft, Facebook, and the rest, Apple has not
| been scanning your online data (iCloud) for the past decade.
|
| When this is turned on, only images you attempt to upload to
| iCloud will be scanned.
|
| If you turn off photo syncing to iCloud, nothing will be
| scanned.
|
| If photo scanning shows that many images on your device (not
| just one) match known kiddie porn images, a human will review
| the data to determine whether passing it on to the authorities
| is warranted or whether there have been multiple false
| positives.
|
| If multiple images do not match known kiddie porn images, nothing
| happens.
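The thresholding behavior described above can be sketched as follows. This is a hypothetical illustration only: Apple's actual system matches perceptual NeuralHash values with cryptographic safety vouchers, whereas plain SHA-256 stands in here just to show the counting logic.

```python
import hashlib

# Hypothetical stand-in for the known-image database; the real
# system matches perceptual hashes, not exact cryptographic ones.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

REVIEW_THRESHOLD = 2  # more than one match is required

def count_matches(images):
    """Count how many image payloads match the known-hash set."""
    return sum(
        hashlib.sha256(img).hexdigest() in KNOWN_HASHES
        for img in images
    )

def triggers_human_review(images):
    # Below the threshold, nothing is flagged or reported at all.
    return count_matches(images) >= REVIEW_THRESHOLD
```

A single matching image stays below the threshold and triggers nothing; only multiple matches would reach a human reviewer.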
| starkd wrote:
| I don't think the concern is scanning and uploading images.
| This could be justified in some cases. But once you open this
| door, it can easily expand to scan for other things. And people
| don't always read the notices for every update.
| GeekyBear wrote:
| We already know that Apple will publicly fight Government
| demands to break device encryption.
|
| They have already proven that.
| idunnoman wrote:
| They previously said they _couldn't_ comply. Now they will
| have to say they _won't_ comply.
| GeekyBear wrote:
| They can't comply now.
|
| Unlike Google, they set up a system where no data hits
| Apple's server unless multiple images match known
| examples of kiddie porn.
| Croftengea wrote:
| The key word is _publicly_... Or did I miss the <sarcasm> tags?
| GeekyBear wrote:
| Did I miss the sarcasm tag when it is implied that Google
| is immune to government demands in a manner that Apple is
| not?
| gigel82 wrote:
| Google does not have the capability to scan local files
| on your device. They still have the luxury of answering
| "we can't" to such demands.
|
| But Google steps on your privacy in so many other ways,
| it's probably not worth defending this one technicality.
| GeekyBear wrote:
| Google conducts the exact same scans on their own
| servers, which means the results for a single false
| positive must be handed over to anyone with a subpoena.
|
| With Apple's system, no data hits their servers until
| multiple images match known examples of kiddie porn.
|
| If there is a single false positive, Apple won't even
| know about it.
|
| You can't provide data you never had, so Apple's system
| is much more private.
| ByteWelder wrote:
| They cannot publicly fight FISA court gag orders. They
| can't even so much as mention those.
| firebaze wrote:
| I'll never buy anything from Apple again. To me, Apple was the
| walled but well-intentioned garden, earning its profits by
| respecting its customers and taking a stance against widespread
| anti-democratic tendencies. I own an Apple Watch, two iPhones,
| and two MacBook Pros (one privately owned, one from my current
| employer).
|
| The selling points of Apple, to me, were excellent hardware
| combined with excellent software, combined with a guarantee to
| protect my privacy.
|
| The first point still holds true, the second not so much
| anymore, and the third was destroyed by this most recent move.
|
| My stance will have a ripple effect: I convinced quite a few
| people to use Apple, if they could afford it, because of the
| company's general stance and its commitment to democratic
| values. Not all of them will listen if I now tell the opposite
| story, but most will. I hope Apple feels the effects of this
| decision at one of its upcoming shareholder meetings.
|
| Of course, I don't believe this helps against child abuse or any
| crime at all; in fact, I believe the opposite will happen:
| criminals probably learn about moves like this one far earlier
| than the general public and react accordingly.
| starkd wrote:
| This is why I refuse to get into the habit of using a
| smartphone. I'm totally against child porn, but while doing
| nothing to fight it, this will provide a foothold to eventually
| scan for who knows what. Criminals will easily evade it by
| turning off the scanning.
| Tepix wrote:
| What are the best options for a Linux laptop these days?
|
| I switched back from a Dell XPS 13 9350 running Ubuntu to a
| Macbook Air M1 quite recently.
| juniperplant wrote:
| Beside the Dell XPS Developer Edition (that you already
| mentioned) I would consider the following:
|
| - Lenovo laptops with Fedora preinstalled[1]
|
| - Clevo and its HW customers: System76 and Tuxedo being the
| most notable ones (I think)
|
| - Purism Librem 14
|
| - Framework modular laptop[2]
|
| [1] https://fedoramagazine.org/lenovo-fedora-now-available/
|
| [2] https://frame.work/
| zionic wrote:
| Oh how the mighty have fallen.
|
| Now I can't help but wonder if this was all for show.
| Causality1 wrote:
| Of course it was. There's no such thing as a billion dollar
| company that isn't functionally psychopathic, let alone a
| trillion dollar company. Never forget these devices are made in
| factories with suicide nets.
| ChrisArchitect wrote:
| bunch of discussion from when this was news:
|
| https://news.ycombinator.com/item?id=11116274
| [deleted]
| frostyiguana wrote:
| I don't think it's fair to hold anyone to thoughts or ideas they
| once had. One can hold an opinion, evolve, and present a
| different opinion after a while; that's why we live and seek out
| new experiences.
|
| I don't agree with today's Apple shift on encryption and
| disregard for privacy, but we should also make sure not to hide
| the huge problems that global interconnected networks currently
| pose for vulnerable people, their lives, and the lives of those
| around them.
|
| Being on guard against threats to privacy is important, but
| maybe we should focus on finding solutions for these problems
| too.
| idunnoman wrote:
| It is remarkable that Tim so clearly understood the problem in
| 2016.
|
| With that one post, Apple and Tim earned trust from a group of
| people that trust very few. And in an instant, both Apple and
| Tim have now burned all of it.
| mfer wrote:
| This whole thing is complicated. Trying to do right for people
| on privacy while also doing right by endangered children. I
| like what Alex Stamos had to say about it...
| https://mobile.twitter.com/alexstamos/status/142405454287900...
| starkd wrote:
| There seems to be an ethos in tech that says if you can do
| something to prevent this evil in society you are obliged to
| do it. No consideration for underlying principles, and no
| distinction between what are values and what are principles.
| You compromise values all the time. Principles are steadfast.
| idunnoman wrote:
| I dunno man.
|
| I _think_ I can see the issue pretty clearly here.
|
| - Real harm is enabled with encryption. I get it.
|
| - Back doors break encryption for everyone and don't stop
| encryption for bad guys.
|
| Am I missing something?
| shmde wrote:
| In a matter of time, this will only be available on the Wayback
| Machine.
| literallyaduck wrote:
| The Wayback Machine will be perceived as a threat to the
| Ministry of Truth.
| snowwrestler wrote:
| What is the functional difference between the government
| demanding Apple add code to break device encryption, and the
| government demanding Apple add signatures that extend their on-
| device scanning beyond its intended scope of CSAM?
|
| Apple seems to get a lot of credit for opposing the former, but
| gets mocked when they say they would oppose the latter. But as
| far as I can tell, the legal argument is exactly the same for
| both situations: can the government compel Apple to add
| functionality that they do not want to add?
|
| Apple's plans seem creepy to me, but I have been less than
| impressed with the specificity of arguments against it. Most seem
| to stop at "what if the government forces them to expand it"
| without addressing exactly how, under current federal law, the
| government would do that.
|
| For example, see this Twitter thread arguing that it would be
| very difficult for the feds to do that:
|
| https://twitter.com/pwnallthethings/status/14248736290037022...
| mfer wrote:
| > Apple's plans seem creepy to me, but I have been less than
| impressed with the specificity of arguments against it. Most
| seem to stop at "what if the government forces them to expand
| it" without addressing exactly how, under current federal law,
| the government would do that.
|
| It's not "the government". There are many governments around
| the world. What happens when China, Russia, or another country
| legislates using this technology for some other purpose. Those
| are big markets. Will Apple back out of them or give in?
| mikenew wrote:
| > Will Apple back out of them or give in?
|
| They will give in, at least in China. They currently host all
| of their iCloud content in China on Chinese servers (and turn
| over encryption keys), they have banned all VPN apps from the
| Chinese app store, and they removed the Hong Kong protest app
| at the behest of the CCP. They will do whatever China tells
| them to, because, at least from their perspective, they have
| to. All their manufacturing is in China.
|
| I can't even imagine an outcome where Apple _doesn't_ start
| looking for pictures of tank man or anti-government images on
| Chinese citizens' phones. The Chinese government will hand
| them a list of hashes and say "these photos are illegal here,
| tell us whenever you find one". Maybe Apple will hold the
| line of "only photos uploaded to iCloud", but even then they
| just built the capability to scan everything on someone's
| phone, and the iCloud part is simply a switch that we have to
| hope they don't flip.
|
| I'm trying not to be too hopelessly negative here but I can't
| believe Apple decided that encrypting iCloud backups is worth
| trading for a file scanner on your phone. What the fuck.
| slg wrote:
| >What is the functional difference between the government
| demanding Apple add code to break device encryption, and the
| government demanding Apple add signatures that extend their on-
| device scanning beyond its intended scope of CSAM?
|
| Is this meant as a rhetorical question? Because they are pretty
| different from both a technical and policy perspective.
|
| Breaking encryption means the government can have access to
| everything without restriction. It also means there is a
| backdoor for others to discover.
|
| This approach of matching signatures means that the government
| needs to have specific content it is looking to match. The
| government asks "does the device have this specific file" and
| Apple returns a yes or no. They can't do broad searches for
| unknown content. Apple also remains as the gatekeeper between
| its users and the government when it comes to extending the
| scanning.
|
| We can still be against the latter while acknowledging that
| this isn't as scary a scenario as the former and therefore it
| isn't purely a legal question of which approach Apple would be
| more likely to accept.
| snowwrestler wrote:
| The case this 2016 letter is about was not a request to break
| phone encryption in general. The government asked Apple to
| assist only in getting into a few specific phones for a
| specific reason, under the authority of a valid warrant. And
| many folks thought Apple had a strong legal case to say no.
|
| Apple can't search phones under the technology they
| announced, so the government can't ask Apple for information
| about what is on people's phones.
|
| The government could only ask Apple to _add_ hashes to an
| operating system that Apple runs. Structurally, this is the
| same as asking them to add functionality, which is what they
| objected to in 2016.
|
| There is also a scope issue; if every iPhone has the same
| hash list, then the government is essentially fishing in
| everyone's phone for a file. This is typically illegal. The
| government has to be specific about why they think a certain
| person/people have a piece of data before they can get a
| warrant to go get it.
|
| Remember that (as the Twitter thread reminds us) the entire
| CSAM scanning effort is voluntary. The government is not
| forcing Apple to scan for CSAM, so how would they force Apple
| to scan for anything else?
| slg wrote:
| >Remember that (as the Twitter thread reminds us) the
| entire CSAM scanning effort is voluntary. The government is
| not forcing Apple to scan for CSAM, so how would they force
| Apple to scan for anything else?
|
| Which is the point that you seem to largely be ignoring.
| Apple has its own motivations here and it isn't purely a
| question of what the government is forcing them to do.
| Apple knows that once encryption is broken, it is broken
| for everything. This new proposal is much more targeted and
| gives Apple control while also preserving their ability to
| say no on technical grounds for further privacy invasions.
| That is why they would prefer it over the previous
| government proposal.
| zionic wrote:
| The list Apple uses is the property of the secret police,
| which is owned by the government. The government can change
| the database at a whim and push new targeting data to your
| phone.
| idunnoman wrote:
| I mean this genuinely. Didn't we learn with Snowden why this
| argument isn't valid?
|
| The government _does_ break these laws to get what they want
| _AND_ they silence the people that they force to break the
| laws.
|
| Why are we pretending that anything has changed?
| snowwrestler wrote:
| If our starting belief is "the government will secretly force
| Apple to do things no matter what the law says," then why do
| we care what Apple says or announces, at all?
|
| Why get mad at Apple if we have already conceded that they
| are powerless before the government in general?
| idunnoman wrote:
| This is a good point.
|
| I do believe we should be skeptical of these companies
| stated positions unless we can see a profit motive. The
| previous stance that Apple _said_ they had was "we value
| your privacy and you should pay us for that".
|
| They also demonstrated in the case in 2016 with terrorists
| and the FBI that they meant it.
|
| In this case, they have flipped entirely, and are now
| adding features without being compelled that subvert that
| stated goal.
|
| Apple will scan your phone/data without a warrant AND
| report to the government now. This is their public opinion
| now. Forget their compelled and forced actions. Now they
| _are proud to be the bad guys_.
| shuckles wrote:
| So what's the profit motive?
| idunnoman wrote:
| I don't see it. Clearly our motivations are misaligned.
| I'm not confident you can sell me on the idea that this
| will get people to trust them more, and therefore buy
| more apple stuff.
| mfer wrote:
| Our two mainstream options for mobile OS are one that stalks you
| everywhere you go and monitors what you do for the company behind
| it and one that will look on device at your pictures to possibly
| report to the government (with other governments likely licking
| their lips). This two party system is starting to stink.
|
| > "However mobile OS's may now and then answer popular ends, they
| are likely in the course of time and things, to become potent
| engines, by which cunning, ambitious, and unprincipled men will
| be enabled to subvert the power of the people and to usurp for
| themselves the reins of government, destroying afterwards the
| very engines which have lifted them to unjust dominion." --
| George Washington
| post_break wrote:
| It's very similar to when your ISP choices are Comcast and AT&T
| or insert other horrible ISP. If they don't do an about-face on
| this I'm pretty much done with iPhone forever. Yes Google loves
| my location/search data, but what are my options? It's so
| frustrating.
| GeekyBear wrote:
| Google has been scanning everything in your account for
| kiddie porn for the past decade.
|
| >a man [was] arrested on child pornography charges, after
| Google tipped off authorities about illegal images found in
| the Houston suspect's Gmail account
|
| https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-
| le...
|
| In the case of a false positive, that information lives on
| Google's server where it can be subpoenaed and misused to
| incriminate you.
|
| We've seen it before with location data.
|
| >Innocent man, 23, sues Arizona police for $1.5million after
| being arrested for murder and jailed for six days when
| Google's GPS tracker wrongly placed him at the scene of the
| 2018 crime
|
| https://www.dailymail.co.uk/news/article-7897319/Police-
| arre...
|
| With Apple's system, a single false positive would never even
| leave the device. Multiple images have to be found to match
| known kiddie porn images before a human review is triggered.
| post_break wrote:
| That's all great and all, but Google hasn't been touting
| privacy features as a selling point. With Apple's decision
| to move forward with this, I'm no longer beholden to any
| company for my phone. At least with Android I can side-load
| and do what I want.
| GeekyBear wrote:
| Never having data about a single false positive leave your
| device is MUCH more private than having that data exist on
| Google's servers.
|
| Also, I have huge doubts that Google is reviewing the
| data to be sure it isn't a false positive before handing
| it over to the authorities.
|
| They very famously refuse to hire expensive human beings
| when flawed machine learning algorithms are cheaper.
| post_break wrote:
| I can't seem to reply to your other comment. Here is where
| they talk about expanding it to third party apps:
|
| "Apple said that while it does not have anything to share
| today in terms of an announcement, expanding the child
| safety features to third parties so that users are even
| more broadly protected would be a desirable goal."
|
| https://www.macrumors.com/2021/08/09/apple-child-safety-
| feat...
|
| Do you really think they would go to all this trouble and
| then say oh you don't want to get scanned? Just turn off
| iCloud photos.
| GeekyBear wrote:
| >Do you really think they would go to all this trouble
| and then say oh you don't want to get scanned? Just turn
| off iCloud photos.
|
| They have literally said exactly that.
|
| >Q: So if iCloud Photos is disabled, the system does not
| work, which is the public language in the FAQ. I just
| wanted to ask specifically, when you disable iCloud
| Photos, does this system continue to create hashes of
| your photos on device, or is it completely inactive at
| that point?
|
| A: If users are not using iCloud Photos, NeuralHash will
| not run
|
| https://techcrunch.com/2021/08/10/interview-apples-head-
| of-p...
| post_break wrote:
| If you fundamentally believe that Apple would go to all
| this trouble, create hashes, machine learning, all the
| code required, all the work with the third party, just to
| tell the world including those they hope to catch they
| can disable it by simply turning off iCloud photos, then
| there really isn't much left to discuss.
| GeekyBear wrote:
| If you choose to believe conspiracy theories instead of
| the truth, there really isn't much to discuss.
|
| Apple has merely developed a way to scan the contents of
| their cloud in a way that keeps the data about false
| positives off their servers until they are reasonably
| sure there is an issue. (Multiple images must match known
| examples of kiddie porn before a human review is
| triggered)
|
| Scanning on the server itself is way less private.
| post_break wrote:
| I never used iCloud Photos. I never hosted my photos on
| Google Photos. But when Apple talks about using their
| CSAM technology for 3rd party apps, even if I'm sending
| my photos to my private Synology? Well, that's when I see
| the writing on the wall and head out. There is no perfect
| candidate, but selling me on privacy and then doing what they
| are doing now is the line in the sand for me.
|
| https://www.macrumors.com/2021/08/09/apple-child-safety-
| feat...
| GeekyBear wrote:
| >when Apple talks about using their CSAM technology for
| 3rd party apps
|
| Citation needed.
|
| Apple has made no such claim and has made it clear that
| if you turn iCloud photos off, nothing is scanned.
|
| >If users are not using iCloud Photos, NeuralHash will
| not run
|
| https://techcrunch.com/2021/08/10/interview-apples-head-
| of-p...
| mfer wrote:
| PinePhone? https://www.pine64.org/pinephone/
|
| I'm frustrated, too.
| executive wrote:
| Too slow, and too small a battery, for use as a primary
| device. Cool toy at this point.
|
| https://youtu.be/fCKMxzz9cjs
___________________________________________________________________
(page generated 2021-08-10 23:02 UTC)