[HN Gopher] Princeton Researchers Who Built a CSAM Scanning Syst...
___________________________________________________________________
Princeton Researchers Who Built a CSAM Scanning System Urge Apple
to Not Use It
Author : nathan_phoenix
Score : 273 points
Date : 2021-08-20 13:59 UTC (9 hours ago)
(HTM) web link (www.macrumors.com)
(TXT) w3m dump (www.macrumors.com)
| ummonk wrote:
| Nothing about those concerns seems specific to the end-to-end
| encryption compatible CSAM system they or Apple built...
|
| Honestly, if I were Apple I'd consider just scrapping the whole
| thing and doing server-side CSAM scanning on iCloud photos,
| without rolling out E2E encryption for iCloud photos. It's just
| not worth the PR blowback.
| kenjackson wrote:
| I agree. Apple already, AFAIK, requires iCloud uploading to be
| turned on in order for them to do scanning on the device. The
| delta between on-device scanning (with iCloud already enabled)
| and iCloud scanning doesn't seem to be worth the headache.
| rgovostes wrote:
| Which is the perplexing thing here. The frenzy to paint this as
| anti-privacy given outlandish scenarios could very well end up
| killing a very pro-privacy move towards stronger encryption for
| everyone.
|
| It would be _vastly_ easier for them to do this on the server:
| never expend the engineering effort to implement device-side
| scanning with all these cryptographic safeguards, abandon E2EE
| for photos altogether, and skip the lobbying effort to protect
| E2EE against legislation.
|
| _Today_ governments could demand Apple scan through every
| photo in iCloud. iOS _today_ does NN-based classification of
| your photo library and Apple could in theory be compelled to
| modify this to report on certain kinds of content.
|
| (You can argue: The best for privacy is E2EE without CSAM
| scanning. This puts them on much less secure footing to defend
| against anti-E2EE legislation, so it would be risky.)
| naasking wrote:
| > The frenzy to paint this as anti-privacy given outlandish
| scenarios could very well end up killing a very pro-privacy
| move towards stronger encryption for everyone
|
| I'm not sure a device you own silently scanning your files at
| someone else's behest can really be counted as a privacy win.
| You have no idea what it's actually doing, or how this
| process could be subverted for criminal or political
| purposes.
| n8cpdx wrote:
| That might be slightly valid if Apple had any plans to do e2e
| encryption. The fact that they haven't said a word on that
| front would seem to indicate that they have no such plans,
| have never had such plans, and are not willing to make such
| plans.
|
| Another alternative would be to scan for CSAM and simply
| choose not to upload it and show a warning instead. I don't
| get why folks are overlooking the absolutely critical _and
| then it snitches to Apple and/or the government_ part.
|
| P.S. it is weird to say Apple is "abandoning" an e2e
| encryption feature it doesn't have, has never announced, and
| has made no mention of at any point, despite clear public
| pressure.
| azinman2 wrote:
| My understanding of the law is that if you believe you have
| CP, you must report it.
| rgovostes wrote:
| Apple has not announced anything about E2EE, you are right,
| that is speculative. I think this intention is telegraphed
| by a few recent actions they've taken, from strongest to
| weakest evidence:
|
| 1. The fact that they built device-side scanning in the
| first place when they could have done it quietly on the
| server. They invested significant engineering time into the
| advanced cryptographic algorithms used, such as the private
| set intersection and threshold secret sharing. They had
| their work vetted, in public, by well-known cryptographers
| from academia. This is all leading-edge cryptographic
| engineering, most of which I would bet is being deployed in
| a consumer product for the first time ever.
|
| It only makes sense that they would do this if they intend
| to cut off the capability for this type of scanning on the
| server altogether. (A toy sketch of the threshold-sharing
| idea follows at the end of this comment.)
|
| 2. They recently added more account recovery options, a
| necessary step towards helping the average user avoid
| catastrophic loss of E2EE data if they need to reset their
| password.
|
| 3. They announced this feature well outside their regular
| cycle of product announcements (June, September). This is
| unusual
| for Apple and could mean they plan to announce something
| else like E2EE in September. In their public statements
| they have indicated that this is one component of future
| unspecified plans.
|
| iMessage, FaceTime, iCloud Keychain, etc. are all E2EE. It
| is not completely unreasonable to think that they would
| endeavor to encrypt more content. If there were political
| hurdles to doing so in the past (as reported), these might
| be mitigated by the new features.
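|
| A toy sketch of the threshold secret sharing piece, assuming
| plain Shamir sharing over a prime field; Apple's actual
| construction is more involved and the parameters here are
| purely illustrative:

      # Shamir k-of-n sharing: a secret becomes recoverable only
      # once a threshold number of shares ("matches") is collected.
      import random

      P = 2**127 - 1  # prime field modulus, big enough for a demo

      def make_shares(secret, threshold, n):
          # Random polynomial of degree threshold-1, constant term = secret
          coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
          return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                  for x in range(1, n + 1)]

      def reconstruct(shares):
          # Lagrange interpolation at x = 0 recovers the constant term
          secret = 0
          for xj, yj in shares:
              num = den = 1
              for xm, _ in shares:
                  if xm != xj:
                      num = num * (-xm) % P
                      den = den * (xj - xm) % P
              secret = (secret + yj * num * pow(den, -1, P)) % P
          return secret

      shares = make_shares(secret=42, threshold=3, n=10)
      assert reconstruct(shares[:3]) == 42  # any 3 shares suffice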
| nybble41 wrote:
| End-to-end encryption only means something if you and whoever
| you intend to communicate with control the endpoints. With an
| iOS device, you don't: Apple controls the endpoint. You may
| have paid for the device, but Apple has exclusive control of
| the software. You don't have the ability to replace it with
| third-party software or even to audit the source code for the
| software Apple supplies and verify that it matches the
| binaries. Consequently, you have to assume that anything you
| put on the device has already been shared with Apple, simply
| because Apple's software has access to it. This move just
| makes their control over the operation of "your" device more
| explicit.
|
| The best for privacy would be E2EE where the backup software
| never even has _access_ to the unencrypted data. In other
| words, the backup image(s) are created locally by auditable
| open-source (or at least source-available) software,
| encrypted and signed using a key or password known only to
| the device owner, and provided to the backup service as
| opaque blob(s). The backup service never has access to the
| raw data and is incapable of accessing or tampering with the
| content of the backup. Untrusted services would be restricted
| to have access to _either_ plaintext data _or_ the network--
| never both. Any network application wanting access to local
| photos, e.g. for sharing, would need to go through a
| privileged photo-choosing interface and would only be granted
| access to the selected photo(s). Even this has some concerns
| since the local photo software could embed information into
| unrelated photos to exfiltrate data to the network when those
| photos are shared, but you could at least compartmentalize
| the data to limit the potential for cross-contamination.
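|
| A minimal sketch of the opaque-blob idea, assuming a
| password-derived key and the Python 'cryptography' package;
| the names are illustrative, not any real backup tool's API,
| and AES-GCM's built-in authentication stands in for a separate
| signing step:

      # The service stores only the returned blob; it never holds
      # the key and cannot read or undetectably modify the backup.
      import os, hashlib
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      def encrypt_backup(plaintext: bytes, password: bytes) -> bytes:
          salt = os.urandom(16)
          key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                               dklen=32)  # key known only to the owner
          nonce = os.urandom(12)
          ct = AESGCM(key).encrypt(nonce, plaintext, None)  # authenticated
          return salt + nonce + ct  # opaque blob, safe to hand to the service

      def decrypt_backup(blob: bytes, password: bytes) -> bytes:
          salt, nonce, ct = blob[:16], blob[16:28], blob[28:]
          key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                               dklen=32)
          return AESGCM(key).decrypt(nonce, ct, None)  # raises if tampered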
|
| Or, you know, just use a fully open device and open source
| software in the first place so you're not playing silly
| adversarial games with a device you rely on for everything.
| ummonk wrote:
| > Or, you know, just use a fully open device and open
| source software in the first place so you're not playing
| silly adversarial games with a device you rely on for
| everything.
|
| Are you using open source hardware? And then how are you
| auditing the supply chain to ensure the hardware you're
| receiving matches the open sourced design?
| zepto wrote:
| Here's their paper:
| https://www.usenix.org/system/files/sec21-kulshrestha.pdf
|
| Their system is vulnerable in all the ways they claim.
|
| However Apple's system is not the same and does contain
| mitigations.
|
| > Apple's muted response about possible misuse is especially
| puzzling because it's a high-profile flip-flop.
|
| This is a dishonest statement. Apple has not been muted about the
| concerns these researchers are presenting.
|
| They address them here: https://www.apple.com/child-
| safety/pdf/Security_Threat_Model...
|
| There is nothing in this piece that relates to Apple's actual
| technology.
|
| These researchers obviously have the background to review what
| Apple has said and identify flaws, but they have not done so
| here.
| arpstick wrote:
| you know, one surefire way to signal to apple that deployment of
| such technology is detrimental to their bottom line is to just
| not buy apple products anymore.
| HumblyTossed wrote:
| Another way is to publicly protest. If enough people are loud
| enough, this goes away.
| flatiron wrote:
| totally against this CSAM thing, but totally not going to
| publicly protest when someone could EASILY twist this to "they
| wanted the right to keep child porn on their phones without
| apple knowing about it"
| loteck wrote:
| Have an upvote. flatiron is describing a perpetual and
| insidious issue with activism: many people believe that
| speaking out on a controversial issue will mean they will
| be tied to the worst extremes that could follow from the
| opinions they express. This fear silences them.
|
| This problem is especially prominent in the topic of
| privacy, which has been made to carry a whiff of wrongdoing
| ("nothing to hide").
| user-the-name wrote:
| They are also describing an issue, which like pretty much
| the entirety of this Apple fiasco, is _completely
| fictional_.
|
| Most of this outrage is about situations people are
| imagining, situations which often have only the most
| tenuous of relationships with the actual reality of what
| is happening.
| idunnoman wrote:
| The easiest example I've heard is kinda gross and rude,
| but effective.
|
| "You take a crap with the door closed, not because I
| don't know what you're doing in there, but because you
| don't want to share the experience with me."
|
| That pretty much sums up everything you need to know
| about privacy. You don't have to hide _anything_ in order
| for it to be important.
| nybble41 wrote:
| Not a bad example, but some would argue that it wouldn't
| really make any difference if such things were not
| private--that it's more a matter of habit than any actual
| advantage. In nature you would want to know that you're
| alone during that time because it renders you relatively
| defenseless, physically, but that's not _quite_ as much
| of an issue in modern society. As such, I prefer to point
| to situations where privacy remains a practical matter of
| self-defense: you don't share detailed financial data or
| your most intimate emotions with the world because there
| are people who can use that information to manipulate you
| or otherwise take advantage, e.g. through social
| engineering. You're not doing anything _wrong_ --I'm not
| talking about potential blackmail material here--but
| knowing how much you earn and where you shop and what you
| buy and how you feel and what topics are likely to
| provoke an emotional response from you can give someone a
| great deal of leverage over you, often without you even
| realizing that you're being manipulated. Advertising is
| one obvious example of this, but not the only one. Being
| too open about your private life makes you vulnerable.
| nonbirithm wrote:
| As someone said earlier, it punts the argument to how
| many flaws the scanning process has, instead of the fact
| that there's even a process to begin with.
| n8cpdx wrote:
| Are there any privacy rights you would be willing to stand
| up for?
| [deleted]
| shadilay wrote:
| There should be a prize for the person who comes up with the
| name for the CSAM version of swatting.
| palijer wrote:
| Kidmailing? Based on blackmail... But it's not really
| blackmailing...
| xanaxagoras wrote:
| Given the C and the A and the proximity of swatting to
| spanking, I propose switching, as in beating a child with a
| switch which is pretty well recognized as abuse these days.
| [deleted]
| 1cvmask wrote:
| I am reposting a comment on the original article, which
| appeared in the Washington Post, because of the slippery-slope
| dangers (https://en.wikipedia.org/wiki/Slippery_slope):
|
| In a previous comment on this very same subject of Apple's
| attempt to flag CSAM I wrote: this invasive capability at the
| device level is a massive intrusion on everyone's privacy, and
| there will be no limits on governments expanding its reach once
| it is implemented. The scope will always broaden.
|
| In the article they correctly point out how the scope of
| scanning by governments around the world is already broad, and
| already a violation of privacy: content matching of political
| speech and other forms of censorship and government tracking.
| We already have that now on the big tech platforms like
| Twitter, which censor or shadow-ban content that they, as the
| arbiters of truth (or truthiness, as Colbert used to say on the
| old Colbert Report), egged on by the politicians and big
| corporate media, label as misinformation or disinformation. Do
| we now need to be prevented from communicating our thoughts and
| punished for spreading truths or non-truths, especially given
| the false positives, malware injections, and remote device
| takeovers and hijackings by the Orwellian Big Tech oligopolies?
| Power corrupts absolutely, and this is too much power in the
| hands of big corporations and governments.
|
| From the article, in case you need the lowdown:
|
| > Our system could be easily repurposed for surveillance and
| censorship. The design wasn't restricted to a specific category
| of content; a service could simply swap in any content-matching
| database, and the person using that service would be none the
| wiser. A foreign government could, for example, compel a
| service to out people sharing disfavored political speech.
| That's no hypothetical: WeChat, the popular Chinese messaging
| app, already uses content matching to identify dissident
| material. India enacted rules this year that could require
| pre-screening content critical of government policy. Russia
| recently fined Google, Facebook and Twitter for not removing
| pro-democracy protest materials.
|
| > We spotted other shortcomings. The content-matching process
| could have false positives, and malicious users could game the
| system to subject innocent users to scrutiny. We were so
| disturbed that we took a step we hadn't seen before in computer
| science literature: We warned against our own system design,
| urging further research on how to mitigate the serious
| downsides.
|
| > We'd planned to discuss paths forward at an academic
| conference this month. That dialogue never happened. The week
| before our presentation, Apple announced it would deploy its
| nearly identical system on iCloud Photos, which exists on more
| than 1.5 billion devices. Apple's motivation, like ours, was to
| protect children. And its system was technically more efficient
| and capable than ours. But we were baffled to see that Apple
| had few answers for the hard questions we'd surfaced.
|
| > China is Apple's second-largest market, with probably
| hundreds of millions of devices. What stops the Chinese
| government from demanding Apple scan those devices for
| pro-democracy materials? Absolutely nothing, except Apple's
| solemn promise. This is the same Apple that blocked Chinese
| citizens from apps that allow access to censored material, that
| acceded to China's demand to store user data in state-owned
| data centers and whose chief executive infamously declared, "We
| follow the law wherever we do business."
|
| > Apple's muted response about possible misuse is especially
| puzzling because it's a high-profile flip-flop. After the 2015
| terrorist attack in San Bernardino, Calif., the Justice
| Department tried to compel Apple to facilitate access to a
| perpetrator's encrypted iPhone. Apple refused, swearing in
| court filings that if it were to build such a capability once,
| all bets were off about how that capability might be used in
| future.
| shadilay wrote:
| Pilcrow?
| lamontcg wrote:
| I'm honestly more concerned about the immediate effects today
| in the USA, and the slippery-slope effect here, than about
| what China can do with it.
|
| Today it is "think of the children", tomorrow it is "prevent
| terrorist attacks", and eventually it is "citizen, why do you
| have union organizing material on your iCloud uploads?"
|
| And we've seen the big tech giants steadily morphing into
| quasi-governmental agencies as we've been steadily giving up
| privacy rights over the decades. I don't think it's a huge
| leap from where we are now to having Apple/Google/Amazon
| devices that are monitoring what we say AND reporting on
| suspicious activity.
| Particularly when combined with those companies wanting to own
| your private time and link all your accounts with them.
| the_snooze wrote:
| Here's the link to those researchers' words directly:
| https://www.washingtonpost.com/opinions/2021/08/19/apple-csa...
| zepto wrote:
| That link is behind a paywall.
| skygazer wrote:
| It's a soft paywall, protected only by cookies and
| JavaScript, bypass-able by using private browsing/incognito
| mode. Most news sites are -- they give you your first N
| articles free and try to upsell, with the assumption that few
| will know to or bother to reset cookies, disable JavaScript
| or browse privately/incognito. (Surprisingly, the full text
| of the article is usually in the initial http response and
| only hidden by JavaScript running in your browser after the
| fact, based on the presence of a cookie from a prior visit -
| such a fragile and trusting system, but worth the upsell
| trade off to them.)
|
| You can also go to https://archive.is and paste in the
| article url. They generate nice shareable url like this one
| for this article: https://archive.is/y58Py
|
| Ironically, the Google AMP version of the article will have
| no paywall at all, and is usually accessible with a slightly
| modified article url, discoverable via Google search from
| mobile. In the case of the Washington Post simply appending
| outputType=amp does the trick.
| https://www.washingtonpost.com/opinions/2021/08/19/apple-
| csa...
|
| Reader modes, like Apple's, frequently, inadvertently, bypass
| soft paywalls, particularly when set to activate
| automatically, because they can see the hidden content on the
| page before it's hidden.
|
| There are of course browser extensions that automate all of
| this.
|
| Few news sites use hard paywalls that prevent this. Whether
| you're comfortable using private browsing or Google amp and
| not paying for the article is up to you.
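|
| A quick way to test that claim yourself, assuming only
| Python's standard library; the probe string is a phrase from
| the op-ed, and whether it prints True depends on the paywall's
| behavior at fetch time:

      # Fetch the page with no cookies and look for article body
      # text in the raw HTML, before any client-side hiding runs.
      import urllib.request

      URL = ("https://www.washingtonpost.com/opinions/2021/08/19/"
             "apple-csam-abuse-encryption-security-privacy-dangerous/")
      req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
      html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
      print("body text present:", "repurposed for surveillance" in html)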
| zepto wrote:
| > It's a soft paywall, protected only by cookies and
| JavaScript, bypass-able by using private browsing/incognito
| mode.
|
| Not true. I tried that.
|
| > They generate nice shareable url like this one for this
| article: https://archive.is/y58Py
|
| This is helpful.
| skygazer wrote:
| Well, I can't know if you're invoking it properly, but
| it's true of modern browsers. That's the whole purpose of
| private/incognito mode, to not share your previous
| cookies with the visited site, so if it doesn't work for
| you and you're entering private/incognito before visiting
| the article url, then there's something amiss with your
| configuration and you're not browsing privately at all.
| Or, perhaps you never leave private browsing mode, and so
| you keep your private mode cookies alive indefinitely,
| defeating the whole purpose of private mode --
| ironically, it works best if you don't leave it on
| continuously. There's a lot of subtlety in technology. :)
|
| The full article is indeed returned upon your request and
| hidden by your browser after the fact. You can prove this
| to yourself by dropping to a unix-like command line and typing:

      curl -L 'https://www.washingtonpost.com/opinions/2021/08/19/apple-csam-abuse-encryption-security-privacy-dangerous'
|
| and you'll see, if you look carefully, the full text of
| the article embedded within that extremely verbose
| result, just as the browser sees it before it's hidden by
| javascript after inspecting your cookies. I'm not sure
| you wanted to know all of this, but it feels like wanting
| to understand how things work is why hacker news exists.
| zepto wrote:
| > Well, I can't know if you're invoking it properly, but
| it's true of modern browsers.
|
| No it isn't.
|
| > There's a lot of subtlety in technology. :)
|
| Agreed. Perhaps you are missing something.
|
| > The full article is indeed returned upon your request
| and hidden by your browser after the fact. You can prove
| this to yourself ...
|
| This is irrelevant to whether private browsing solves the
| problem or not.
| skygazer wrote:
| You are a delight. Thanks for brightening my day.
| [deleted]
| zepto wrote:
| Without access to a technical description of what they built, we
| have no idea whether it is relevant.
| HumblyTossed wrote:
| Right, so we should all just sit down, shut up and wait for
| Apple's implementation to be used nefariously.
| zepto wrote:
| > so we should all just sit down, shut up and wait for
| Apple's implementation to be used nefariously.
|
| That seems like a silly thing to suggest.
|
| I would be interested in knowing what these researchers
| developed, and how it relates to what Apple has done.
|
| If their work identifies technical weaknesses that apply,
| don't you want to know what they are?
|
| Edit: I found their work, and it doesn't apply.
| atlgator wrote:
| > A foreign government could, for example, compel a service to
| out people sharing disfavored political speech. That's no
| hypothetical: WeChat, the popular Chinese messaging app, already
| uses content matching to identify dissident material. India
| enacted rules this year that could require pre-screening content
| critical of government policy. Russia recently fined Google,
| Facebook and Twitter for not removing pro-democracy protest
| materials.
|
| This sums up the concern. CSAM is just the excuse because who
| would come out against protecting children, right? But this will
| absolutely be used for political purposes.
| sixothree wrote:
| This event here marks the end of privacy as a right. It's over.
| entropicdrifter wrote:
| It's been over for a long while for the average consumer
| rowanG077 wrote:
| This is definitely false. You can relatively easily live
| with privacy. It just takes a little bit of effort. It's
| over if you don't care and take no action.
| lstamour wrote:
| Privacy can only be maintained by policies and regulation,
| not by technology. This hashing and scanning client-side,
| broken as it might be, is merely a technology. Technology to
| ensure privacy, as established by corporations, can always be
| legally circumvented by governments and what they allow or
| require by policy for their region, if said government is
| part of a large enough market that companies have to pay
| attention to it. Therefore the only solution is global
| agreement to privacy as a human right... and that seems
| unlikely right now. Runner up would be individual alternative
| apps for privacy (such as a camera app) but those can be
| banned by policy depending on the region you're in, or
| potentially subject to the same rules/policies. In a
| democracy, if you disagree, your only option is talking to
| lawmakers, those who decide on policy. But again, it would
| likely only affect your region...
| ipaddr wrote:
| Privacy is Apple's brand, as against the other companies
| competing with them. Apple has always made privacy a concern.
| api wrote:
| The more I think about it, something that concerns me greatly
| is the _silent_ use of this for political purposes.
|
| By silent I mean they don't arrest you or put you on any public
| lists. Instead they just gather information, lots of
| information, and they use it in future Cambridge Analytica
| style propaganda campaigns.
|
| I don't mean to imply that only Republicans would do this.
| After Trump's successful use of this kind of political
| propaganda every single political party and PR agency on the
| planet is working on duplicating it.
|
| Arresting people is old fashioned. Modern totalitarianism is
| built on disinformation, microtargeted propaganda,
| surveillance, and nudge theory.
| certeoun wrote:
| What makes us better than China then? Our relative freedoms?
| What can we do about it?
|
| Princeton also did a study finding that America is not
| actually a democracy. I shared it here:
|
| https://news.ycombinator.com/item?id=28249091
| southerntofu wrote:
| Nothing makes one country better than another in absolute
| terms, although some aspects of life may be more enjoyable in
| one or the other, depending on your taste.
|
| I think most of us can agree living in an abundance of
| resources (waste!) with some mild political freedom where
| you're allowed to say anything as long as you don't try to
| change anything (see: COINTELPRO, Julian Assange, etc) like
| in the Global North, is more pleasant to live on a daily
| basis than a hunger-ridden autocracy like North Korea. But
| even that is debatable and i'm sure many north koreans
| would disagree.
|
| What can we do about it? Attack the system, day by day and
| bits by bits. Steal what you can from the rich to
| redistribute to the poor. Make cooperatives, whether as
| employees or volunteers, so that more and more people can
| quit wage exploitation and start to live again, "from each
| according to their capabilities, to each according to their
| needs" (old anarchist saying). Form collectives for all
| kinds of struggles affecting you and your loved ones: anti-
| patriarchy, anti-racism, accessibility...
|
| Don't accept anyone claiming they're above you. We're all
| in this together, and those who claim to seek authority to
| find solutions will create more problems than they will
| solve, no matter their good intentions. We need to build
| "power to the people", not "power over the people", as
| anyone who has studied some history of political repression
| in the early Soviet Union (and other "dictatorships of the
| proletariat") can learn. See also Emma Goldman's "Trotsky
| protests too much" or "There is no communism in Russia" on
| this topic.
|
| Organize. Organize. Organize.
| DaftDank wrote:
| Sorry for the off-topic question, but I've been wondering
| for a while and have been too afraid to ask: what does the
| green name signify here?
| zargon wrote:
| It means they're a newly registered user. (Not sure of
| the exact threshold, around a few days to a week.)
| temp8964 wrote:
| I don't understand this logic. Why would authoritarian
| governments need to disguise political dissident material as
| child porn to get it matched? The examples you quoted directly
| undercut this argument: authoritarian governments can ban
| political dissident material outright; they don't need child
| porn as the excuse.
| ls612 wrote:
| Because the real concern isn't that; it's something like
| Apple vs FBI, where the fact that Apple hadn't built the
| capability gave them a legal edge against a government with
| stronger rule of law.
| derefr wrote:
| iMessage is E2E-encrypted; the only scanning of the content
| of the network a state adversary can do is by getting Apple
| to implement it for them, into the iMessage client.
|
| And Apple until now have had the excuse for not doing so,
| that they have no mechanism built into iMessage that would or
| _could_ scan for such things, nor (as the excuse would go) is
| such a technology likely to be either "feasible" given the
| E2E-encrypted nature of the communications, or "palatable" to
| their audience (i.e. an argument like "people buy the
| iDevices in part because of advertised privacy benefits, so
| such state-level intrusions would lead to nobody buying
| iDevices, so nobody would be caught by this scheme anyway, so
| all you're _really_ doing by asking us to implement it is
| trying to destroy our company's share price, probably to help
| your domestic phone market--and _our_ government wouldn't
| take kindly to that kind of market interference. So if you
| don't want to be slapped with even more Huawei-like trade
| sanctions, kindly go away.")
|
| The implementation of a CSAM scanner is a clear _feasibility
| proof_ for the set of technologically-equivalent capabilities
| (e.g. dissident-material scanning.) The fact that Apple
| _have_ implemented it, and that their users _have_ accepted
| it and continued to use their iDevices, means Apple has no
| leg to stand on when denying governments the implementation
| of technological enforcement for "banning political dissident
| materials."
| dwaite wrote:
| iMessage does not use the CSAM scanner, only the upload to
| iCloud Photos does.
|
| The iMessage feature is a local porn-ish detector on
| sent/received images with a click-through. It is only for
| children on family accounts, and if the child is under 13 a
| click-through is reported to the parent.
| [deleted]
| sam0x17 wrote:
| > pre-screening
|
| Pre-screening would be the kind way of implementing this
| feature. They could make it so that if CSAM is detected, they
| just don't upload those files to the cloud and delete them
| locally. Then they are still doing due diligence to prevent
| that material from entering their cloud and to remove it from
| circulation, but they don't get random people arrested for
| false positives. When they detect hash matches on a device,
| they are under zero legal obligation to do anything about it,
| because it could easily be a false positive (as the public has
| demonstrated on HN and elsewhere), so they'd be in the clear
| with this approach. This would have the effect they desire
| without pissing tons of people off.
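|
| A minimal sketch of that alternative flow; every name here is
| a hypothetical placeholder, and a cryptographic digest stands
| in for a perceptual hash just to keep the sketch runnable:

      import hashlib

      KNOWN_HASHES = {"deadbeef"}  # stand-in for the known-CSAM database

      def toy_hash(photo: bytes) -> str:
          return hashlib.sha256(photo).hexdigest()[:8]

      def handle_photo(photo: bytes) -> str:
          if toy_hash(photo) in KNOWN_HASHES:
              # Block the upload and delete locally; file no report,
              # since a match could easily be a false positive.
              return "blocked: removed locally, user warned"
          return "uploaded"

      print(handle_photo(b"holiday photo"))  # -> uploaded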
|
| Instead they have taken the aggressive stance of manual review
| + tipping off law enforcement, which goes against their entire
| mantra of protecting privacy. Your phone will now be an
| informant against you. If you are an activist or a
| whistleblower, a false positive could be enough to get your
| device searched and seized.
| certeoun wrote:
| @sam0x17, are you aware of this finding?:
|
| https://www.cambridge.org/core/journals/perspectives-on-
| poli...
|
| > Despite the seemingly strong empirical support in previous
| studies for theories of majoritarian democracy, our analyses
| suggest that majorities of the American public actually have
| little influence over the policies our government adopts.
| Americans do enjoy many features central to democratic
| governance, such as regular elections, freedom of speech and
| association, and a widespread (if still contested) franchise.
| But we believe that if policymaking is dominated by powerful
| business organizations and a small number of affluent
| Americans, then America's claims to being a democratic
| society are seriously threatened.
|
| Especially the part with "policymaking is dominated by
| powerful business organizations and a small number of
| affluent Americans" made me think that it is not only those
| elites who do this or businesses. Does this mean that the
| government (NSA) doesn't care what we want? They are allowed
| to essentially patronize us because we do not know any
| better? Can you explain to me why the NSA is spying on us
| despite it being unpopular? We didn't ask to be spied on,
| right? I don't understand politics or the role of government
| anymore. I am utterly confused. Politics is so contradictory
| I cannot wrap my head around it. Or perhaps, I am missing
| something?
| MichaelGroves wrote:
| I think the practical effect, if not intended design
| purpose, of a republic is to launder responsibility. To
| create a layer of indirection and uncertainty between the
| people who have power and the popular perception of who to
| hold accountable. The wealthy write the rules and use
| politicians as patsies. In return for their service to the
| elite, politicians are offered some privileges and a degree
| of protection from the angry mobs. The mobs are made to
| believe that the most effective way to effect change is to
| vote in new politicians, allowing the old ones to
| peacefully retire. The economic elite rest easy, knowing
| the new politicians will serve their interests just as the
| old ones did.
|
| Sometimes a renegade politician who earnestly has the
| interests of the common people gets voted into power, but
| the 'damage' such a renegade can do is regulated by term
| limits (and sometimes assassination:
| https://en.wikipedia.org/wiki/Gracchi) Other times, the
| public believe they are voting for such a renegade, but
| accidentally empower a tyrant who aims to usurp the elite
| and have true power for himself. But by and large, a
| republic regulates the system, maintaining the status quo
| to the benefit of those who already have power in the
| status quo.
| cutemonster wrote:
| > Sometimes a renegade politician who earnestly has the
| interests of the common people gets voted into power, but
| the 'damage' such a renegade can do is regulated by term
| limits
|
| Interesting to view the term limit, as a way to protect
| the wealthy from the public
| sam0x17 wrote:
| Yes I completely agree with this analysis. The only
| solution in my opinion is to elect leaders willing to
| dismantle this structure a bit, i.e. there are a number of
| senators and house members on the far left (and probably
| even on the right) that would vote for a bill that sets a
| per-entity yearly political donation cap at $500 if given
| the chance (meaning an individual or a corporation can only
| donate $500 per candidate per year). Centrists would never
| do this because they get tons of corporate money, but if we
| remove the incentives completely through legislation, we'd
| probably see a much less corrupt governmental structure at
| the end of the day.
| robertoandred wrote:
| Apple's system doesn't match content.
| ASalazarMX wrote:
| This sounds like claiming that biometric fingerprint devices
| don't match users' fingerprints, they just match vectors.
| [deleted]
| [deleted]
| ibigb wrote:
| It seems Apple devices are an opaque black box:
| support.apple.com/en-us/HT202303 Everything is encrypted
| except IMAP email storage. They have no access to anything
| except that. They can't do anything with any data except IMAP
| emails stored on their server.
|
| The governments want them to do something about CSAM, so now
| they can match a perceptual hash--not content--against known
| images from a non-governmental organization. That is about the
| least invasive CSAM prevention anybody can do. (A toy
| perceptual-hash sketch follows below.)
|
| Can someone suggest an alternative CSAM prevention which is
| less intrusive?
|
| It seems the alternative would be to continue to be an opaque
| black box, or NOT encrypt your photos and rummage through
| them.
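|
| A toy perceptual "average hash", assuming the Pillow package;
| real systems like NeuralHash are far more robust, so this only
| illustrates matching hashes against a known list rather than
| inspecting content:

      from PIL import Image

      KNOWN_HASHES = {0x8F3C2A9000FF00FF}  # stand-in hash database

      def average_hash(path: str) -> int:
          img = Image.open(path).convert("L").resize((8, 8))  # 8x8 gray
          pixels = list(img.getdata())
          avg = sum(pixels) / len(pixels)
          # One bit per pixel: brighter than the average, or not
          return sum(1 << i for i, p in enumerate(pixels) if p > avg)

      def matches_known(path: str, max_bit_diff: int = 5) -> bool:
          h = average_hash(path)
          # Small Hamming distance survives recompression and resizing
          return any(bin(h ^ k).count("1") <= max_bit_diff
                     for k in KNOWN_HASHES)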
| kevinherron wrote:
| They _can_ rummage through your photos. They are encrypted
| in transit and on the server but with a key that they have
| access to.
|
| If it's not explicitly listed in the end-to-end encryption
| section then assume Apple can access it.
| ibigb wrote:
| From https://support.apple.com/en-us/HT202303:
|
| > End-to-end encryption provides the highest level of data
| security. Your data is protected with a key derived from
| information unique to your device, combined with your device
| passcode, which only you know. No one else can access or read
| this data.
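|
| A minimal sketch of that kind of derivation, assuming PBKDF2
| plus a device-bound secret; the construction and parameters
| are illustrative guesses, not Apple's documented scheme:

      # Hypothetical: bind a stretched passcode to a device-unique
      # secret so neither alone can recover the data key.
      import hashlib

      def derive_key(device_secret: bytes, passcode: str,
                     salt: bytes) -> bytes:
          stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                          salt, 200_000)
          return hashlib.sha256(device_secret + stretched).digest()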
|
| So messages are opaque but photos are not?
| kevinherron wrote:
| Read the next line: "These features and their data are
| transmitted and stored in iCloud using end-to-end
| encryption:"
|
| Notice the absence of photos from the list that follows?
|
| > So messages are opaque but photos are not?
|
| Yes. Unless you use iCloud backup, in which case your key
| is included in the backup and technically even your
| messages could be accessed.
| atlgator wrote:
| But it can still identify the originator, which is the problem.
| vletal wrote:
| You can do stuff like listing the top K trending
| anti-government memes from the past day and blacklisting them.
| That's definitely content.
| rgovostes wrote:
| You cannot, because the models are shipped in the OS image.
| Apple would have to publish a new OS update every day new
| memes come out.
|
| If you're talking about photo library scanning, people
| would have to be saving the memes to their photo libraries,
| cloud syncing would have to be on, and they would have to
| save lots of them.
|
| If you're talking about using the iMessage content
| scanning, this only applies to under-13 accounts, and it
| only notifies parents associated with the account.
|
| If you are suggesting that Apple would be compelled to
| modify either of these to do something else, why would they
| bother with these, which are built with many constraints
| and multiple levels of failsafes, rather than just
| implement a new feature? Governments could ban end-to-end
| encrypted chat like iMessage. They could demand the next
| iOS update uploads your saved social media passwords to
| them. It's not very plausible that anyone would want to
| modify _these_ features rather than just implement
| something more direct and harder to evade.
|
| Governments could make these demands at any point. They do
| not need to wait for this feature and then spring into
| action. The Chinese government did regarding where iCloud
| data is hosted for its citizens. The US government did
| regarding the San Bernardino iPhone case, and Apple fought
| back very publicly in court that any kind of modification
| of the OS was unacceptable.
| vletal wrote:
| sed -i "s/day/month/" parent
| Jcowell wrote:
| The image-saving requirement is mitigated by apps that
| autosave received images by default, e.g. WhatsApp.
| robertoandred wrote:
| You think Apple will have people downloading an OS update
| every day?
| leoh wrote:
| You think old memes are never reused?
| veeti wrote:
| What are you saying? That regular iOS updates will never
| update the database used for matching?
| gjsman-1000 wrote:
| The reason I think Apple went forward with it though is that,
| _from their perspective,_ it 's not like they are building a
| new tool for surveillance. It doesn't take many brain cells for
| a lawmaker to realize that they could mandate pre-screening of
| content.
|
| From Apple's perspective... for authoritarian governments like
| China or India... they are already able to mandate it and are
| likely to. So they shouldn't be factored into the "CSAM
| Scanning could be abused!" argument because it was already
| happening and going to happen, whether the tool exists or not.
|
| In which case, releasing the CSAM tool has only benefits for
| the abused and doesn't make a difference in preventing
| surveillance and privacy invasions because it was already going
| to occur. A cynical view but a possibility.
___________________________________________________________________
(page generated 2021-08-20 23:02 UTC)