https://lwn.net/SubscriberLink/865756/a7f67aaa8ea52862/

Scanning "private" content

By Jake Edge
August 11, 2021

Child pornography and other types of sexual abuse of children are unquestionably heinous crimes; those who participate in them should be caught and severely punished. But some recent efforts to combat these scourges have gone a good ways down the path toward a kind of AI-driven digital panopticon that will invade the privacy of everyone in order to try to catch people who are violating laws prohibiting those activities. It is thus no surprise that privacy advocates are up in arms about an Apple plan to scan iPhone messages and an EU measure to allow companies to scan private messages, both looking for "child sexual abuse material" (CSAM). As with many things of this nature, there are concerns about the collateral damage that these efforts will cause--not to mention the slippery slope that is being created.

iPhone scanning

Apple's move to scan iPhone data has received more press. It would check for image hashes that match known CSAM material; the database of hashes will be provided by the National Center for Missing and Exploited Children (NCMEC). It will also scan photos that are sent or received in its messaging app to try to detect sexually explicit photos to or from children's phones. Both of those scans will be done on the user's phone, which will effectively break the end-to-end encryption that Apple has touted for its messaging app over the years.

Intercepted messages that seem to be of a sexual nature, or photos that include nudity, will result in a variety of interventions, such as blurring the photo or warning about the content of the message. Those warnings will also indicate that the user's parents will be informed; the feature is only targeted at phones that are designated as being for a child--at least for now.

The general photo scanning using the NCMEC hashes has a number of safeguards to try to prevent false positives; according to Apple, it "ensures less than a one in one trillion chance per year of incorrectly flagging a given account". Hash matches are reported to Apple, but encrypted as "safety vouchers" that can only be opened after some number of matching images are found:

    Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
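The threshold idea can be sketched in a few lines of Python. This is only an illustration of the control flow, not Apple's actual protocol: the real system uses private set intersection and threshold secret sharing so that neither the device nor Apple learns anything about individual matches below the threshold, and it matches perceptual hashes rather than the plain SHA-256 digests used as a stand-in here. All names and the threshold value below are made up for the example.

    # Toy illustration of threshold-based flagging -- NOT Apple's protocol.
    # The cryptography (private set intersection, threshold secret sharing)
    # is omitted; only the control flow is shown. All names are illustrative.

    import hashlib

    THRESHOLD = 30  # hypothetical number of matches before human review

    known_csam_hashes = {
        # In reality: opaque perceptual-hash values supplied by NCMEC and
        # "other child safety organizations"; SHA-256 stands in here.
        hashlib.sha256(b"example-known-image").hexdigest(),
    }

    def scan_library(image_blobs):
        """Count matches against the hash list; report only past the threshold."""
        vouchers = []
        for blob in image_blobs:
            digest = hashlib.sha256(blob).hexdigest()
            if digest in known_csam_hashes:
                vouchers.append(digest)  # stand-in for an encrypted "safety voucher"
        if len(vouchers) >= THRESHOLD:
            return {"flag_for_human_review": vouchers}
        return {"flag_for_human_review": None}  # below threshold: nothing revealed

The design question raised by critics is not the control flow but the list itself: nothing in this structure constrains what kinds of hashes the list may contain.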
The Siri voice assistant and the iPhone Search feature are also being updated to check for CSAM-related queries, routing requests for help reporting abuse to the appropriate resources, while blocking CSAM-oriented searches:

    Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

The Electronic Frontier Foundation (EFF) is, unsurprisingly, disappointed with Apple's plan:

    We've said it before, and we'll say it again now: it's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

    All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.

The EFF post goes on to point to recent laws passed in some countries that could use the Apple backdoor to screen for other types of content (e.g. homosexual, satirical, or protest content). Apple could be coerced or forced into extending the CSAM scanning well beyond that fairly limited scope. In fact, this kind of expansion has already happened to a certain extent:

    We've already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of "terrorist" content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it's therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as "terrorism," including documentation of violence and repression, counterspeech, art, and satire.

For its part, Apple has released a FAQ that says it will refuse any demands by governments to expand the photo scanning beyond CSAM material. There is, of course, no technical way to ensure that does not happen. Apple has bowed to government pressure in the past, making some leery of the company's assurances. As Nadim Kobeissi put it:

    Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure. What happens when local regulation mandates that messages be scanned for homosexuality?

It is interesting to note that only a few years ago, Apple itself was making arguments against backdoors with many of the same points that the EFF and many other organizations and individuals have made. As Jonathan Mayer pointed out: "Just 5 years ago, Apple swore in court filings that if it built a capability to access encrypted data, that capability would be used far beyond its original context."

EU scanning

Meanwhile in the EU, where data privacy is supposed to reign supreme, the "ePrivacy derogation" is potentially even more problematic.
It allows communication-service providers to "monitor interpersonal communications on a voluntary basis with the purpose of detecting and reporting material depicting sexual abuse of minors or attempts to groom children". It is, of course, not a huge leap from "voluntary" to "mandatory". As might be guessed, the scanning will not be done directly by humans--problematic in its own right--but by computers:

    The scanning of private conversations will be conducted through automated content recognition tools, powered by artificial intelligence, but under human oversight. Service providers will also be able to use anti-grooming technologies, following consultation with data protection authorities.

The EU General Data Protection Regulation (GDPR) is a sweeping framework for protecting personal data, but since the start of 2021 it no longer covers messaging services. That kind of communication falls under the ePrivacy directive instead, thus the change allowing scanning is a derogation to it. Patrick Breyer, a member of the European Parliament, has criticized the derogation on a number of grounds. He lists a number of different problems with it, including:

    + All of your chat conversations and emails will be automatically searched for suspicious content. Nothing remains confidential or secret. There is no requirement of a court order or an initial suspicion for searching your messages. It occurs always and automatically.

    + If an algorithm classifies the content of a message as suspicious, your private or intimate photos may be viewed by staff and contractors of international corporations and police authorities. Also your private nude photos may be looked at by people not known to you, in whose hands your photos are not safe. [...]

    + You can falsely be reported and investigated for allegedly disseminating child sexual exploitation material. Messaging and chat control algorithms are known to flag completely legal vacation photos of children on a beach, for example. According to Swiss federal police authorities, 86% of all machine-generated reports turn out to be without merit. 40% of all criminal investigation procedures initiated in Germany for "child pornography" target minors.

As Breyer pointed out, there is already proposed legislation to make the scanning mandatory, which would break end-to-end encryption: "Previously secure end-to-end encrypted messenger services such as Whatsapp or Signal would be forced to install a backdoor."

"Safety" vs. privacy

Both of these plans seem well-intentioned, but they are also incredibly dangerous to privacy. The cry of "protect the children" is a potent one--rightly so--but there also need to be checks and balances or the risks to both children and adults are far too high. Various opponents (who were derided as "the screeching voices of the minority" by the NCMEC in a memo to Apple employees) have noted that these kinds of measures can actually harm the victims of these crimes. In addition, they presuppose that everyone is guilty, without the need for warrants or the like, and turn over personal data to companies and other organizations before law enforcement is even in the picture.

As with many problems in the world today, the sexual abuse of children seems an insurmountable one, which makes almost any measure that looks likely to help quite attractive. But throwing out the privacy of our communications is not a sensible--or even particularly effective--approach.
These systems are likely to be swamped with reports of completely unrelated activity or of behavior (e.g. minors "sexting" with each other) that is better handled in other ways. In particular, Breyer has suggestions for ways to protect children more effectively:

    The right way to address the problem is police work and strengthening law enforcement capacities, including (online) undercover investigations, with enough resources and expertise to infiltrate the networks where child sexual abuse material is distributed. We need better staffed and more specialized police and judicial authorities, strengthening of resources for investigation and prosecution, prevention, support, training and funding support services for victims.

There have long been attacks against encryption in various forms, going back to (at least) the crypto wars in the 1990s. To those of us who lived through those times, all of this looks an awful lot like a step back toward the days of the Clipper chip, with its legally mandated crypto backdoor, and other efforts of that sort. Legislators and well-meaning organizations are seemingly unable to recognize that a backdoor is always an avenue for privacy abuse of various kinds. If it requires screeching to try to make that point--again--so be it.

-----------------------------------------

Scanning "private" content
Posted Aug 11, 2021 20:51 UTC (Wed) by Cyberax (supporter, #52523)

And don't forget, with an iPhone you're not in control. You can't modify the OS or even install applications that are not blessed by Apple.

How long do you think we'll have to wait until this scanning feature becomes mandatory for all applications?

Scanning "private" content
Posted Aug 11, 2021 21:25 UTC (Wed) by pebolle (subscriber, #35204)

> [X] and other types of [X] are unquestionably heinous crimes; those who participate in them should be caught and severely punished.

Note "unquestionably", "heinous crimes", "should be caught" and "severely punished". X being one of the four horsemen of the information apocalypse. Say: terrorists, pedophiles, drug dealers, and money launderers. Bruce Schneier suggested: terrorists, drug dealers, kidnappers, and child pornographers. Any other suggestions?

Anyhow: why bother continuing after that sentence? For abstractions like privacy, due process and rule of law?

Scanning "private" content
Posted Aug 12, 2021 7:12 UTC (Thu) by smurf (subscriber, #17840)

That. But even if this idea should stay limited to CSAM, which it most likely won't be if history is any indication, and even if due process and whatnot is followed when my phone sics the police on the content of my phone, which is unlikely to happen either, there are problems galore here.

For one, define "child". The idea that everybody a day shy of their 18th-or-whichever birthday should have no rights to their own body, or to decide what they and/or somebody else are allowed to do with it, is certifiable, but all too real in some parts of the world. Thus "abuse" can be, and sometimes is, a catch-all for basically anything, given that consent is presumed to be impossible. Similarly, define "sex", given that some puritan busybodies get off on their righteous indignation when they see a perfectly innocent picture of a nude child at the beach, let alone a teen on their way through puberty.

Also, possession is a crime. Nice. So, somebody sends me a disgusting pic via Whatsapp?
It stays in my media folder pretty much forever, with no visible clue that it's not mine. That brainless scanner won't know either.

Next, how is that thing even trained? Will it be confronted with a ton of innocent, or not-so-innocent-but-legal-in-this-context, control images? The set of "bad" images it's going to be trained on cannot be made public for obvious reasons, but it's Apple's job to come up with a way of verifying the accuracy of this.

In fact, did Apple even *think* of the implications of unleashing this on their users? Or is it just a placate-the-legal-beagles move?

Scanning "private" content
Posted Aug 12, 2021 8:05 UTC (Thu) by taladar (subscriber, #68407)

There is also the issue of verification. Say some government entity sends the service provider who does the scanning a set of hashes. How do they know it is hashes of CSAM? It might also be hashes of pictures of political activists protesting that government, or pictures of embarrassing programs that government wants to keep secret by finding journalists who get their hands on them before they have a chance to publish.

To me this focus on pictures and videos seems rather telling in any case. Shouldn't the primary focus be to prevent the children from being hurt? Wouldn't that be best achieved by monitoring parents, priests, youth group leaders, sports coaches, and similar people who actually have regular access to children and have historically been the main group of perpetrators? Wouldn't it make more sense to limit their right to privacy before limiting everyone's (not that I think limiting anyone's right to privacy is a good idea)?

On the other hand, if your primary goal is surveillance and children are just the excuse, then it makes much more sense to focus on the digital realm over the physical environment of the children.

Scanning "private" content
Posted Aug 12, 2021 21:53 UTC (Thu) by gmaxwell (guest, #30048)

They put a tremendous amount of engineering into it to cryptographically shield themselves and their list providers from accountability, too. I find this extremely concerning.

The obvious construction for such a system would simply deliver to the DRM-locked-down devices a list of hashes to check against and self-report. But that construction would make it possible for people to perform a web crawl and check it against the list, which would have a fighting chance of exposing instances of inappropriately included popular images. It would make it much harder to secretly use the system to target particular religions, ethnicities, or political ideologies... But, instead, they use a complex and computationally expensive private set intersection to make sure that the users can't learn *anything* about the list they're being checked against except an upper bound on its size.
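The "obvious construction" described above would at least be auditable by outsiders: anyone could hash a crawl of popular, clearly legal images and look for suspicious entries on the published list. A rough sketch of such an audit, assuming (contrary to the actual design, which is the commenter's point) that the list were published as plain digests, with SHA-256 standing in for the perceptual hash and all names purely illustrative:

    # Hypothetical audit of a *published* hash list -- not possible with
    # Apple's private-set-intersection design, which keeps the list opaque.

    import hashlib

    def audit_hash_list(published_hashes, crawled_images):
        """Flag any known-benign crawled image whose digest appears on the list.

        published_hashes: set of hex digests allegedly describing CSAM
        crawled_images:   dict mapping a URL to the image bytes fetched from it
        """
        suspicious = []
        for url, blob in crawled_images.items():
            if hashlib.sha256(blob).hexdigest() in published_hashes:
                suspicious.append(url)  # a popular, public image on a "CSAM" list
        return suspicious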
Scanning "private" content
Posted Aug 12, 2021 22:55 UTC (Thu) by tialaramex (subscriber, #21167)

In fact Apple trust "a database of known CSAM image hashes provided by NCMEC and other child safety organizations". While NCMEC gets a name check, "other child safety organizations" is entirely for Apple to define. Nobody _even at Apple_ has an insight into what the actual images are, and if they allow some unnamed "child safety organization" to submit one or more hashes, that organization gets indirect access to Apple's devices and users.

The "verification" step also is concerning. These are perceptual hashes, meaning they recognise images that are similar to a target in a way determined by an algorithm, likely a proprietary algorithm. So, unavoidably there can be false positives and you must actually verify matches. Apple of course doesn't have the original images, and certainly doesn't want to hire people to look at probably illegal images, so instead this is done by sending the matching image off to the same organizations which generated the hashes...

The effect is that it may be possible to provide Apple with hashes that match most photographs of a dissident's face, and then you'll be sent any new photographs of that dissident for "verification". You can confidently inform Apple that these were false positives, not CSAM, and needn't be blocked, and of course Apple have no way to determine whether you kept a copy of the images or acted on what you saw...

It's probably more reasonable to think of Apple and other proponents of these systems as useful idiots rather than part of a conspiracy. They feel they ought to do something, and this is something, so they feel they ought to do this. They believe that because they have good intentions, the arguments against this system are null - they know what they're doing is good. The problem is that they in fact cannot know whether what they're doing is good; the system intentionally blinds them, and that ought to be a red flag.

Even the wider campaigns on this topic are most likely from useful idiots rather than deliberate enablers. Their idea is that if we can stop copies of an image from existing, the thing depicted in the image is erased too; this is an obviously false belief about the universe, but it's actually really commonly wished for. In many countries some such CSAM databases include images of events that never took place, or of children who never existed. They're of no value for prosecuting a hypothetical perpetrator of sexual assault because the offence never actually occurred - it'd be like showing video of a popular "death match" first-person multi-player video game as evidence for a murder charge - but whereas you won't get anywhere with that, prosecuting possession of a copy of these images really means somebody goes to jail to "protect" the imaginary children from imaginary abuse. Like prosecuting vagrancy this seems like an obviously bad idea but is really popular.
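The perceptual hashing described in that comment can be illustrated with the classic "average hash": downscale the image, threshold each pixel against the mean brightness, and compare fingerprints by Hamming distance. Apple's NeuralHash is a learned neural-network hash rather than this simple scheme, so the sketch below (using the Pillow library, with an arbitrary match threshold) only shows why near-duplicates of a target still match and why unrelated images can occasionally collide:

    # Minimal "average hash" sketch of perceptual hashing, using Pillow.
    # Not NeuralHash; only illustrates why similar images produce similar
    # fingerprints and why false positives are inherent to the approach.

    from PIL import Image

    def average_hash(path, hash_size=8):
        # Downscale to an 8x8 grayscale thumbnail and threshold each pixel
        # against the mean brightness, yielding a 64-bit fingerprint.
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def hamming_distance(a, b):
        return bin(a ^ b).count("1")

    # Hypothetical usage: a small distance means "perceptually similar", so a
    # re-encoded, resized, or lightly edited photo still matches -- and so can
    # an unrelated photo that happens to land nearby.
    # if hamming_distance(average_hash("query.jpg"), target_hash) <= 10: ...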
Scanning "private" content
Posted Aug 11, 2021 21:28 UTC (Wed) by philipstorry (subscriber, #45926)

I get the feeling that we're going to discover a whole heap of edge cases from this.

For example, most sexual predators are part of the community or even extended family. What if they're adding photos from social media to their photo sets? Suddenly someone sends an old picture of their kid at the beach to someone, and BANG! Hash match, and you've got a problem.

In some justice systems this will be handled well, in others it will be handled terribly. Can you imagine people being told "We know you're abusing that child, Apple tells us so. Take this guilty plea and it'll be easier for you."? Because I really wouldn't bet against that happening.

This has the potential to ruin lives in new and awful ways.

Scanning "private" content
Posted Aug 12, 2021 9:08 UTC (Thu) by nim-nim (subscriber, #34454)

The less democratic a state, the less it cares about side effects and treating everyone fairly. An authoritarian state will much prefer the ability to harshly repress a minority of opponents over the ability to detect and police what it considers petty crimes. And, in fact, being able to repress at will relies on the existence of a large number of unprosecuted offenses (real or false positives). This way you have ready-to-be-opened cases against a large part of the population.

So, depending on your objectives, lack of fairness and a huge number of false positives are not a bug but a feature.

Finally, the nice thing about algorithms is that they do not have a conscience and won't protest overreaching. You only need to care about the subset of people involved in prosecutions. With traditional manual data collection, any of the army of data-collecting bureaucrats can turn whistleblower.

Scanning "private" content
Posted Aug 12, 2021 13:30 UTC (Thu) by ldearquer (subscriber, #137451)

>> The less democratic a state, the less it cares about side effects and treating everyone fairly.

Similar phrasing could be utilized for a careless state, which applies populistic measures, being welcomed by a majority, regardless of the harm caused to smaller groups, or groups without direct democratic representation (as children are). No one asked children if they were ok with all the technology changes of the last decades, considering they could arguably make them more vulnerable.

Privacy is a good thing, but it is not the absolute most important asset of humanity. Maybe it is in the top 10, but not in the top 5. I would give away my online privacy any day *if* that would reduce child abuse.

Note this is a generic take on the problem, not a defense of this specific way of doing things.

Scanning "private" content
Posted Aug 12, 2021 14:35 UTC (Thu) by nim-nim (subscriber, #34454)

But can a careless state be considered democratic? The sole advantage of democracies over other regimes is that they are supposed to care about everyone equally (one head = one vote).

It seems to me that assigning unequal political weights to citizens (caring less about some than others, breaking the one head = one vote rule) is a fast path to something else.

Scanning "private" content
Posted Aug 12, 2021 17:18 UTC (Thu) by ldearquer (subscriber, #137451)

I hate to disagree with such a naive vision of democracy. A state is democratic as long as the government is elected by the people or their representatives. That's it. The Greeks already realized long ago how things could still go wrong.

Also note how the one head = one vote rule still excludes children. Not that it is easy to get around (direct children votes could be a disaster), but IMO this has an effect on politics.

Democracy
Posted Aug 12, 2021 19:33 UTC (Thu) by tialaramex (subscriber, #21167)

There's a long history of expanding the franchise, with people incorrectly predicting drastic consequences if this is attempted and then nothing interesting happening.

Just 200 years ago, England and Wales had a completely baroque electoral system, where in one place the Member might be elected by more or less anybody who had an opinion (conventionally not by women, but it wasn't technically illegal; records do not indicate any women _trying_ to vote under this system) and in others only by members of a group actually _paid_ by the current Member (which is scarcely any sort of democracy at all); each district could have its own rules and most of them did.
Great Reform shook that up, introducing a baseline that would allow a typical middle-aged skilled worker or professional of the era to meet requirements to register as a voter -- but it didn't result in a great upheaval, and over the rest of that century, if anything, the aristocracy actually expanded their control. It turns out rich merchants can buy 100 votes in a district where only 150 people can vote anyway, but greatly expanding the franchise makes this game unaffordable for them - whereas if people vote for you, Sir Whatever, just because they always voted for you, or your father (also Sir Whatever) before you, that's not difficult to maintain when more of them qualify to vote.

100 years ago, a lot of women could vote, though far more men (all men over 21); the introduction of widespread voting by women was seen as potentially very disruptive. In fact, the women proved no more spirited than the men, and more or less the same people (overwhelmingly men) were elected. Even when women were given the exact same rights (all women over 21 too) it made no major difference.

In the middle of the 20th century the UK got around to abolishing plural voting (yes, right up until 1948 "one man, one vote" literally wasn't the rule, although plural voting had been somewhat neutered after Great Reform) and only in 1969 did they lower the age to 18. The newly enfranchised teenagers did not in fact tear down grown-up government, and things still continued more or less as before.

Among obvious groups that still ought to be enfranchised: convicted criminals from prison -- at least all those convicted of crimes unrelated to the functioning of democracy, and frankly probably those too, unless you're bad at prisons and can't keep them from tampering with the vote from inside a prison; children -- certainly all teenagers, and there's no particular reason not to enfranchise any child that seems to actually have a preference, since their preferences certainly can't be less _informed_ than those of adult voters, so why not?

Overall, given that the only function of democracy is to avoid the alternative (violent transitions of power), why shouldn't we let toddlers help if they want to?

Scanning "private" content
Posted Aug 11, 2021 21:39 UTC (Wed) by dskoll (subscriber, #1630)

It seems to me this violates the prohibition against unreasonable search. Even if it's Apple doing the searching, it's effectively been deputized by the government to do that.

Anyway, welcome to dystopia. As a technologist, I'm really starting to think that (despite his being a mass murderer) the Unabomber might have had some good points in his manifesto. Technology is looking more and more negative.

Scanning "private" content
Posted Aug 12, 2021 21:40 UTC (Thu) by gmaxwell (guest, #30048)

Because the scanning is done by Apple of their own free will and for their own commercial benefit, and not at the behest of the government, they've not been deputized. They gain the ability to do so via the user's consensual interaction with them. The courts have been unequivocal -- if the government mandated or even incentivized this scanning, it would require a warrant.

The reason that the fact that they're using an (effectively) government-provided list doesn't cause a problem is because they have a human manually review the content (which is why Apple includes that step -- without it the 'search' wouldn't happen until the NCMEC opened the files, which would require a warrant).
Scanning "private" content
Posted Aug 12, 2021 0:52 UTC (Thu) by jkingweb (subscriber, #113039)

Software installed on my property which acts as an agent for someone else and could land me in jail? Sounds like malware to me... We should not tolerate software that acts contrary to the interest of the owner.

Scanning "private" content
Posted Aug 12, 2021 20:36 UTC (Thu) by LtWorf (subscriber, #124958)

> We should not tolerate software that acts contrary to the interest of the owner.

That's why we need free software.

Scanning "private" content
Posted Aug 12, 2021 21:44 UTC (Thu) by pizza (subscriber, #46)

> We should not tolerate software that acts contrary to the interest of the owner.

I'd argue that the software of which you speak is very much acting in the interest of the owner. The user is not the owner.

Meanwhile, most folks don't actually care at all about "software" in and of itself -- because the software is useless without a service running on someone else's computers.

Scanning "private" content
Posted Aug 12, 2021 0:53 UTC (Thu) by azure (subscriber, #112903)

Ever since this came out, I've been suspecting this is Apple's pushback against anti-monopoly interventions. If users can sideload whatever apps they like, won't they be able to bypass whatever scanning Apple puts in place?

Scanning "private" content
Posted Aug 12, 2021 1:18 UTC (Thu) by wahern (subscriber, #37304)

Similar, but maybe slightly more plausible, is that Apple was facing down an imminent demand by government(s) to weaken iPhone security. Apple unilaterally proffers this as a compromise. (Not singularly, though; Apple has had to acquiesce in other areas, like encrypted backups.) It pushes people who exchange child porn onto other, less secure platforms, making it easier for the government to prosecute marginally more people. (Ditto for some other criminal behaviors if they trust Apple less.) Far more importantly, it disarms the government of the cudgel of child porn in its political campaign to weaken iPhone security. And for the same reasons child porn is such a great cudgel--i.e. "I've got nothing to hide" is more persuasive when the topic is child porn--Apple likely considers the cost/benefit to its credibility acceptable, especially presuming the alternative Apple faced was more direct and substantial weakening of iPhone security (e.g. key escrow) that lacked inherent subject-matter limitations.

Scanning "private" content
Posted Aug 12, 2021 20:44 UTC (Thu) by LtWorf (subscriber, #124958)

On a false positive (say of my kid at the beach) an Apple employee will view the content. I don't think it's anywhere near acceptable for this to happen. And for all I know, a person seeking a job to look at pics of kids might have an interest in choosing the profession...

In my opinion they are starting with this, since most people see it as an acceptable measure to protect children, but will soon after move to look for copyrighted content.

Scanning "private" content
Posted Aug 12, 2021 6:56 UTC (Thu) by fenncruz (subscriber, #81417)

Have Apple just created a way to DoS Apple users? A malicious user sends an Apple user enough abuse images to trigger the threshold.
The account gets locked until a human gets through their backlog of cases (is reviewing child abuse images really a job many people want or would stay in for a long time?). The account is locked for maybe days or weeks until the appeal is heard and decided on.

Cloud = big brother
Posted Aug 12, 2021 8:12 UTC (Thu) by nim-nim (subscriber, #34454)

Just a few more steps and the global Big Brother implementation will be complete.

It will surprise no one, except Americans with their blind faith in private companies, that the end result of giving data keys to an oligopoly of cloud giants is, first, the processing of this data for those giants' own needs and, second, the extension of this processing to all kinds of public and private third-party demands. The data trove is just too tempting and there are too few people needing to be leaned on to breach all safeguards.

It has been a long time in the making: refusing to break up companies that were overreaching, helping them marginalize independent data clients (Firefox), own the data pipes (HTTPS hardening), and take over the software market (open source vs. free software, i.e. letting the corporations with the most money take over dev projects and open-core them).

Unfortunately at this stage nothing short of a huge scandal or the USA losing a major war can reverse the process. Delegating surveillance to a few giants is just too convenient for the US government and it will shield them from any serious challenge.

Cloud = big brother
Posted Aug 12, 2021 11:05 UTC (Thu) by alfille (subscriber, #1631)

I'm not sure I understand nim-nim's point entirely. Trust in big companies is not exclusive to the USA, nor are other vectors of intrusive monitoring (governments, cults, schools) any better. In fact the priority of personal freedom is being eroded globally.

Cloud = big brother
Posted Aug 12, 2021 13:07 UTC (Thu) by nim-nim (subscriber, #34454)

The USA is weird inasmuch as it is ultra-sensitive to anything done by the state and ultra-tolerant of anything done by private companies. In other parts of the world the same rules apply to both the private and public sectors.

Cloud = big brother
Posted Aug 12, 2021 16:27 UTC (Thu) by marcH (subscriber, #57642)

> It will surprise no one except Americans with their blind faith in private companies ...

I agree many Americans distrust "government" more than private monopolies, which seems indeed naive / ideological. Democratic governments tend to have at least Hanlon's razor on their side, whereas private monopolies are much more efficient at screwing us. However:

- It does not follow that Americans have a blind faith in private companies! Think: "lesser of two evils".

- In this particular case it's really the combination of governments _and_ monopolies at work.

e2e encryption
Posted Aug 12, 2021 11:15 UTC (Thu) by zdzichu (subscriber, #17118)

> both of those scans will be done on the user's phone, which will effectively break the end-to-end encryption

If the scan is done on the user's phone, it *does not* break end-to-end encryption, as the phone is one of the "ends". On the other hand, Apple collecting CP, even for "review purposes", has dubious legal status.
e2e encryption
Posted Aug 12, 2021 21:54 UTC (Thu) by gmaxwell (guest, #30048)

That's a little like arguing that e2e encryption wouldn't be broken by requiring everyone to only use the encryption key "password". :) It's technically true, but it undermines the purpose of the encryption.

Scanning "private" content
Posted Aug 12, 2021 11:37 UTC (Thu) by sam.thursfield (subscriber, #94496)

Seems like a good time to run an encrypted email service based somewhere not subject to EU law, like Switzerland for example.

Scanning "private" content
Posted Aug 12, 2021 12:14 UTC (Thu) by smurf (subscriber, #17840)

How does that help when the scanner is embedded in your phone's OS?

Scanning "private" content
Posted Aug 12, 2021 11:37 UTC (Thu) by kleptog (subscriber, #1183)

> According to Swiss federal police authorities, 86% of all machine-generated reports turn out to be without merit

All issues aside, that's actually way better than I expected. Any fraud analyst would be quite happy with that kind of accuracy given the volume of data involved.

Scanning "private" content
Posted Aug 12, 2021 18:23 UTC (Thu) by tchernobog (subscriber, #73595)

So, everything I do as a law-abiding citizen in the EU can be viewed by strangers, and I can be flagged by an AI for disseminating child pornography because my 4-year-old goes to the beach naked. And all my communication software will need to be adapted to permit this, including my personal mail server (since I don't use GMail).

Whereas a criminal *really* disseminating child pornography will use secure end-to-end encryption, since they don't care about "the law".

Doesn't anybody in the EU parliament see the problem with this?

Scanning "private" content
Posted Aug 12, 2021 19:19 UTC (Thu) by albertgasset (subscriber, #74366)

> The EU General Data Protection Regulation (GDPR) is a sweeping framework for protecting personal data, but since the start of 2021 it no longer covers messaging services. That kind of communication falls under the ePrivacy directive instead, thus the change allowing scanning is a derogation to it.

I think this is not accurate. The linked "ePrivacy Regulation" is a proposal that was meant to be approved at the same time as the GDPR, but it is still being discussed. The directive that has been (partially) derogated is the "Privacy and Electronic Communications Directive" (2002/58/EC), which is also known as the "ePrivacy Directive". But the GDPR (2016/679) still applies to messaging services:

> (12) This Regulation provides for a temporary derogation from Articles 5(1) and 6(1) of Directive 2002/58/EC, which protect the confidentiality of communications and traffic data. The voluntary use by providers of technologies for the processing of personal and other data to the extent necessary to detect online child sexual abuse on their services and report it and to remove online child sexual abuse material from their services falls within the scope of the derogation provided for by this Regulation provided that such use complies with the conditions set out in this Regulation and is therefore subject to the safeguards and conditions set out in Regulation (EU) 2016/679.

See: https://www.europarl.europa.eu/doceo/document/TA-9-2021-0...

Copyright (c) 2021, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.