[HN Gopher] Microsoft only lets you opt out of AI photo scanning...
___________________________________________________________________
Microsoft only lets you opt out of AI photo scanning 3x a year
Author : dmitrygr
Score : 265 points
Date : 2025-10-11 18:36 UTC (4 hours ago)
(HTM) web link (hardware.slashdot.org)
(TXT) w3m dump (hardware.slashdot.org)
| _wire_ wrote:
| Crossposting slashdot?
|
| Heaven forfend!
| dmitrygr wrote:
| They are the ones who did this interview
| leakycap wrote:
| Microsoft: forces OneDrive on users via dark pattern dialogs that
| many users just accept
|
| Users: save files "on their PC" (they think)
|
| Microsoft: Rolls out AI photo-scanning feature to unknowing users,
| intending to learn something from their photos.
|
| Users: WTF? And there are rules on turning it on and off?
|
| Microsoft: We have nothing more to share at this time.
|
| Favorite quote from the article:
|
| > [Microsoft's publicist chose not to answer this question.]
| netsharc wrote:
| Tell them "you may only refuse to answer this question 3 times
| a year".
| j45 wrote:
| It's totally worth self-hosting files; it's gotten much better.
| hshdhdhehd wrote:
| Almost feels like we are getting to class action or antitrust
| territory when you connect the dots. Almost all PCs come with
| Windows. De facto, you need to create an M$ account to use Windows
| locally. They opt you into OneDrive by default. They sync your docs
| by default. They feed all your photos to AI by default.
| rf15 wrote:
| > and follow Microsoft's compliance with General Data Protection
| Regulation
|
| Not in a million years. See you in court. As is often the case,
| just because a press statement says something doesn't make it true;
| it may only be there to defuse public perception.
| thrownfjfkfmofn wrote:
| How is this not revenge porn or something? If I upload sensitive
| photos somewhere, it's a 5-year prison sentence! The CEO of
| Microsoft can do that a billion times!
| themafia wrote:
| "You can only turn off this setting 3 times a year."
|
| Astonishing. They clearly feel their users have no choice but to
| accept this onerous and ridiculous requirement. As if users
| wouldn't understand that they'd have to go way out of their way
| to write the code which enforces this outcome. All for a feature
| which provides me dubious benefit. I know who the people in my
| photographs are. Why is Microsoft so eager to also be able to
| know this?
|
| Privacy legislation is clearly lacking. This type of action
| should bring the hammer down swiftly and soundly upon these gross
| and inappropriate corporate decision makers. Microsoft has needed
| that hammer blow for quite some time now. This should make that
| obvious. I guess I'll hold my breath while I see how Congress
| responds.
| mihaaly wrote:
| I assume this would be a ... let's call it a feature for now ...
| so, a feature not available in the EU due to GDPR violations.
| jaredsohn wrote:
| > I know who the people in my photographs are. Why is Microsoft
| so eager to also be able to know this?
|
| Presumably it can be used for filtering as well - find me all
| pictures of me with my dad, etc.
| wkat4242 wrote:
| Sure but if it was for your benefit, not theirs, they
| wouldn't force it on you.
| 14 wrote:
| My initial thought was that it's so they can scan for CSAM while
| pretending that users have a choice not to have their privacy
| violated.
| bayindirh wrote:
| From my understanding, CSAM scanning is always considered a
| separate, always-on, and mandatory subsystem in any cloud
| storage system.
| odo1242 wrote:
| Yes, any non-E2EE cloud storage system has strict scanning
| for CSAM. And it's based on perceptual hashes, not AI
| (because AI systems can be tricked with normal-looking
| adversarial images pretty easily).
| heavyset_go wrote:
| I built a similar photo ID system, not for this purpose
| or content, and the idea of platforms using perceptual
| hashes to potentially ruin people's lives is horrifying.
|
| Depending on the algorithm and parameters, you can easily
| get a scary amount of false positives, especially using
| algorithms that shrink images during hashing, which is a
| lot of them.
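A minimal sketch of the downscaling-based perceptual hashing heavyset_go
describes (an "average hash"; the 8x8 size, the Pillow dependency, and the
match threshold are illustrative assumptions, not any platform's actual
parameters):

    # Illustrative average-hash ("aHash") sketch; assumes `pip install pillow`.
    # Real CSAM scanners use proprietary algorithms (e.g. PhotoDNA); this only
    # shows why shrinking images during hashing invites false positives.
    from PIL import Image

    def average_hash(path: str, hash_size: int = 8) -> int:
        # Shrink to hash_size x hash_size and drop color: most detail is
        # destroyed at this step, which is where collisions come from.
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            # One bit per pixel: 1 if brighter than the average, else 0.
            bits = (bits << 1) | (1 if p > avg else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        # Number of differing bits; a small distance counts as a "match".
        return bin(a ^ b).count("1")

    # Two unrelated photos that share a coarse light/dark layout can land
    # within a typical threshold (say, <= 5 of 64 bits) and trigger a match.
    print(hamming(average_hash("photo_a.jpg"), average_hash("photo_b.jpg")))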
| zelphirkalt wrote:
| Actually, most users probably don't understand that this
| ridiculous policy took extra effort to implement. They just
| blindly follow whatever MS prescribes and have long given up on
| making any sense of the digital world.
| antegamisou wrote:
| I agree with you, but unfortunately there's nothing astonishing
| about any of this; it was bound to happen. Almost all cautionary
| statements about AI abuse fall on the deaf ears of HN's
| overenthusiastic and ill-informed rabble, stultified by YC tech
| lobbyists.
| tomatotomato37 wrote:
| The worst part is that all the people fretting about
| ridiculous threats, like the chatbot turning into Skynet,
| sucked the oxygen out of the room for the more realistic
| corporate threats.
| jonas21 wrote:
| I don't really see the issue. If you don't want the face
| recognition feature, then you'll turn it off once, and that's
| that. Maybe if you're unsure, you might turn it off, and then
| back on, and then back off again. But what's the use case where
| you'd want to do this more than 3x per year?
|
| Presumably, it's somewhat expensive to run face recognition on
| all of your photos. When you turn it off, they have to throw away
| the index (they'd better be doing this for privacy reasons), and
| then rebuild it from scratch when you turn the feature on again.
| gus_massa wrote:
| How hard is it to turn it on? Does it show a confirmation message?
|
| My wife has a phone with a button on the side that opens the
| microphone to ask questions to Google. I guess 90% of the
| audios they get are " _How the /&%/&#"% do I close this
| )(&(/&(%)?????!?!??_"
| xdfgh1112 wrote:
| I bought a new Motorola phone and there are no less than
| three ways to open Google assistant (side button, hold home
| button, swipe from corner). Took me about 10 seconds before I
| triggered it unintentionally and quickly figured out how to
| disable all of them...
| ArnoVW wrote:
| To prevent you from having the option to temporarily disable
| it, so you have to choose between privacy and the supposed
| utility
| Barbing wrote:
| Right, while I understand the potential compute cost, it
| would be like the iPhone restricting the number of times you
| could use "allow once" for location permissions.
| JumpCrisscross wrote:
| > _When you turn it off, they have to throw away the index
| (they'd better be doing this for privacy reasons), and then
| rebuild it from scratch when you turn the feature on again_
|
| This is probably the case. But Redmond being Redmond, they put
| their foot in their mouth by saying "you can only turn _off_
| this setting 3 times a year" (emphasis mine).
| NoLinkToMe wrote:
| Agreed, in practice for me there's no real issue.
|
| But that's not necessarily true for everyone. And it doesn't
| need to be this way, either.
|
| For starters, I think it'd help if we understood why they do
| this. I'm sure there's a cost to the compute MS spends on
| AI'ing all your photos; turning it off under privacy rules
| means throwing away that compute. And turning it back on
| creates an additional cost for MS, on top of compute they've
| already spent for nothing. Limiting that makes sense.
|
| What doesn't make sense is that I'd expect virtually nobody to
| turn it on and off over and over again, beyond 3 times, to the
| point that cost increases by more than a rounding error... like
| what type of user would do that, _and_ why would that type of
| user not be exceedingly rare?
|
| And even in that case, it'd make more sense to do it the other
| way around: you can turn on the feature 3 times per year, and
| off anytime. i.e. if you abuse it, you lose out on the feature,
| not your privacy.
|
| So I think it is an issue that could and should be quickly
| solved.
| a2128 wrote:
| If this is the true reason, then they have made some poor
| decisions throughout that still deserve criticism. Firstly by
| restricting the number of times you can turn it _off_ rather
| than _on_, secondly by not explaining the reason in the linked
| pages, and thirdly by having their publicist completely refuse
| to say a word on the matter.
|
| In fact, if you follow the linked page, you'll find a
| screenshot showing it was originally worded differently, "You
| can only change this setting 3 times a year" dating all the way
| back to 2023. So at some point someone made a conscious
| decision to change the wording to restrict the number of times
| you can turn it _off_.
| noir_lord wrote:
| > If you don't want the face recognition feature, then you'll
| turn it off once.
|
| The issue is that this is a feature that 100% should, in any
| sane world, be opt _in_ - not _opt out_.
|
| Microsoft privacy settings are a case of "It was on display
| in the bottom of a locked filing cabinet stuck in a disused
| lavatory with a sign on the door saying 'Beware of the
| Leopard.'"
| yuvalr1 wrote:
| Well, sometimes Microsoft decides to change your settings back.
| This has happened to me very frequently after installing
| Windows updates. I remember finding myself turning the same
| settings off time and again.
| netsharc wrote:
| The "fuck you, user!" behavior of software companies now
| means there's no more "No", only "Maybe later". Every time I
| update Google Photos, it shows me the screen that "Photos
| backups are not turned on! Turn on now?" (because they want
| to upsell their paid storage space option).
| jtmarl1n wrote:
| The lack of a true "no" option and only "maybe later"
| infuriates me.
| kypro wrote:
| Assuming this reasoning is accurate, why not just silently
| throw a rate limit error and simply not reenable it if it's
| repeatedly switched on and off?
| netsharc wrote:
| I wonder if it's possible to encrypt the index with a key
| that's copied to the user's device, and if the user wants to
| turn off this setting, delete the key on the server. When they
| want to turn it back on, the device uploads the key. Yes, the
| key might end up gone if there's a reinstall, etc.
|
| If the user leaves it off for a year, then delete the encrypted
| index from the server...
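A rough sketch of netsharc's client-held-key idea, using the Python
`cryptography` package's Fernet API; the class, the one-year purge hook, and
the "no rescan needed" behavior are illustrative assumptions, not anything
Microsoft has described:

    # Sketch: the server keeps only an encrypted face index; the key lives on
    # the user's device. Assumes `pip install cryptography`. Illustrative only.
    from cryptography.fernet import Fernet

    class FaceIndexStore:
        """Server side: holds ciphertext; the key is present only while enabled."""

        def __init__(self):
            self.ciphertext = None  # encrypted index, unreadable without the key
            self.key = None         # uploaded by the client when enabling

        def enable(self, client_key: bytes, plaintext_index: bytes = b""):
            self.key = client_key
            if self.ciphertext is None and plaintext_index:
                # First enable (or after a purge): build and encrypt the index.
                self.ciphertext = Fernet(client_key).encrypt(plaintext_index)

        def read_index(self) -> bytes:
            # Only works while enabled; fails once the key has been deleted.
            return Fernet(self.key).decrypt(self.ciphertext)

        def disable(self):
            # Opt-out: delete the key. The ciphertext stays but is useless to
            # the server, and can be revived instantly if the key comes back.
            self.key = None

        def purge_if_stale(self, days_disabled: int, max_days: int = 365):
            # e.g. after a year disabled, delete the ciphertext outright too.
            if self.key is None and days_disabled > max_days:
                self.ciphertext = None

    key = Fernet.generate_key()              # generated and kept on the device
    store = FaceIndexStore()
    store.enable(key, b"...face embeddings...")
    store.disable()                          # instant opt-out, nothing to rebuild
    store.enable(key)                        # opt back in: re-upload key, no rescan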
| bayindirh wrote:
| Even KDE's Digikam can run "somewhat expensive" algorithms on
| your photos without melting your PC and making you wait a year
| to recognize and label faces.
|
| Even my 10(?) year old iPhone X can do facial recognition and
| memory extraction on device while charging.
|
| My Sony A7-III can detect faces in real time and match them
| against 5 registered faces for focus prioritization the
| moment I half-press the shutter.
|
| That thing will take mere minutes on Azure when batched and fed
| through GPUs.
|
| If my hunch is right, the option will have a "disable AI use
| for x months" slider and will turn itself on without letting
| you know. So you can't opt out of it completely, ever.
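For what it's worth, on-device face detection of the kind bayindirh describes
really is cheap; a minimal sketch using OpenCV's bundled Haar cascade
(detection only, no recognition or labeling; the photo path is a placeholder):

    # Minimal local face-detection sketch with OpenCV's bundled Haar cascade.
    # Assumes `pip install opencv-python`. Detection only, no identity matching.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def count_faces(path: str) -> int:
        img = cv2.imread(path)
        if img is None:
            raise FileNotFoundError(path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Classic CPU-only detector; typically runs in well under a second
        # per photo on a laptop, which is the point about local processing.
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return len(faces)

    print(count_faces("photo.jpg"))  # placeholder path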
| like_any_other wrote:
| So why not limit how many times you can turn it _on_ , instead
| of off?
|
| We all know why.
| wzdd wrote:
| "When this feature is disabled, facial recognition will be
| disabled immediately and existing recognition data will be
| purged within 60 days". Then you don't need a creepy message.
| Okay, so that's 6 times a year, but whatever.
| bionhoward wrote:
| The point is it's sucking your data into some amorphous big
| brother dataset without explicitly asking you first if you want
| that to happen. Opt-out AI features are generally rude, trashy,
| low-class, money-grubbing data grabs.
| pjc50 wrote:
| > what's the use case where you'd want to do this more than 3x
| per year?
|
| That means that all Microsoft has to do to get your consent to
| scan photos is turn the setting on every quarter.
| anigbrowl wrote:
| _Presumably, it's somewhat expensive to run face recognition
| on all of your photos._
|
| Very likely true, but we shouldn't have to presume. If that's
| their motivation, they should state it clearly up front and
| make it opt-out by default. They can put a (?) callout on the
| UI for design decisions that have external constraints.
| fancyfredbot wrote:
| Seems obvious they actually mean to limit the number of times you
| can opt in. Very poor choice of words.
| mortehu wrote:
| The difference is whether you get locked into having it on or
| having it off at the end.
| surgical_fire wrote:
| There's a great solution to this.
|
| Just stop using Microsoft shit. It's a lot easier than untangling
| yourself from Google.
| bee_rider wrote:
| Yeah it is legitimately hard to avoid Google, if nothing else
| some of your emails will probably be leaked to Gmail.
|
| But Microsoft is pretty easy to avoid after their decade of
| floundering.
| LogicFailsMe wrote:
| Whenever I _have_ to use Windows, I just create a new
| throwaway account on proton, connect it to the mother
| throwaway account connected to a yahoo email account created
| in the before times, install what I need, and then never
| access that account again.
| hshdhdhehd wrote:
| It is fucked that you almost need mob-level burner-phone
| precautions to have privacy and use Excel.
| khazhoux wrote:
| How can I play starcraft 2 without it?
| bee_rider wrote:
| Apparently it runs in Proton (I haven't tried it though).
| righthand wrote:
| Starcraft 2 w/ Battle.net has been working on Linux for over
| a decade. You don't even need Proton; it works lovely with
| plain WINE.
| sandblast wrote:
| Yes. Just use Immich for photos. AI scanning, but local and
| only opt-in.
| fishmicrowaver wrote:
| I was quite happy for a couple of years to just use Windows and
| WSL. I've fully switched to Linux at home and Linux VMs at work.
| The thirst and desperation to make AI work gives me the creeps
| more than usual.
| bayindirh wrote:
| Did anyone notice that Microsoft never answered any of the
| questions asked, but deflected them instead?
|
| They are exactly where I left them 20 years ago.
|
| It's very sad that I can't stop using them _again_ for doing
| this.
| anigbrowl wrote:
| This is such a norm in society now; PR tactics take priority
| over any notion of accountability, and most journalists and
| publishers act as stenographers, because challenging or even
| characterizing the PR line is treated as an unjustified attack
| and met with inflated claims of bias.
|
| Just as linking to original documents, court filings etc.
| should be a norm in news reporting, it should also be a norm to
| summarize PR responses (helpful, dismissive, evasive, or
| whatever) and link to the full PR text, rather than
| treating it as valid body copy.
| dandellion wrote:
| They take people for idiots. This can work a few times, but
| even someone who isn't the brightest will eventually put two
| and two together when they get screwed again and again and
| again.
| JoshTriplett wrote:
| People need to treat PR like they do AIs. "You utterly failed
| to answer the question, try again and actually answer the
| question I asked this time." I'd love to see corporate
| representatives actually _pressed to answer_. "Did you
| actually do X, yes or no, if you dodge the question I'll
| present you as dodging the question and let people assume the
| worst."
| delfinom wrote:
| It's not just PR tactics for the sake of dodging accountability.
| It's because there's a glut of lawyers who'll sue over the
| tiniest admission of anything.
| LunaSea wrote:
| I was afraid for the EU economy, but after this declaration I'm
| reassured that Microsoft will pay for my grandkids' education in
| 30 years.
| moooo99 wrote:
| I think the EU is flawed in more ways than just one. But every
| time I see "<AI feature> will be available starting now
| outside the EU" I am really grateful.
| LogicFailsMe wrote:
| I've never seen a better case for uploading endless AI slop
| photos.
| chris_wot wrote:
| I think a call to Australia's privacy commissioner might be in
| order.
| GeekyBear wrote:
| You can really tell that Microsoft has adopted advertising as a
| major line of business.
|
| The privacy violations they are racking up are very reminiscent
| of prior behavior we've seen from Facebook and Google.
| dreamcompiler wrote:
| And not just advertising. If ICE asks Microsoft to identify
| accounts of people who have uploaded a photo of "Person X", do
| you think they're going to decline?
|
| They'd probably do it happily even without a warrant.
|
| I'd bet Microsoft is doing this more because of threats from
| USG than because of advertising revenue.
| einpoklum wrote:
| ICE doesn't have to ask for anything; the USG gets a copy of
| all the data Microsoft collects from you anyway. Remember:
|
| https://www.pcmag.com/news/the-10-most-disturbing-snowden-
| re...
| zzgo wrote:
| > They'd probably do it happily even without a warrant
|
| I'm old enough to remember when companies were tripping over
| themselves after 9/11 trying to give the government anything
| they could to help them keep an eye on Americans. They
| eventually learned to monetize this, and now we have the
| surveillance economy.
| smileson2 wrote:
| Microsoft in the past few years has totally lost its mind. It's
| ruining nearly everything it touches and I can't understand why.
| mixmastamyk wrote:
| Money and power. Who was the first BigTech co on the Prism
| slides? Who muscled out competitors in the 90s?
| Spooky23 wrote:
| They never changed. Satya became CEO and nerds fawned over the
| "new Microsoft" for whatever reason.
|
| They are a hard nosed company focused with precision on
| dominance for themselves.
| uep wrote:
| Do we even think that was real? I think social media has been
| astroturfed for a long time now. If enough people make those
| claims, it starts to feel true even without evidence to
| support it.
|
| Did they ever open-source anything that really makes you think
| "wow"? The best I could see was them "embracing" Linux, but
| embrace, extend, extinguish was always a core part of their
| strategy.
| exe34 wrote:
| Makes me want to download and install windows, and store a
| picture of my hairy brown nutsack with googly eyes on it.
| correlator wrote:
| Does this mean that when you disable it, all labels are deleted, and
| when you turn it back on it has to re-scan all of your photos?
| Could this be a cost-saving measure?
| d-sky wrote:
| In that case, they should make it the other way around -- you
| can enable this only three times a year.
| ok_dad wrote:
| They should do it the other direction, then: if you turn it off
| more than three times you can't turn it back on.
| margalabargala wrote:
| But that's less good for profit. Why would they give up money
| for morals?
| bayindirh wrote:
| Esp. when you can just eat money to survive once you
| relocate to Mars, no?
| alterom wrote:
| >Does this mean that when you disable, all labels are deleted
|
| AHHAHAHAHAHAHAHAHA.
|
| Ha.
|
| Nice one.
| dandellion wrote:
| No, it's a profit-seeking measure.
| ptrl600 wrote:
| Presumably you just need to turn it off once, right?
| anarticle wrote:
| Year of the Linux desktop edges ever closer.
| anigbrowl wrote:
| Truly bizarre. I'm so glad I detached from Windows a few years
| back, and now when I have to use it or another MS product (eg an
| Xbox) it's such an unpleasant experience, like notification hell
| with access control checks to read the notifications.
|
| The sad thing is that they've _made_ it this way, as opposed to
| Windows being inherently deficient; it used to be a great blend
| of GUI convenience with ready access to advanced functionality
| for those who wanted it, whereas MacOS used to hide technical
| things from a user a bit too much and Linux desktop environments
| felt primitive. Nowadays MS seems to think of its users as if
| they were employees or livestock rather than customers.
| wkat4242 wrote:
| Microsoft is such a scummy company. They always were but they've
| become even worse since they've gone all in on AI.
|
| I wonder if this is also a thing for their EU users. I can think
| of a few laws this violates.
| pessimizer wrote:
| Isn't it cute when there's absolutely no rationale behind a new
| rule, and it's simply an incursion made in order to break down a
| boundary?
|
| Look, scanning with AI is available!
|
| Wow, scanning with AI is now free for everyone!
|
| What? Scanning with AI is now opt-out?
|
| Why would opting-out be made time-limited?
|
| WTF, what's so special about 3x a year? Is it because it's the
| magic number?
|
| Ah, the setting's gone again, I guess I can relax. I guess the
| market wanted this great feature, or else they wouldn't have
| gradually forced it on us. Anyway, you're a weird techie for
| noticing it. What do you have to hide?
| bigbuppo wrote:
| There is a big rationale behind it. If their AI investments
| don't pan out, Microsoft will cease to exist. They've been out
| of ideas since the late 90s. They know that the subscription
| gravy train has already peaked. There is no more growth unless
| they fabricate new problems for which they will then force you
| to pay for the solution to the problem they created for you.
| Oh, your children were kidnapped because Microsoft sold their
| recognition and location data to kidnappers? Well you should
| have paid for Microsoft's identity protection E7 plus add-on
| subscription that prevents them from selling the data you did
| not authorize them to collect to entities that they should know
| better than to deal with.
| nbngeorcjhe wrote:
| I don't even get why they would need "ideas" or "growth" tbh.
| They have the most popular desktop operating system and one of
| the most popular office suites; surely they make plenty of
| profit from those. If they just focused on making
| their existing products not shit, they would remain a
| profitable company indefinitely. But instead they're
| enshittifying everything because they want more More MORE
| Ekaros wrote:
| Because there are too many people chasing an ever-rising
| line on the valuation chart. It is simply not acceptable
| anymore to have a reasonable business that generates solid
| dividends and grows with expanding markets and population.
| Blame Silicon Valley, VCs and the like...
| A_D_E_P_T wrote:
| It really seems as though Microsoft has total contempt for their
| retail/individual customers. They do _a lot_ to inconvenience
| those users, and it often seems gratuitous and unnecessary. (As
| it does in this case.)
|
| ...I guess Microsoft believes that they're making up for it in
| AI and B2B/Cloud service sales? Or that customers are just so
| locked-in that there's genuinely no alternative? I don't believe
| that the latter is true, and it's hard to come back from a badly
| tarnished brand. Won't be long before the average consumer hates
| Microsoft as much as they hate HP (printers).
| bigbuppo wrote:
| This is once again strongly suggesting that Microsoft is
| thoroughly doomed if the money they've dumped into AI doesn't pan
| out. It seems to me that if your company is tied to Microsoft's
| cloud platform, you should probably consider moving away as
| quickly as you can. Paying the VMware tax and moving everything
| in-house is probably a better move at this point.
| syntaxing wrote:
| Growing up, Microsoft's dominance felt so strong. 3 decades later,
| there's a really high chance my kids will never own or use a
| Windows machine (unless their jobs give them one).
| einpoklum wrote:
| > I uploaded a photo on my phone to Microsoft's
|
| That's your problem right there.
|
| > Microsoft only lets you opt out of AI photo scanning
|
| Their _UI_ says they let you opt out. I wouldn't bet on that
| actually being the case. At the very least - a copy of your
| photos goes to the US government, and they do whatever they want
| with it.
| gessha wrote:
| This made me look up if you can disable iOS photo scanning and
| you can't. Hmm.
___________________________________________________________________
(page generated 2025-10-11 23:00 UTC)