[HN Gopher] Facebook is researching AI systems that see, hear, r...
       ___________________________________________________________________
        
       Facebook is researching AI systems that see, hear, remember
       everything you do
        
       Author : coldcode
       Score  : 141 points
       Date   : 2021-10-17 12:31 UTC (10 hours ago)
        
 (HTM) web link (www.theverge.com)
 (TXT) w3m dump (www.theverge.com)
        
       | fithisux wrote:
       | ... and don't forgive, unless you buy the "premium" version.
        
       | arpa wrote:
       | No, thank you. But I suspect people would still accept and use
       | that tech just like they have accepted always-online/always-
       | listening digital assistants.
       | 
       | Just imagine the possibilities of large-scale manipulations, tho!
        
         | fctorial wrote:
         | Do you use history in your shell?
        
           | CGamesPlay wrote:
           | I do, but I don't upload it to any other parties.
        
           | danShumway wrote:
           | What exactly is the comparison here?
           | 
           | Remembering my shell history for commands I run locally on a
           | single computer, stored in a format I can control/edit, that
           | is never transmitted to other people or used for advertising,
           | that is fed into a simplistic history algorithm I have
           | complete control over, and that I can toggle on and off at
           | will even for individual commands -- that's the same as a 3rd
           | party analyzing on remote servers everything I _look at_
            | while I'm walking around the real world?
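The local controls described above are standard bash facilities; a minimal sketch (bash-specific variable and option names, not portable to every shell):

```shell
# Shell history stays local, in a plain-text file the user controls.

# With ignorespace set, any command typed with a leading space is
# never written to history at all:
export HISTCONTROL=ignorespace

# Recording can be toggled off and on for the whole session:
set +o history   # stop recording commands
set -o history   # resume recording

# The history file itself can be inspected, edited, or wiped:
: "${HISTFILE:=$HOME/.bash_history}"
echo "$HISTFILE"
```

Nothing here leaves the machine unless the user ships the file somewhere themselves.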
        
       | jmfldn wrote:
       | When I read articles like this I experience a mini existential
       | crisis that we could be heading to some very dark places
       | technologically.
       | 
       | I don't need to go into the reasons why it's bad. For anyone
       | who's aware of the last 15 years of social media critique or
       | novels like Brave New World, it's obvious that this would be a
       | dystopia multiplier. Ad tech on steroids. If you think the
       | manipulation is bad now just wait until this data is fed into the
       | ML models.
       | 
       | If you must have everything you see, do and say recorded then
       | please, for the love of god, let's use non-profit, open source /
       | free software platforms where we own our own data.
        
         | still_grokking wrote:
         | I'm already looking forward to brain-internet interfaces.
         | 
          | Does FB also invest in those?
         | 
         | Just imagine, all the possibilities!
         | 
         | [Do I need to mark this as sarcasm?]
        
         | 908087 wrote:
         | As a society, we've been sleepwalking into a very dark place
         | technologically for well over a decade now, and those of us who
         | have tried to point that out have been routinely shouted down
         | and called luddites for it.
         | 
         | I fear we're far past the point of no return, and there is no
         | escape from the disaster we've created even for those of us who
         | have avoided participating ourselves.
        
         | radmuzom wrote:
         | In addition to being a "dystopia multiplier", I predict this is
         | going to cause a lot of social problems going forward. This is
         | what the Black Mirror episode "The Entire History of You"
         | warned us about.
        
         | sneak wrote:
         | In one generation in the US we went as a whole society from
         | most private p2p conversations not being logged and monitored
         | and recorded by the government, to most private p2p
         | conversations not only being logged and monitored and recorded,
         | but available to the state at any time without probable cause
         | or a warrant (FISA Amendments Act).
         | 
         | iMessage, WhatsApp, Gmail, Facebook Messenger, Instagram: all
         | of these are stored on the server effectively unencrypted (in
         | the case of iMessage and WhatsApp, via the chat backups, a
         | backdoor to the e2e encryption they promise). The state has
         | access to all of them at any time. To top it off the carriers
         | and a million different apps are constantly logging your
         | location history for all time, for every member of society.
         | 
         | If you think that isn't one of the most useful and powerful
         | databases in the history of mankind, I question your
         | imagination.
         | 
         | The massive damage, which may just be existential, is not
         | apparent to the USA just yet, but it will be eventually.
         | 
          | The balance of power has shifted more dramatically than just
          | about ever before, in any country in history. Alarm bells
          | should be going off left and right, but nobody seems to care
          | and it's just business as usual.
        
           | ianlevesque wrote:
           | WhatsApp just fixed that by encrypting its backups too.
           | https://faq.whatsapp.com/general/chats/how-to-turn-on-and-
           | tu...
        
             | _Understated_ wrote:
             | It's Facebook! How can you trust them?
             | 
             | If you honestly believe them then I've got a bridge you
             | might wanna take a look at...
        
             | still_grokking wrote:
             | The key handling is (as always with this big corp "crypto")
             | murky to say the least.
             | 
             | FB effectively still controls the encryption keys.
             | 
              | Coincidentally, none of the usual suspects complained that
              | "the terrorists are going dark".
              | 
              | Putting these facts together, I'm sure the crypto can be
              | broken at will.
        
             | sneak wrote:
              | Is it opt-in? If so, that means ~0% of people will be using
             | it, so all your chats will still be backed up non-e2e by
             | the other side of every conversation.
        
           | jmfldn wrote:
           | Agree. We desperately need something like Tim Berners Lee's
           | Solid. A way for us to take control from the centralised data
            | silos owned by big tech. I know it's almost mission impossible
           | to turn the tide, but if the alternative is giving in then
           | engineers worried about this must fight for alternatives,
           | however grim the odds.
           | 
           | As for this specific issue, of course there are the usual
           | massive issues around data privacy. The major issue I was
           | alluding to here is ad tech AI processes being fed an
           | unimaginably rich source of intimate data. It's a horrifying
           | prospect.
        
           | 01100011 wrote:
           | > most private p2p conversations not being logged and
           | monitored
           | 
           | FWIW, I think telephone company billing records have been
           | doing this for nearly a century. Sure, it wasn't the official
           | government policy, but the data was recorded. ECHELON has
           | been around quite a long time as well, and that was indeed
           | government monitoring of large amounts of international
           | communications.
           | 
           | I'm not trying to normalize it, just adding some context.
        
             | sneak wrote:
             | I meant content, not metadata.
             | 
             | Facebook Messenger, WhatsApp chat backups, iMessage chat
              | backups (incl. all geotagged photo attachments), etc.
        
               | [deleted]
        
               | dredmorbius wrote:
               | Sure.
               | 
               | Though metadata is often more useful than content. And
               | the point is that telco providers, government agencies,
               | and all those who've hacked into them, know of phone
               | calls made from numbers you no longer remember you had.
        
             | dredmorbius wrote:
             | I'm ... "only" ... aware of US call history data being
             | available dating to the 1980s, through the "Hemisphere"
             | programme:
             | 
             | https://www.eff.org/cases/hemisphere
             | 
             | There may well be earlier extant data. It was probably
             | tabulated either on punch-card or magnetic tape, and
             | certainly _was_ used for billing purposes. Prior to the
             | 1980s, large-scale data preservation was quite expensive.
             | 
             | If anyone has information of earlier comprehensive call
             | history data surviving to the present, I'd appreciate being
             | illuminated.
             | 
             | More citations of Hemisphere and call history data in an
             | earlier comment:
             | https://news.ycombinator.com/item?id=22208434
        
           | LuisMondragon wrote:
           | AFAIK the chat backup is optional. I've never stored a
           | backup.
        
             | sneak wrote:
             | Everyone you chat with does (because it's on by default),
             | making your choice irrelevant.
        
         | [deleted]
        
         | decasteve wrote:
         | Social media have become Bradbury's parlour walls.
        
       | mgreb wrote:
        | Wait, are such devices even legal? Looks like a walking spy-cam
        | to me. I want to know if someone records video or takes a picture
        | of me. With these glasses you can't tell if you are being filmed.
        
         | blackoil wrote:
          | Is it illegal to shoot videos in a public place without the
          | permission of all present? Private places may have their own
          | rules. Also, this is R&D; it's a long way before such questions
          | matter.
        
           | jjulius wrote:
           | Nah, these questions matter now.
        
           | picardythird wrote:
           | Don't you think answering these questions should guide the
           | R&D?
        
           | de_keyboard wrote:
           | In most western countries it is legal. I'm not sure it should
           | be, given recent technological advances.
        
         | KaiserPro wrote:
          | Yes, there is a "right to panorama" in most countries. However,
          | this dataset has been recorded with the _express permission_ of
          | all the people involved, which is different from ImageNet,
          | where they just crawled a whole bunch of images and called it a
          | day.
          | 
          | Second, you need to take the "argh, it's Facebook, boo hiss"
          | hat off and apply some critical thinking. An AR world that is
          | connected to the physical one _requires_ this kind of thing.
          | Your device needs to anticipate what you are going to do so
          | that it can work out the probability of your need and act on
          | it. Like Jeeves, but less able, and more annoying. As it's ML,
          | it needs a massive dataset to train on; this is <0.5% of that
          | dataset.
         | 
          | Depending on how things are done, if Facebook is first to
          | market with a usable AR system, they will be forced to have
          | anonymisation built in (as in removing faces at the sensor
          | level, unless you have permission to remember). Apple will have
          | a showy "pixelation" layer that is mostly ineffectual, but will
          | PR it out to make it seem like they've cured cancer. Google, if
          | they ever manage to get back into AR, will just make things
          | cheap and let the shitty Android marketplace spy on whatever
          | they like.
         | 
          | You will also have to remember that the power budget of these
          | glasses is absolutely fucking tiny. All-day screen, SLAM, AI
          | and possibly music will all need to fit into ~2-5 watt-hours.
          | This means that virtually everything will need to be on-device
          | (offloading to the cloud eats a boatload of power).
         | 
          | Now that's not to say that 100% accurate [if that's even
         | possible] "diarisation" won't rip society apart.
         | 
          | There are lots of questions that need to be answered; the
          | problem is, tech journalists are ill-equipped to ask them. Most
          | of the teams designing the glasses are well-meaning, but their
          | targets and life experience have not equipped them to do a good
          | job at ethics.
        
       | TeeMassive wrote:
       | > Episodic memory: What happened when (e.g., "Where did I leave
       | my keys?")?
       | 
       | > Forecasting: What am I likely to do next (e.g., "Wait, you've
       | already added salt to this recipe")?
       | 
       | > Hand and object manipulation: What am I doing (e.g., "Teach me
       | how to play the drums")?
       | 
       | > Audio-visual diarization: Who said what when (e.g., "What was
       | the main topic during class?")?
       | 
       | > Social interaction: Who is interacting with whom (e.g., "Help
       | me better hear the person talking to me at this noisy
       | restaurant")?
       | 
        | Now consider the fact that Facebook is actively suppressing some
        | subjects from popping up in their users' feeds. If users become
        | reliant on this AI to initiate, guide and execute most day-to-day
        | tasks, then Facebook will literally have the power to make
        | certain aspects of their users' lives disappear.
       | 
       | For example, you have a contact who criticizes Facebook or their
       | business partners too much? Don't remind them of their birthday.
        
       | dylan604 wrote:
       | It just seems like the people at FB have watched Black Mirror and
       | decided they want to do all the things they saw.
        
         | shimonabi wrote:
         | That "social credit" episode was really creepy.
        
           | [deleted]
        
       | hetspookjee wrote:
        | Sometimes I dream of a Facebook machine, i.e. the eerily creepy
        | recommender for your next best action, but with an objective
        | function that does not optimise the recommender's revenue (by
        | being as effective an ad seller as possible) but instead
        | optimises for health, wellbeing and happiness. Now, surely it is
        | far from trivial to define these, but at least you can make a
        | start somewhere:
       | 
        | - You could factor in the environmental cost of each
        | recommendation and have that drive down its score: I think the
        | idiotic anti-environmental "fast-fashion" trends would die out
        | fast then.
       | 
        | - You could factor in, based on some overarching ethos, that
        | each person needs to eat healthily, and recommend the right
        | healthy food at the right time instead of recommending the next
        | McDonald's burger to an already obese person.
       | 
        | - Take into consideration, based on studies, what actual healthy
        | behaviour looks like, and try to fit that to a person, with
        | respect to their actual persona and state of health (which are
        | obviously known already, given that every bit of you is pretty
        | much out on the street already).
       | 
        | I could go on, but in truth, a personal recommender that knows
        | me better than I know myself, and optimises towards happiness
        | and greatness instead of the amount of shit sold as it currently
        | does, would already be a tremendous improvement. Surely this
        | idea is far from original, but I hope we someday get such a
        | thing -- at least a thing that stops recommending
        | environmentally destructive behaviours like buying so much stuff
        | I don't really need.
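The re-weighted objective described in the bullets above can be sketched as a toy scoring function; all item fields, weights, and numbers here are invented for illustration, not anyone's actual recommender:

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    engagement: float    # predicted click/watch probability, 0..1
    env_cost: float      # estimated environmental cost, 0..1
    health_score: float  # estimated health benefit, 0..1

def score(item: Item, w_engage: float = 0.3, w_env: float = 0.4,
          w_health: float = 0.3) -> float:
    """Blend engagement with wellbeing terms instead of revenue alone:
    environmental cost pulls the score down, health benefit pushes it up."""
    return (w_engage * item.engagement
            - w_env * item.env_cost
            + w_health * item.health_score)

items = [
    Item("fast-fashion haul video", engagement=0.9, env_cost=0.8, health_score=0.1),
    Item("home-cooking tutorial", engagement=0.6, env_cost=0.1, health_score=0.8),
]
ranked = sorted(items, key=score, reverse=True)
print([i.name for i in ranked])
# -> ['home-cooking tutorial', 'fast-fashion haul video']
```

With a pure-engagement objective (w_env = w_health = 0) the ranking flips, which is the whole point of the comment: the hard part is choosing and defending the weights, not the arithmetic.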
        
         | istorical wrote:
         | I find it absolutely painful that our recommender ML based
         | content discovery streams like YouTube and Tiktok and Instagram
         | don't have multiple 'hats' you can put on and take off, in the
         | sense of saying 'hey TikTok, I'm feeling frisky, feel free to
         | send me those thirst trap videos now', but then 30 minutes
         | later you can say 'OK TikTok, I know I watched a bunch of those
         | dancing videos, but can you send me the educational stuff now'.
         | Ditto IG, YouTube etc. Like 'hey YouTube, engage education
         | mode', 'hey YouTube, engage self-improvement mode', 'hey
         | YouTube, I just wanna relax and laugh'.
         | 
         | I know that there are categories, even auto-generated
         | categories on YouTube, but they aren't that great or all
         | encompassing. My biggest critique of present personalized ML
         | content streams is that there's no two-way communication. I
         | can't tell IG that yes I wanna see big booties right now, but
         | no I don't want to see them five hours from now. Or I want to
         | see aquaponics right now since I'm on a learning binge, but
         | later tonight I don't need more technology and science, I just
         | need to laugh and unwind.
         | 
         | Our actions, our content consumption, is not allowed to be one
         | and done. Every move you make is seen as a signal to the
         | algorithm that you want more of that all the time. The closest
         | thing we have is incognito mode, or maintaining multiple
          | accounts for different purposes. But why can't you look at an
          | ex on Instagram one time without it showing you that person
          | every time you open the app for 5 years? It's really toxic and
          | unsustainable that all of your actions are seen as 'yes, please
          | more of this all the time'.
        
           | jorpal wrote:
           | Anyone remember StumbleUpon? You could easily edit your
           | interests and get different types of content. It was the
           | pinnacle.
        
           | clairity wrote:
           | > "...in the sense of saying 'hey TikTok, I'm feeling frisky,
           | feel free to send me those thirst trap videos now'..."
           | 
           | i once pitched this type of user experience (for the general
           | case, not tiktok, which didn't exist at the time) at a
           | hackathon competition, and it was popular enough then to win
           | us some free stuff. the demo we hacked together was very
           | rudimentary, but had we gotten a little further along with a
           | proof of concept before falling apart, who knows what might
           | have happened...
           | 
           | the biggest challenge was actually licensing costs and rights
           | management, not the recommendation engine (though that was
           | challenging too).
        
         | mertd wrote:
         | > changing the objective function to optimise for health,
         | wellbeing and happiness.
         | 
         | Whose? Individual or group wellbeing? Optimizing for one can
         | lead to very suboptimal outcomes for the other.
        
           | tomjen3 wrote:
            | This feels like the argument about whether the self-driving
           | car should hit two grandmas or the guy in the car. The
           | discussion is not productive, because the answer that saves
           | the most people is to get self-driving cars on the streets as
           | soon as possible.
           | 
           | As for the answer to your question, just go for best
           | wellbeing for the individual. Happier, healthier, less obese,
           | fitter, non-smoking people are also a net benefit to society.
           | 
           | Assuming the technology works at all.
        
         | SamoyedFurFluff wrote:
         | Unfortunately I think we live in a society where the people who
         | could use this the most are the people least able to afford the
         | costs of such a service.
        
         | [deleted]
        
         | dado3212 wrote:
         | FB has gotten roasted for doing emotional manipulation in the
         | past: https://www.forbes.com/sites/kashmirhill/2014/06/28/faceb
         | ook.... Not confident that the company trying to specifically
         | target emotions (even ostensibly happiness) would be taken that
         | well.
        
           | Frost1x wrote:
            | I'm not sure it would matter. The general population that
            | composes Facebook's revenue stream doesn't seem to care. We
            | keep hinting at these ethical conundrums as if they invisibly
            | regulated massive businesses when they simply don't. It's not
            | just tech, it's big business in general. The ethics only
            | matter if a large enough portion of the consumer and product
            | base both cares and acts. Only demand matters, and if demand
            | doesn't care, or you can manipulate demand not to care, you
            | can get away with just about anything that isn't explicitly
            | illegal.
           | 
            | I suspected Facebook did the work you referenced, but I had
            | no idea they actually had done it and caught flak for it.
            | I'd like to think I'm fairly tech-savvy and an informed
            | consumer (especially around issues like this); maybe I'm
            | not, but if I am, what hope is there of the general
            | population caring when they aren't even aware? That's the
            | first hurdle, before the other massive hurdle of getting
            | them to care enough to act.
           | 
           | The fact is that this type of work isn't regulated. Funding
           | agencies regulate human subject work for your typical public
           | funded research work and require a lot of informed consent.
           | The leverage they use is that should you violate these terms,
           | you may have current and future funding pulled, may be black
           | listed across multiple agencies, and may find it difficult to
           | pursue any career in research in the future involving human
            | subjects. There are lots of disincentives not to do this, on
            | pain of losing your resources.
           | 
           | Meanwhile, entities like Facebook have piles of such
           | resources and have a vested interest in all sorts of this
           | questionable work. Pretty much nothing prevents them from
            | doing it, since they're self-funding. Unless you can remove
            | their ability to self-fund, it will continue. This goes back
            | to the problem of needing consumers to vote with their
            | wallets in an effective manner.
           | 
            | The other option is to create explicit policy protections in
            | law to regulate these activities, but as a society, capital
            | has convinced everyone that all regulation is evil and will
            | only hinder consumer/citizen progress. You need not only
            | policy but real enforcement teeth as well, creating
            | disincentives catastrophic enough that the risk isn't worth
            | the potential gains.
        
           | hetspookjee wrote:
            | I do not believe Facebook as a company would be capable of
            | creating such a thing, and if they could, I can only be
            | cynical about it. I'd rather put my faith in the open-source
            | community, where people tinker with their own algorithms and
            | provide their own data. I am sure the first initiatives
            | around personal recommenders are already on their way, but I
            | don't know of any. If anyone does, please recommend :)
        
         | rumblerock wrote:
          | I certainly wish this were possible, and people who left would
          | probably go back to the platform if it reformed itself in this
          | direction. But as the years go on, all I see are more and more
          | catastrophic impacts from short-term thinking and near-term
          | profit generation, whether it's Facebook, climate, VC-funded
          | startups or the current supply chain crisis.
         | 
         | Perhaps the public consciousness around the unhealthy design of
         | existing social media is approaching a point where a new
         | Facebook-like social network could actually achieve a critical
         | mass of users. On the order of years I think it'll happen (like
         | rates of smoking cigarettes), but how long that will be who
         | knows. I can only hope it comes in the next few years as I fear
         | the destructive ripple effects through culture and society are
         | only getting worse.
        
       | moogly wrote:
       | WW3 might be against Facebook.
        
       | kwertyoowiyop wrote:
       | All I need is to see people's names right above their heads.
        
         | jstx1 wrote:
         | It should only cost you half of your remaining life span.
        
         | [deleted]
        
       | pesenti wrote:
       | I support Facebook AI. The actual news is that we are releasing
       | an extensive egocentric dataset with associated tasks for the
        | research community. Collecting such a dataset in a private and
        | responsible way is a significant challenge. Happy to answer any
       | questions.
       | 
       | https://ai.facebook.com/blog/teaching-ai-to-perceive-the-wor...
        
         | amelius wrote:
         | Glad for you that you collected such a nice dataset.
         | 
         | Wouldn't it be nice if selected external researchers too could
         | collect data on the Facebook platform with the explicit consent
         | of users?
        
           | pesenti wrote:
           | Here is a link to all the tools we provide:
           | https://research.fb.com/data/
        
         | knownjorbist wrote:
         | Maximizing a manipulation engine is not what we as a species
         | need to be investing any time or effort into.
        
         | Hnrobert42 wrote:
         | What are you all doing to allow those of us that don't want to
         | be a part of your utopia to opt out?
        
           | pesenti wrote:
           | I encourage you to read the details of the news, in
           | particular how the data was collected with full consent from
           | all (something that most research datasets don't do well). It
           | doesn't answer your question directly but gives you an idea
           | on how we think about these issues. AI-powered wearables will
           | have stringent privacy guarantees, both for the wearer and
           | the people around.
        
             | thr0wawayf00 wrote:
             | HN user asks a simple and direct question to Facebook's VP
             | of AI:
             | 
             | > What are you all doing to allow those of us that don't
             | want to be a part of your utopia to opt out?
             | 
             | Answer from Facebook VP of AI:
             | 
             | > I encourage you to read the details of the news that
             | doesn't actually answer your question but gives you an idea
             | of how we think
             | 
              | I'm glad to see that even on an industry-specific forum
             | Hacker News, we can't get straight answers from people like
             | this. Giving people "an idea of how we think" seems like a
             | great way to stay vague enough that your thinking can
             | change however it needs to over time to fulfill your
             | business objectives, health or safety of the community be
             | damned. Facebook doesn't need to give us vague "ideas", it
             | needs to provide straightforward answers to straightforward
             | questions.
             | 
             | > AI-powered wearables will have stringent privacy
             | guarantees, both for the wearer and the people around.
             | 
             | Reading a sentence like this from a Facebook exec
              | completely strains credulity for me. I have zero
             | confidence that your definition of "stringent privacy
             | guarantees" is anything close to something that actually
             | benefits the community as a whole.
        
               | pine390 wrote:
                | This reminds me of Squid Game, when they say "You
               | signed a disclaimer of physical rights today, didn't
               | you?"
        
               | dang wrote:
               | Please don't attack people like this when they post to
               | HN, regardless of how you feel about their employer. You
               | may not owe $BigCo better, but you owe this community
               | better if you're participating in it. When something
               | remains unanswered, it's enough to ask substantive
               | questions respectfully.
               | 
               | We don't want HN to be a hostile place that tars-and-
               | feathers people who show up to explain another side of a
               | story--especially not on a topic they know about. For
               | most of us, our work is what we know the most about.
               | Therefore, people showing up to discuss something related
               | to their work are among the highest-value contributors HN
                | can have. We don't want to disincentivize that, and
               | attacking them for it goes directly against the mandate
               | of the site. You don't have to agree, obviously, but you
               | do have to stick to the site guidelines. (Btw, that also
               | means that you shouldn't be posting generic-indignant
               | rants here.)
               | 
               | https://news.ycombinator.com/newsguidelines.html
               | 
               | https://hn.algolia.com/?query=disincent%20by:dang&dateRan
               | ge=...
               | 
               | Edit: I just noticed that you got much nicer later in the
               | thread. That's way better--thanks.
        
               | still_grokking wrote:
               | > "gives you an idea of how we think"
               | 
               | It's the usual trick to be able to say afterwards: "I
               | didn't say _that_! You just made up some interpretation.
                | _I'm_ not responsible for _your interpretation_."
        
               | saiya-jin wrote:
                | I mean, what do you expect? This is a public forum; he
                | comes honestly, unanonymously, in front of a huge crowd
                | already in a _very_ negative mood towards this. You can't
                | be a VP of such a thing in one of the richest companies
                | ever, and not know how to play politics and lawyer
                | safely.
               | 
                | pesenti - I respect that you come forward like this,
                | attempting to push mankind's boundaries. Unfortunately,
                | FB is probably the second-worst company ever (the first
                | might be Palantir) to take this on. I don't trust them
                | an attometer, and never will. Any attempt to change this
                | attitude always ends badly; see e.g. Carmack.
               | 
               | Very mixed feelings is probably the best description I
               | can give.
        
               | thr0wawayf00 wrote:
               | I expect that those who have power should bear the
               | responsibility of being asked hard questions and
               | answering them honestly, because a society which
               | discourages us from asking difficult questions of those
               | in power is doomed to fail.
        
               | pesenti wrote:
                | The question was about opting out of our "utopia"; I
                | wouldn't call that a "straightforward question". If you
               | do have a straightforward question, especially about this
               | or another specific project, I can try to provide a
               | straightforward answer.
        
               | thr0wawayf00 wrote:
               | Great! How do I make sure that nothing about me ever
               | passes through a Facebook server/ETL job/datalake/neural
               | net/whatever? Can that even be guaranteed? Is Facebook
               | generating a shadow profile on me or using data I've
               | generated either myself or through friends that have
                | taken pictures/videos/etc of me, and if so, how do I
               | opt out?
               | 
               | Facebook just released Ray Ban smart glasses that don't
               | look any different from standard Ray Bans and don't
               | provide a solid design affordance to communicate to
               | others when they're active, so I don't really get the
               | feeling that Facebook cares at all about who is being
               | swept up in their systems.
        
               | pesenti wrote:
               | No, that's not a reasonable assumption. Photos or videos
               | of you are likely on Facebook's servers, uploaded by
               | friends or others when you are in public.
               | 
               | What I can assure you though is that if you are not a
               | user, we are not recognizing you in these pictures and
               | videos. We won't do that without explicit consent from
               | users.
        
               | thr0wawayf00 wrote:
               | Thanks Jerome, I really do appreciate your answer and I
               | know I'm probably not the most fun person to talk to.
               | 
               | I do have a follow-up on this if you'll entertain me: why
               | is it that Facebook stores content of people that it
               | doesn't recognize?
               | 
               | Granted, I'm but a lowly web developer, but it seems like
               | creating business logic that automatically removes
               | content with people that aren't Facebook users would be
               | pretty straightforward to implement. You've already
               | solved the hardest part of that problem, the facial
               | recognition, so why not go all the way?
               | 
               | Moving forward, you'd have a really simple approach to
               | privacy that's transparent and people understand without
               | needing to get into the weeds.
               | 
               | Receiving assurance that I'm not being recognized in
               | photos and videos isn't very comforting when I see
               | Facebook releasing products like the "smart" Ray Bans.
               | Recognizing people in images is only one of many types of
               | data that Facebook gleans from that content, and I don't
               | want anything involving me being processed in any way by
               | that company, whatsoever.
        
               | still_grokking wrote:
               | > Recognizing people in images is only one of many types
               | of data that Facebook gleans from that content, and I
               | don't want anything involving me being processed in any
               | way by that company, whatsoever.
               | 
               | Which is a basic right under GDPR.
               | 
               | No private entity is allowed to store or process data
               | about you for their purposes without your consent.
               | 
                | Did FB just say they are doing it, though?
        
               | still_grokking wrote:
                | What does "we are not recognizing you" mean? Of
                | course you can't "recognize" someone you don't know,
                | so what are you trying to say here?
                | 
                | Do you gather data about those "unknown" persons?
                | 
                | Do you try to match information about "unknown"
                | persons from different sources?
        
               | wonderwonder wrote:
               | Thanks for answering questions. In regards to this. If I
               | have a Facebook account, and then close it, is my
               | existing data in Facebook's systems erased including
               | image recognition / tracking based preferences, etc?
        
             | jjulius wrote:
             | >I encourage you to read the details of the news, in
             | particular how the data was collected with full consent
             | from all (something that most research datasets don't do
             | well).
             | 
             | I don't care to hear about the consent you received for
             | this controlled study. I care to hear about how I/we can
             | opt-out from something like this when it is eventually
             | deployed in the wild en masse as part of an actual product.
             | I'm pretty sure that's what OP was also asking about. Can
             | you elaborate on that?
        
               | pesenti wrote:
               | It's hard to elaborate on something that's not developed
               | yet.
               | 
               | Here are the principles we are using when developing
               | these products:
               | 
               | https://about.facebook.com/realitylabs/responsible-
               | innovatio...
               | 
               | In practice, it will mean that some things you will know
               | about, but don't necessarily give consent (i.e., someone
                | taking a picture or a video of you), while others will
               | likely require consent (i.e., recognizing you in these
               | pictures/videos).
        
               | still_grokking wrote:
               | > In practice, it will mean that some things you will
               | know about, but don't necessarily give consent (i.e.,
               | someone taking a picture or a video of you)
               | 
                | This would be outright illegal in the EU.
                | 
                | Even if someone is taking photos / videos of me, this
                | person is not allowed to share them with third parties
                | without my consent.
                | 
                | If this content is going straight to FB it wouldn't be
                | legal in the first place.
               | 
               | Of course FB is fine with that as they just shift the
               | responsibility for the illegal actions onto the person
               | who is uploading things without consent. (Same "trick" as
               | with phone contacts upload).
               | 
               | FB is using a legal loophole here. Nobody will sue his /
               | her friends. And even if someone tried, it's after the
               | fact anyway: FB can't be forced to "unlearn" the gathered
               | information.
        
             | mdoms wrote:
             | Do you think that most people believe Facebook privacy
             | guarantees are worth the bits they're written in?
        
               | pesenti wrote:
               | No but they should:
               | 
               | https://www.ftc.gov/news-events/press-
               | releases/2019/07/ftc-i...
        
               | prox wrote:
                | Facebook got a penalty, but the main gripe is probably
                | that it shouldn't have been necessary to give one in
                | the first place. I think what is on everyone's mind
                | is: "how can we trust a company that has a bad track
                | record?"
                | 
                | I think in the coming years FB should invest more in a
                | public conversation on these issues. How can we meet
                | the future in a way that our data is, in actuality,
                | our data, even though it's stored by a third party
                | such as yourself?
                | 
                | Especially in light of the FB model, where the user
                | seems to be the product, and advertisers the client.
        
               | pesenti wrote:
               | My point in linking to the settlement wasn't about the
               | fine, it was about the guarantees, oversight, and
               | accountability that came with it. Read the FTC news
               | release, and you'll see that these are pretty extensive.
               | 
               | And I agree completely on the public conversation, this
               | is why we released this dataset and why we are doing
               | project Aria.
        
               | prox wrote:
                | Understood, and I reread the article just in case. My
                | point is/was that being privacy-sensitive must come as
                | a natural impulse. Hopefully privacy becomes part of
                | the culture / DNA of Facebook, not just because the
                | FTC and the privacy board are looking over your
                | shoulders, so to speak :)
               | 
                | How do we prevent becoming "walkable surveillance
                | machines"? How can we control the scope and the issue
               | of consent? These are more rhetorical questions, but
               | hopefully it is part of the dialogue internally at FB.
        
               | pesenti wrote:
               | These are great points. And yes, we'll only get there if
               | it becomes part of the culture, which this is trying to
               | communicate:
               | 
               | https://about.facebook.com/realitylabs/responsible-
               | innovatio...
               | 
                | but I don't expect you to take it at face value yet;
                | we have a long way to go.
        
             | picardythird wrote:
             | One of the features outlined in the article for this post
             | was for the glasses to remember things that other people
             | said. How would such other people who don't want to be
             | recorded for these purposes opt out?
        
               | KaiserPro wrote:
               | This is a static dataset. You'll want to ask that
               | question about "Project Aria".
               | 
                | For example, can FB provide a list of recording
                | locations and times, so I can request that my
                | image/audio be removed?
                | 
                | How good is Facebook's anonymisation system? What % of
                | faces can be removed (on average)?
                | 
                | Has every single piece of footage captured with Aria
                | been anonymised?
        
               | pesenti wrote:
               | From https://about.facebook.com/realitylabs/projectaria/
               | 
               | "Participants will only record in either Facebook offices
               | (once they reopen), wearers' private homes (with consent
               | from all members of the household), or public spaces, and
               | won't record in private venues without written consent
               | from such places. Before any information gathered in a
               | public place is made available to our researchers, it
               | will be automatically scrubbed to blur faces and vehicle
               | license plates."
               | 
               | So we anonymize all the content collected in public. I
               | don't have the stats for the face blurring algorithm, and
               | while you can ask for your data to be removed, we don't
               | provide locations/times. These are good suggestions
               | though.
        
               | still_grokking wrote:
               | > So we anonymize all the content collected in public. I
               | don't have the stats for the face blurring algorithm
               | [...]
               | 
               | I see some contradiction here.
               | 
               | You don't know "the stats for the face blurring
               | algorithm" but you're saying you "anonymize all the
               | content collected"?
               | 
                | If the stats don't say it's 100% (which is impossible,
                | afaik, if done by machines), you obviously don't
                | anonymize all the content collected up front.
        
               | pesenti wrote:
               | That's a good question and I don't think we have all the
               | answers yet. But I expect that many functionalities will
               | need to request and get consent from 3rd parties.
        
               | still_grokking wrote:
                | If you can't answer this particular question after
                | you've already advertised that feature, there are only
                | two possible explanations for this knowledge gap:
                | 
                | 1. You don't think upfront about the consequences your
                | products have for people's privacy.
                | 
                | 2. You don't care about the consequences your products
                | have for people's privacy, and leave the particular
                | details of how such products could still be
                | distributed legally to the lawyers.
                | 
                | Which of these explanations should we prefer?
        
               | pesenti wrote:
               | Which feature are you talking about? This is a research
               | project, not a product. The point of research projects is
               | to investigate and figure out answers we don't have
               | today.
        
               | still_grokking wrote:
               | > One of the features outlined in the article for this
               | post was for the glasses to remember things that other
               | people said.
               | 
               | https://news.ycombinator.com/item?id=28898645
               | 
               | So just to be sure: Is this feature one of the goals of
               | this research, yes or no?
               | 
               | From the same post:
               | 
               | > How would such other people who don't want to be
               | recorded for these purposes opt out?
               | 
                | To make the obvious very explicit: The previous
                | question (which you just praised as a "good question")
                | is about this feature. Wasn't this clear to you until
                | now? I'm wondering. This is a simple conversation
                | thread, not hard to follow.
        
             | rapnie wrote:
             | > I support Facebook AI.
             | 
              | As VP that doesn't surprise me. The problem is trust, and it
             | is continuously eroding further and further the more we get
             | to know how your company operates.
        
             | dabbledash wrote:
             | >> It doesn't answer your question directly
             | 
             | Then how about you try again and answer it directly this
             | time?
        
         | dannykwells wrote:
          | This comment needs a serious disclaimer that you are a VP
          | of AI at Facebook. It's implied in the above, but such a
          | serious COI requires acknowledgement.
        
         | nbzso wrote:
          | I support a free and open web. We are at a defining point
          | in time, in which a possible tech dystopia is upon us. No
          | paycheck or "advancements" will undo the dangers and
          | damages done by the greed and machinations of a few too-
          | big-to-fail companies. No dataset or "business" is a
          | rationalization for creating a system of data-hoarding and
          | data exploitation.
        
         | dredmorbius wrote:
         | What is your price?
         | 
         | Who knows it with greater accuracy, you, or FB?
        
         | [deleted]
        
       | blunte wrote:
       | Regardless of who provides this, and even irrespective of
       | marketing concerns, there most likely will be data leaks/theft.
        | Imagine having audio+video recordings of what you see and
        | what you say in the hands of the wrong people. And "the
        | wrong people" could be a wide array of different parties,
        | from banks to police to political enemies to nosy neighbors
        | and beyond.
       | 
       | Granted, many of us already collect and give away significant
       | amounts of very personal data, some of which regularly gets
       | leaked or stolen; but first person video and audio recordings?...
       | scary.
       | 
       | And then with Facebook... I trust Zuckerberg with my life data as
       | much as I trust his choice in hairstylists.
        
       | [deleted]
        
       | goldenkey wrote:
        | It's so strange how every time I get a Facebook link from
        | someone, it requires me to log in to read more than 1/4 of
        | it (and it's cut off at just the right place to entice), and
        | then before I know it, I'm looking at a bunch of silly
        | unread notifications and terribleness in my feed. It's
        | strange, like realizing I was hypnotized for a second. I
        | immediately log out and close the window as I snap out of
        | it. Facebook is truly... a cancer. Don't give me the whole
        | song and dance about how their targeted ads help small
        | businesses. 99% of the stuff I've seen on FB, business-wise,
        | is ads for shady products, scams, or schemes.
       | 
       | They took a truly golden opportunity to connect people and
       | ravaged it, then turned to acquiring every possible competitor
       | they could, as their main product turned into hot garbage.
       | 
        | Even AOL garners more of my respect than FB; how is this
        | even possible?! FB needs to die like AOL died. It'll only
        | take the boomer generation to die out. Unfortunately,
        | Instagram is still a
       | decent property, probably because they didn't let Zuckerfuck fuck
       | it up. Expect Instagram to become cancer too once FB is no longer
       | the main profit center.
        
       | achenet wrote:
       | This reminds me of the Ted Chiang story The Truth of Fact, the
       | Truth of Feeling.
        
       | throwawaymanbot wrote:
       | Of course they are... I wonder if they are doing this already and
       | are trying to find a way to justify it by saying.. ohhh it was
       | the AI... not us...
        
       | fergie wrote:
       | There is no way that this will ever be legal in the EU.
        
       | h2odragon wrote:
       | Apparently whoever paid for the most recent "Facebook is Bad"
       | media hate package, only bought the 3 day ticket. Cheap bastards.
        
         | knownjorbist wrote:
         | Facebook is consciously aware of the fact that they're
         | optimizing a manipulation engine that has outsized net-negative
         | effects on societies worldwide.
        
         | lucasverra wrote:
         | Who would benefit from that service? Maybe PR services
         | companies? "See what is happening to Fb, we can help you avoid
         | that Mr other dodgy Tech"
         | 
         | Who else?
        
           | blackoil wrote:
           | Traditional Media. Politicians. Competing Social Networks.
           | 
            | Also other tech companies, as FB is sucking up all the
            | media/political/people's energy and the rest can ride
            | the current wave unharmed.
        
             | faeyanpiraat wrote:
             | What social networks are you referring to?
        
               | pessimizer wrote:
               | Not worth asking. It's part of a bizarre theory that Big
               | Media is not only a single cabal, but that cabal is
               | constantly whiteboarding attacks on the tech companies
               | that have bought large pieces of it.
               | 
               | I think it's for two reasons: 1) it's scarier to think
               | that Big Media and Big Tech are getting along swimmingly
                | despite a few minor conflicts and necessary kayfabe for
               | the plebs, and 2) people who do horrible things for money
               | in Big Tech want to feel better about themselves, and
               | want people to think well of them (not like they think of
               | people who worked at AIG, Arthur Andersen, or
               | Countrywide.)
        
       | sabujp wrote:
       | if you use fb, unfollow everything and everyone and then either
       | selectively follow what you like or don't ever follow anyone
       | again :
       | https://gist.github.com/renestalder/c5b77635bfbec8f94d28#gis... ,
       | it will greatly reduce your time on fb.
        
       | sys_64738 wrote:
       | I hope it remembers why I deactivated my FB account.
        
         | mlok wrote:
          | When you deactivate your FB account they keep all your
          | data. I hope you know that. Deleting your account should
          | delete your data, though. (But I don't 100% trust that
          | they do. And maybe some national agencies keep a copy.)
        
         | dataviz1000 wrote:
         | So you are saying that you hope it remembers why you went to
         | https://www.facebook.com/deactivate where you deactivated your
         | account without deleting your profile?
        
         | [deleted]
        
         | [deleted]
        
         | Gys wrote:
         | You will be replaced internally by the same AI impersonating
         | you. To keep the ads running.
        
         | hungryforcodes wrote:
         | I wish you so many upvotes for this comment.
        
           | throwinawaysoon wrote:
           | upvotes: thoughts and prayers of the internet
        
             | tomjen3 wrote:
             | Maybe, but they make people feel a little better and cause
             | no lasting harm in small doses, so why not?
        
             | hungryforcodes wrote:
             | Amen.
        
             | vstm wrote:
             | Maybe Facebook can create an AI that also sends thoughts
             | and prayers to whoever needs it. It's also probably easier
             | because most people remember why they need the thoughts and
             | prayers.
        
               | klyrs wrote:
                | Only God can read the inputs to /dev/null:
                | 
                |     yes "Dear $diety please protect this machine and
                |       save this lowly process from the OOM killer" > /dev/null
                | 
                | You don't need facebook for that.
        
       | ToddWBurgess wrote:
       | And this is why you need ad blockers and privacy protection
       | plugins for your web browser. If you can't stop Facebook snooping
       | on you at least make it hard for them. As my grandfather taught
       | me, "if you can't win then make it hard for the other guy to
       | win."
        
         | kelseyfrog wrote:
          | I don't like this pattern in calls to action. The pattern
          | starts with an implicit hopelessness about effecting any
          | external change and results in all responsibility being
          | assumed by the individual.
          | 
          | This is an antipattern of change. It does not need to be
          | perpetuated, because it reproduces the permanence of the
          | constructs we want to dismantle. Even more importantly,
          | these entities are fully aware of this and craftily guide
          | the masses toward individual responsibility as a default,
          | knowing that it won't effect real change. The antidote to
          | this is critical collective action, which recognizes the
          | impossibility of change as itself an impossible stance to
          | take.
        
           | pessimizer wrote:
           | Change isn't caused by strength of belief; it's _magic_ that
           | only works based on how hard you believe in it.
           | 
           | Collective action doesn't do itself, and the words
           | "collective action" don't constitute a plan. Individual
           | evasions at least make things more expensive to do.
           | 
           | Also, individual action doesn't crowd out collective action.
           | Not protecting yourself doesn't create collective action. A
           | concrete plan with concrete individual actions to take
           | creates collective action. Individual sacrifice to create
           | institutions creates collective action. IMO you should be
           | ready to tell people where to show up before you tell them
           | not to help themselves.
        
       | jensensbutton wrote:
       | Anyone mad at the universities that actually collected this data?
       | Or just Facebook?
        
       ___________________________________________________________________
       (page generated 2021-10-17 23:01 UTC)