[HN Gopher] AdObserver Blocked by Facebook
       ___________________________________________________________________
        
       AdObserver Blocked by Facebook
        
       Author : vinnyglennon
       Score  : 176 points
       Date   : 2021-08-04 11:50 UTC (11 hours ago)
        
 (HTM) web link (twitter.com)
 (TXT) w3m dump (twitter.com)
        
       | ceejayoz wrote:
       | Some really good journalism came out of this project.
       | 
       | https://themarkup.org/news/2021/04/13/how-facebooks-ad-syste...
       | 
       | Not surprising Facebook wanted to kneecap them. Them claiming
       | _privacy_ concerns is just... peak hypocrisy.
        
         | anigbrowl wrote:
         | That's an excellent summary of how selective ad targeting does
         | its part to drive political polarization.
         | 
         | Paradoxically, people being more informed about different
         | messages can also drive polarization. To recycle the article's
         | example of Exxon telling liberals 'we're becoming more green'
         | and conservatives 'hands off our pipelines' (my paraphrases),
         | if a liberal sees the conservative-targeted message they will
         | likely think 'Exxon is greenwashing its dirty business' while
         | in the reverse case a conservative may think 'woke virtue
         | signalling at its worst.'
         | 
          | Every ad campaign is like a bet on attention that could pay
         | off but could also backfire if it's too far adrift from the
         | truth or public perceptions/attitudes about your firm/sector
         | (eg if you're a serious company like an accountancy firm, don't
         | try to do funny ads because finance people don't want to hire
         | clowns). Running campaigns where the audience is systematically
         | split and segmented mitigates those risks to some extent, but
         | also runs the risk of eroding the public trust completely once
         | there's a broad awareness that all corporate communications are
         | targeted.
         | 
         | To be honest, I'm somewhat surprised we haven't yet seen
         | corporations put out different press releases to different
         | parts of the media catering to different political
         | orientations. You could probably send different press releases
         | to, say, Newsmax and the Huffington Post and it would be a
         | while before readers of one became aware of the different
         | source material in the other.
        
           | hardtke wrote:
           | I think it is more true that political polarization is used
           | to make ad targeting more effective. Almost nobody uses
           | Facebook to drive opinion or do "branding." Instead, they let
           | Facebook target some campaign to the audience that maximizes
           | ROI. All of the political Facebook ads you saw last year were
           | fundraising ads, and the small net yield (revenue generated
           | minus Facebook cut) was used on other platforms to pay for
            | ads attempting to sway the few percent of the electorate
           | that was persuadable. Similarly, Exxon is paying for some
           | sort of action on the part of the targeted consumer, not
           | trying to change minds.
        
         | eganist wrote:
         | > Them claiming privacy concerns is just... peak hypocrisy.
         | 
         | They're technically not wrong. They're just prioritizing client
         | privacy, not product/cattle privacy.
        
         | kmeisthax wrote:
         | It's not transparency, it's "self-compromise" - some Facebook
         | marketing engineer, probably
        
       | marketingtech wrote:
       | University researchers get Facebook users to install a plugin
       | that exfiltrates their data and their friends' data from the
       | platform.
       | 
       | Besides leveraging a browser plugin for scraping rather than FB's
       | official APIs, how is this different from Cambridge Analytica?
       | And didn't most people want Facebook to fight back harder against
       | that behavior?
        
         | waterproof wrote:
         | The difference is that Cambridge Analytica was over a decade
         | ago. Facebook now has policies that they enforce against this
         | stuff. CA wouldn't be allowed today either.
        
           | oxguy3 wrote:
           | A decade? It was in 2018...
        
         | weare138 wrote:
         | Username checks out.
        
         | disgruntledphd2 wrote:
         | There is no difference.
         | 
          | Just like when the Obama campaign got the data of literally
          | every FB user in the US, it was not materially different from
          | Cambridge Analytica (the deception was lacking, but other than
          | that it was the same data, except more of it).
        
           | meowface wrote:
           | "The deception was lacking" is not only a material
           | difference, but a key difference, assuming this Chicago
           | Tribune article is accurate:
           | 
           | >"However, as former Obama advisers point out, there are
           | significant differences between the way Obama's campaign
           | mined data from Facebook, and the activities of which
           | Cambridge is accused: The Obama campaign collected data with
           | its own campaign app, complied with Facebook's terms of
           | service and, most important in my view, received permission
           | from users before using the data."
           | 
           | https://www.chicagotribune.com/columns/clarence-page/ct-
           | pers...
           | 
           | Cambridge Analytica made a personality quiz app and in the
           | terms stated the data would be used only for academic
           | purposes. The creator, the app, the theme of the app, the
           | name of the app, and the purported use of the data wouldn't
           | have indicated any kind of political purpose to any users,
           | while the Obama campaign app was clearly attributed to the
           | Obama campaign and the purpose was evident.
           | 
           | There was some potentially questionable behavior due to the
           | app's ability to scrape info about all of the app users'
           | friends, but the app did request permission to see users'
           | friends lists beforehand. I don't know the exact details of
           | the nature of that scraping or how it was presented to users,
            | but even if one assumes they also did something shady,
           | Cambridge Analytica was definitely sneakier and behaving more
           | like a private intelligence agency than a campaign
           | contractor/advisor.
           | 
           | edit: I don't know enough about the facts of the use of
           | users' friends' data to judge how ethical or consensual it
           | was. I just think that either way, the Cambridge Analytica
           | one is definitely less defensible.
        
             | disgruntledphd2 wrote:
              | So the friends-list thing is, in general, the real privacy
              | violation, and it occurred in both cases.
              | 
              | The Obama app didn't have to ask for this, as it was the
              | default permission given to all apps till 2014 or 15.
             | 
              | Like, CA were selling snake oil: none of their data was
              | actually useful for the purposes they claimed. I'm speaking
              | as someone who's collected personality data and worked in
              | advertising.
             | 
             | Their real service was catching political opponents in
             | compromising positions, which is why they actually got shut
             | down.
             | 
             | I think my major point is that the same tools can be used
             | for either good or less good purposes, and we should aim to
             | either prevent or allow these based on better standards
             | than our liking or disliking of the people involved.
        
         | strict9 wrote:
         | One program exfiltrated user data without user knowledge, in an
          | effort to elevate some in power and attack others in power. The
         | other program exfiltrated user data that the user explicitly
         | opted in to, likely in an effort to expose the uneven
         | enforcement of ads, among other things.
         | 
         | > _The personal data of up to 87 million Facebook users were
         | acquired via the 270,000 Facebook users who used a Facebook app
         | called "This Is Your Digital Life." By giving this third-party
         | app permission to acquire their data, back in 2015, this also
         | gave the app access to information on the user's friends
         | network; this resulted in the data of about 87 million users,
         | the majority of whom had not explicitly given Cambridge
         | Analytica permission to access their data, being
         | collected."_[1]
         | 
         | > _When Facebook said Ad Observer was collecting data from
          | users who had not authorized her to do so, the company wasn't
         | referring to private users' accounts. It was referring to
         | advertisers' accounts, including the names and profile pictures
         | of public Pages that run political ads and the contents of
         | those ads._[2]
         | 
         | 1.
         | https://en.wikipedia.org/wiki/Cambridge_Analytica#Data_scand...
         | 
         | 2. https://www.protocol.com/nyu-facebook-researchers-scraping
        
         | ceejayoz wrote:
         | There's a bit of a difference between getting data for public
         | interest purposes and getting it for making profits.
         | 
          | For example, one can think the Pentagon Papers were a good
          | thing while not being super happy about someone sneaking
          | classified documents to China for pay.
        
           | tablespoon wrote:
           | > There's a bit of a difference between getting data for
           | public interest purposes and getting it for making profits.
           | 
           | I'd say what Cambridge Analytica did was worse than that:
            | they weren't transparent about their purpose, which was to
           | manipulate people. AdObserver seems like they're being
           | transparent, not trying to manipulate anyone, and working for
           | the public interest.
        
             | bryan_w wrote:
             | The Cambridge Analytica data started off as a university
             | research project too...until the data was sold.
             | 
             | FB got a $5 billion fine for being duped in 2015. It
             | doesn't surprise me they are trying to prevent that again
        
               | tablespoon wrote:
               | > The Cambridge Analytica data started off as a
               | university research project too...until the data was
               | sold.
               | 
                | The other salient factor is that Cambridge Analytica's
                | Facebook data was on individuals, while AdObserver's
                | data is on _ads_. There are no privacy issues with it at
                | all, and there are serious public policy issues with
                | giving advertisers "privacy" like individuals have.
               | 
               | > FB got a $5 billion fine for being duped in 2015. It
               | doesn't surprise me they are trying to prevent that again
               | 
               | I highly doubt that's the actual reason they're doing
               | this, though that might be their PR rationalization for
               | these actions. It's far more likely they're really doing
               | this because they hate outside scrutiny that they can't
               | stage manage. They want to be able to tell the story that
               | they're effectively combating problems X, Y, and Z; and
               | they don't want anyone to have the data to refute them if
               | they're not actually doing that.
        
           | prostoalex wrote:
           | > There's a bit of a difference between getting data for
           | public interest purposes and getting it for making profits.
           | 
           | Once the data is scraped and stored someplace, you lose
           | control of its purpose. An unscrupulous employee, accidental
           | leak, lax data security practices or a targeted attack will
           | some day enable more nefarious uses of it.
        
       | timdorr wrote:
       | Here's Facebook's blog post about this situation:
       | https://about.fb.com/news/2021/08/research-cannot-be-the-jus...
        
         | AlexandrB wrote:
         | > Research Cannot Be the Justification for Compromising
         | People's Privacy
         | 
         | Right. Ad revenue is the only acceptable justification for
         | compromising people's privacy.
         | 
         | Edit: Although research _is_ a valid justification for
          | compromising people's wellbeing[1]
         | 
         | [1]
         | https://www.theatlantic.com/technology/archive/2014/06/every...
        
       | xkcd-sucks wrote:
       | Bigger question is _how_ they found the accounts to ban - did
       | they just get the social graph of this outspoken researcher or is
       | the extension itself distinguishable based on its activity?
        
         | tablespoon wrote:
         | > Bigger question is how they found the accounts to ban - did
         | they just get the social graph of this outspoken researcher or
         | is the extension itself distinguishable based on its activity?
         | 
          | It sounds like they may have just banned the researchers
          | associated with the project, but not the app itself. She's
          | still asking people to install it:
          | https://twitter.com/LauraEdelson2/status/1422742671957843971.
        
       | literallyaduck wrote:
       | "Their platform their choice", right? Maybe the researchers can
       | make their own platform. Remember when everyone said that about
       | people they didn't like?
        
         | ceejayoz wrote:
         | Don't mistake "it's legal" for "it can't be criticized".
         | 
         | "Their platform, their choice" tends to refer to the legal
         | situation (especially when folks make wild claims about
         | Facebook being subject to the First Amendment), not the moral
          | one. Their choices can be _bad_, and we can be critical of
         | them.
        
           | luckylion wrote:
           | To be fair, in 90% of the responses the GP is referencing, it
           | totally also means the moral one. "Just build your own
           | platform" is a common response.
        
             | drdeca wrote:
             | I don't know about quite 90% (though maybe that was meant
             | as hyperbole), but yes, it does seem a common occurrence to
             | defend bannings and such on similar grounds, often by
             | conflating "legally permissible" with "appropriate".
             | 
             | That being said, while I certainly agree that decisions
             | about how to run a platform can be such that they should be
             | legal, but also shouldn't be done,
             | 
              | I also feel that the range of what can be appropriate for a
              | platform to require/do can be fairly large, so long as they
              | produce the appropriate expectations among users and others
              | who interact with the platform (and not just by drastically
              | changing expectations as immediate preparation for changing
              | policies in big ways either).
             | 
             | So, my view on the " 'just make your own platform' "
             | complaint, is a bit mixed?
             | 
             | Obviously the neutrality of the base-layers is highly
              | important...
             | 
             | but if they weren't, would the solution be to make a
             | replacement that was, or to somehow get those in control of
             | the base layers to behave neutrally?
             | 
             | Well, the degree of lock-in at base layers would presumably
             | be basically just as high as it is now (though, I would
             | expect a typical world with less neutral base layers would
             | also have more centralization of the base layers? Not sure
             | how that would influence this.)
             | 
             | Well, at least if the base layers were govt controlled,
             | then I guess there would probably be an obligation for it
             | to be neutral, not just something desirable. Perhaps if it
             | were privately owned by a single org, due to being so
              | critical, it would be justified for govt to (with
              | appropriate compensation) ...
             | 
             | Ok nvm idk what I'm talking about
        
               | luckylion wrote:
               | > Ok nvm idk what I'm talking about
               | 
               | I think you do, at least as much as anyone who doesn't
                | have a knee-jerk reaction and see it as a binary issue.
               | I believe it's a tough question and you can think of
               | arguments for and against on many levels, and there isn't
               | a perfect solution that doesn't have massive trade-offs
               | and large side-effects.
               | 
               | On the one hand, I agree that public services with forced
               | neutrality might be a good approach, on the other hand
               | I'm not sure governments should get involved in something
               | that is already a mess. It might become a larger mess,
               | that also works slower and would look more and more like
               | something out of Kafka's nightmares.
        
       | nonfamous wrote:
       | Hope none of those journalists had family photos or other
       | memories in their now-banned personal Facebook accounts. And if
       | they had an Oculus, that's basically useless now as well.
        
       | 0x0000000 wrote:
        | Title seems incorrect here; it looks as though individual users
       | had accounts banned, not the AdObserver extension itself.
        
         | tester34 wrote:
         | doesn't it imply it in practice?
        
           | 0x0000000 wrote:
           | Considering the fifth tweet in the thread is Laura directing
           | folks where to install AdObserver, no I don't think that is
           | implied at all.
        
         | ProAm wrote:
         | Facebook cannot ban a browser extension
        
           | 0x0000000 wrote:
           | Right, which is why I said the title is misleading.
        
             | ProAm wrote:
             | Ahh I misunderstood your comment.
        
       | anigbrowl wrote:
       | I suspect we'll see an increase in adversarial scraping in the
       | coming years, since it's (relatively) easy and legal. Probably
       | more interesting research or at least datasets will come from
       | researchers outside academia, since companies like Facebook will
       | use their financial muscle to dissuade institutions from
       | operating research projects like this by offering or withholding
       | donations.
       | 
        | More likely, datasets will come from wherever they come from, and
        | if their reliability can be validated or seems sufficiently
        | credible, analysts will follow. I don't see adversarial research as
       | necessarily unethical, any more than it's an unethical attack on
       | industry to study pollution patterns.
        
       | ndkwj wrote:
       | Seems reasonable that using a tool that massively gathers and
       | exfiltrates data from a website gets you banned.
        
         | knowtheory wrote:
         | Yeah, that's falling directly into Facebook's talking points.
         | It's a web extension, anybody can inspect the source. It
         | doesn't do what Facebook is claiming. The NYU team bends over
         | backwards to ensure that no personally identifying information
         | about other users gets captured.
         | 
         | The privacy leak that Facebook is so concerned about is
         | actually the identity of advertisers on their platform.
         | 
         | https://twitter.com/issielapowsky/status/1422879438765797380
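          | 
          | To make that concrete, here's a rough sketch of the kind of
          | content script an extension like this can run. It's purely
          | illustrative and not the actual AdObserver source (which the
          | NYU team publishes); the selector, field names, and endpoint
          | are made up for the example:
          | 
          |     // content-script.ts -- illustrative sketch only, not the
          |     // real AdObserver code. The selector, fields, and endpoint
          |     // below are hypothetical.
          |     interface ObservedAd {
          |       advertiserName: string; // public Page name on the ad
          |       adText: string;         // the ad's own copy
          |       observedAt: string;
          |     }
          | 
          |     function collectSponsoredPosts(): ObservedAd[] {
          |       const ads: ObservedAd[] = [];
          |       // Hypothetical marker for posts labelled "Sponsored".
          |       const posts =
          |         document.querySelectorAll('[data-sponsored="true"]');
          |       posts.forEach((post) => {
          |         ads.push({
          |           advertiserName:
          |             post.querySelector('.page-name')?.textContent ?? '',
          |           adText:
          |             post.querySelector('.ad-body')?.textContent ?? '',
          |           observedAt: new Date().toISOString(),
          |         });
          |       });
          |       // Nothing about the logged-in user or their friends is
          |       // collected here.
          |       return ads;
          |     }
          | 
          |     // Ship only the ad records to a (hypothetical) research
          |     // collection endpoint.
          |     fetch('https://example-research.org/api/ads', {
          |       method: 'POST',
          |       headers: { 'Content-Type': 'application/json' },
          |       body: JSON.stringify(collectSponsoredPosts()),
          |     });
          | 
          | The point being that the collection logic is inspectable and
          | limited to ad content.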
        
           | amadeuspagel wrote:
           | > The privacy leak that Facebook is so concerned about is
           | actually the identity of advertisers on their platform.
           | 
           | Yeah? That also seems like a completely legitimate concern.
        
             | ceejayoz wrote:
             | But it's _public info_?
             | 
             | > When Facebook said Ad Observer was collecting data from
             | users who had not authorized her to do so, the company
             | wasn't referring to private users' accounts. It was
             | referring to advertisers' accounts, including the names and
             | profile pictures of public Pages that run political ads and
             | the contents of those ads.
             | 
             | It's all on https://www.facebook.com/ads/library/. Scraping
             | just lets them analyze it.
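              | 
              | (For what it's worth, much of the same public record can
              | also be pulled programmatically through the Ad Library API.
              | Rough sketch below; the parameter and field names are
              | written from memory, so check the current docs, and the
              | access token is a placeholder:)
              | 
              |     // Query the public Ad Library API for political ads.
              |     // Parameter/field names are best-effort from memory,
              |     // not a reference; FB_ACCESS_TOKEN is a placeholder.
              |     const ACCESS_TOKEN = process.env.FB_ACCESS_TOKEN ?? '';
              | 
              |     async function searchPoliticalAds(term: string) {
              |       const params = new URLSearchParams({
              |         search_terms: term,
              |         ad_type: 'POLITICAL_AND_ISSUE_ADS',
              |         ad_reached_countries: "['US']",
              |         fields: 'page_name,ad_creative_bodies',
              |         access_token: ACCESS_TOKEN,
              |       });
              |       const url =
              |         'https://graph.facebook.com/v12.0/ads_archive?' +
              |         params.toString();
              |       const res = await fetch(url);
              |       if (!res.ok) throw new Error(`HTTP ${res.status}`);
              |       return (await res.json()).data; // public ad records
              |     }
              | 
              |     searchPoliticalAds('climate')
              |       .then((ads) => console.log(ads.length, 'public ads'));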
        
               | amadeuspagel wrote:
               | The comment that I'm replying to argued that facebook is
               | concerned about the privacy of advertisers, and I argued
               | that this concern is legitimate. If you don't agree that
               | facebook is concerned about the privacy of advertisers,
               | maybe you should reply to the comment that actually made
               | this claim?
        
               | ceejayoz wrote:
                | I don't agree with _your_ claim. I'm arguing the concern
               | is _not_ legitimate.
        
           | jensensbutton wrote:
           | So Facebook, who just paid a 5 billion dollar fine to the FTC
           | for allowing exactly what these researchers are doing, should
           | adopt a policy of examining the source code of every update
           | to any extension used for scraping data to determine whether
           | it's allowed or not? Is that the other option?
        
           | secondcoming wrote:
           | But was that data still collected without consent?
        
             | input_sh wrote:
             | I'd say installing an extension is a pretty big sign of
             | consent. It's named clearly and clearly describes what it
             | does in the first sentence of the description:
             | 
             | > A browser extension to share data about your social feed
             | with researchers and journalists to increase transparency.
             | 
             | I'd call that type of data gathering quite consensual.
        
               | marketingtech wrote:
               | You're also granting the extension access to your
               | friends' data, given that it can see everything that you
               | can. Your friends consented to show that data to you, but
               | not to the extension developer. Your friends' consent is
               | not transitive.
        
               | anigbrowl wrote:
                | When I was a regular FB user I understood that when I
                | shared stuff with friends it might be visible to their
                | browser extensions. But I feel your comment is sort of
                | misdirection, as the purpose of the browser extension
                | was to collect information on _ads_ in people's feeds.
                | Advertisers might show up in your feed, but that doesn't
                | mean they're your friends, even if you consented to
                | receive ads by signing up with a petition organizer or
                | political campaign.
        
       ___________________________________________________________________
       (page generated 2021-08-04 23:02 UTC)