[HN Gopher] Google has been testing a replacement for third-part...
___________________________________________________________________
Google has been testing a replacement for third-party cookies
Author : Geeek
Score : 190 points
Date : 2021-01-25 13:50 UTC (9 hours ago)
(HTM) web link (www.axios.com)
(TXT) w3m dump (www.axios.com)
| baybal2 wrote:
| FYI this was already quietly shoved into Chrome 80
| holtalanm wrote:
| honestly the thing that bugs me the most about the article is
| that it treats cookies == tracking.
|
| Sure, cookies are used for tracking, but they are also used for
| authentication, which is something that nearly every webapp needs
| to do.
|
| I just think that, due to articles like this, cookies end up
| being viewed as nothing but bad, when they are an important tool
| for the web when used properly.
|
| More on-topic of the article:
|
| this doesn't look like it really changes anything, to me. Like,
| so instead of cookies being used to track your data, they use a
| _browser extension_?? That is potentially even _more_ invasive.
| Sure, if it does what they say it will do, it kind of obfuscates
| your personal data. Really, what people want is just... fewer
| ads. Less targeted ads. This doesn't achieve that.
| skybrian wrote:
| Tracking and authentication aren't quite the same, but they
| aren't independent either. If you are logged in all the time
| (like many of us are for Hacker News), then your actions can be
| tracked pretty well.
|
| That's kind of the point of being logged in, to let the website
| know who you are.
| Spivak wrote:
| For basically everyone cookies == tracking. That is pretty much
| their only user-visible purpose. I think the best way forward
| would be to heavily restrict the persistent data that websites
| (that aren't installed as apps) can store in the browser to
| basically just an authentication token that is only sent by the
| browser and not accessible to JS.
|
| It's kinda silly that we can't manage our website logins via
| the browser without clearing all the cookies for a site.
| wyldfire wrote:
| > Sure, cookies are used for tracking, but they are also used
| for authentication, which is something that nearly every webapp
| needs to do.
|
| What if we could move authentication or more specifically the
| state held in the client for authentication to some other
| mechanism? Could we pitch cookies? Could we make this switch
| without making it somehow possible for advertisers to switch to
| the new mechanism?
| kenniskrag wrote:
| Already exists. Custom HTTP header with a JWT token, for
| example. The auth data can also go in the body of a POST
| request. The URL would also be possible, but that is a
| security risk due to e.g. browser history.
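|
| A minimal sketch of the header approach (the endpoint name and
| token storage are illustrative only), in TypeScript:
|
|     // Send the token in a header instead of a cookie. Note the
|     // script needs access to the token, which is the XSS
|     // trade-off discussed below.
|     const token = sessionStorage.getItem("auth_token"); // set at login
|
|     const res = await fetch("/api/profile", {
|       headers: { Authorization: `Bearer ${token ?? ""}` },
|     });
|     const profile = await res.json();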
| kenniskrag wrote:
| advertisement still works, because the site can just execute
| some JS and make a POST request to the advertisement company.
| [deleted]
| tester34 wrote:
| You have to attach that "JWT token" (heh) with e.g. JS, which
| makes it vulnerable to XSS, doesn't it?
| kenniskrag wrote:
| With XSS you can also inject code directly, so no cookie
| extraction is needed imho.
| heythere22 wrote:
| Most session providers use a session cookie and store all the
| required values on the server. Moving that to the client will
| require a lot of additional javascript. And will also make
| sure that browsing without javascript is definitely
| impossible. Not to mention the number of additional
| security holes that would open.
| fendy3002 wrote:
| IMO one step of tracking is to authenticate the tracked, so
| whatever the authentication method is, it'll be used as a
| tracking method.
| peeters wrote:
| There's no reason to if you specifically block third-party
| cookies, which is what is being suggested (and what Axios
| muddies considerably by using "cookie" and "third-party
| cookie" interchangably).
|
| Unless you're going to throw out local storage and custom
| request headers, getting rid of cookies isn't really going to
| do anything except make the same thing less secure (since you
| won't be able to benefit from the HttpOnly flag).
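|
| For illustration, a minimal Node sketch of the kind of hardened
| first-party session cookie meant here (names are hypothetical):
|
|     import { createServer } from "node:http";
|     import { randomUUID } from "node:crypto";
|
|     createServer((_req, res) => {
|       // Opaque ID; all real session state stays on the server.
|       const sessionId = randomUUID();
|       res.setHeader(
|         "Set-Cookie",
|         `session=${sessionId}; HttpOnly; Secure; SameSite=Lax; Path=/`
|       );
|       res.end("logged in");
|     }).listen(8080);
|
| Because of HttpOnly, page scripts cannot read the cookie, which
| is exactly the benefit lost if auth moves to JS-managed headers.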
| freeone3000 wrote:
| Authentication, notably OAuth, from all of the large major
| providers comes from a domain other than the site content
| domain. Blocking third party cookies indiscriminately will
| render you unable to log into facebook, google, twitter, or
| any microsoft service, or anything that uses those tokens.
| michaelt wrote:
| No it doesn't.
|
| The user clicks 'log in with google', their browser gets
| forwarded to whatever.google.com, the (now first party)
| cookie gets checked, then the user gets forwarded back to
| your site with the access token as a parameter in the GET
| request.
|
| No third party cookies needed.
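|
| Roughly, the redirect dance looks like this (the URLs, client id
| and params are illustrative only, not a complete or secure flow):
|
|     // Step 1: send the user to the identity provider (first-party
|     // cookies apply there, not on our site).
|     const AUTH_URL = "https://idp.example.com/authorize";
|
|     function startLogin(): void {
|       const params = new URLSearchParams({
|         client_id: "my-client-id",
|         redirect_uri: "https://myapp.example/callback",
|         response_type: "code",
|         scope: "openid email",
|         state: crypto.randomUUID(), // verified again on return
|       });
|       window.location.assign(`${AUTH_URL}?${params}`);
|     }
|
|     // Step 2: back on /callback, the code arrives in the query
|     // string and is exchanged server-side for tokens. No
|     // third-party cookie is involved at any point.
|     const code = new URLSearchParams(location.search).get("code");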
| cesarb wrote:
| > Blocking third party cookies indiscriminately will
| render you unable to log into facebook, google, twitter,
| or any microsoft service, or anything that uses those
| tokens.
|
| I block third party cookies indiscriminately, and I never
| had issues logging into Facebook, Google, Twitter, or any
| Microsoft service except for Microsoft Teams. This
| includes logging into third-party services using the
| Google login.
| peeters wrote:
| 3rd-party cookies aren't actually intrinsic to OIDC auth
| flows. They might be used by some implementations under
| some flows, but they're not core to the spec.
|
| Typically the user agent will redirect to a Google/etc
| login page where it'll have access to first-party
| cookies. Then will redirect back to the site which
| requested authentication, passing state in the query
| params. It's only when you get into using stuff like Okta
| as a delegated authentication service that you run into
| trouble with 3rd-party cookies.
|
| Edit: As an example, I just logged into Stack Overflow
| using Google as the authenticator, with uMatrix blocking
| 3rd-party cookies. Worked without a hitch.
| drtillberg wrote:
| The extent to which my browser configuration breaks the
| websites on your list is _my_ yardstick for _success_.
| tomjen3 wrote:
| Installing client SSL certificates for the websites that need
| them would be a start, or having cookies die within 60 seconds
| unless a password has been entered on the site.
| freeone3000 wrote:
| What if we removed the JS, and just had the web browser echo a
| defined string back to the server? If this token uniquely
| identified the session, we could safely store the data
| server-side, with minimal leakage to other sites and no need
| for code execution at all!
| rank0 wrote:
| I can't tell if you're joking or not but this clearly
| already exists via HTTP headers.
| freeone3000 wrote:
| Yes, including the echo, this is literally cookies.
| cestith wrote:
| It could be done, but first-party cookies aren't really the
| issue. Most browsers have the option to disable third-party
| cookies, but many ad-supported sites throw a fit if you
| browse them that way. I think the goal is to introduce some
| alternative, then make blocking third-party cookies the
| default. Finally, cookies could become a first-party-only
| solution.
| mPReDiToR wrote:
| Do sites like this deserve my eyeballs?
|
| If the site is essential I wipe the cookies before and
| after reading.
|
| Most of the time I skip the site in favour of another one.
| lxe wrote:
| This doesn't seem to be at the same technology layer as
| "cookies", as this seems to be a Chrome-internal (local, which is
| good, but is this a guarantee?) API that uses your search history
| and other things to generate a 'cohort' which is an ID of some
| sort that you can send over as a part of requests to the
| advertiser URLs.
|
| https://github.com/WICG/floc
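|
| The explainer sketches a cohort API roughly like the following;
| the name and shape were still draft at the time, so treat this
| as illustrative:
|
|     interface InterestCohort {
|       id: string;      // the cohort label a site would see
|       version: string; // which clustering produced it
|     }
|
|     async function getCohort(): Promise<InterestCohort | null> {
|       const doc = document as Document & {
|         interestCohort?: () => Promise<InterestCohort>;
|       };
|       if (!doc.interestCohort) return null; // browser has no FLoC
|       try {
|         return await doc.interestCohort(); // may reject if opted out
|       } catch {
|         return null;
|       }
|     }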
| nelgaard wrote:
| Already most of our CPU cycles are eaten up by ad-frameworks in
| our browser.
|
| Now Google wants to offload machine learning to our browser. That
| will be bad for battery life, electricity bills, and the
| environment.
|
| On the other hand I use free software. So I can make a version of
| their extension that just claims that I am obsessed with Ironing.
| That will also make it easier for the Ad-blocker to do its
| filtering.
| ogre_codes wrote:
| This requires browser integration. What concerns me is this
| doesn't talk about how Google plans to track non-Chrome users.
| Because you know Google isn't going to just stop tracking the
| other 40% or so of web users.
|
| The other big thing that concerns me about this is how it still
| allows for some of the worst abuses. They are still going to
| possess entirely too much information about people and will
| continue to sell advertising that takes advantage of that
| information.
| devops000 wrote:
| What if the user opts out at the browser level? It looks like
| Google's business will become very dependent on the Chrome
| product.
| jrochkind1 wrote:
| > and would've been paralyzed without the ability to use some
| sort of anatomized data to target people with ads.
|
| anatomized? Is that a typo for anonymized? Or does this mean
| something?
| acvny wrote:
| So they are now rebranding the fact that they are trying to
| monopolize the ad space?
| downandout wrote:
| In 1997, I had a meeting with Netscape executives about a similar
| technology I had created out of concern for the privacy
| implications of cookies. I called it LAD (local advertising
| decision). The server would send a script down to the browser
| saying "if the user meets X criteria, show this ad, else show Y"
| and so on. It could use browser history, installed software, and
| other factors for targeting.
|
| At the time I was laughed out of the room. Turns out I was just
| 24 years too early.
| cblconfederate wrote:
| Isn't that what Brave browser is doing?
| coldtea wrote:
| We won't solve this issue until we stop viewing advertising and
| increased consumption as healthy...
| lemax wrote:
| Axios really needs to bring on some literate technical advisers.
|
| _" Cookies are considered third-party data, or user data that's
| collected indirectly from users via browsers or websites."_
|
| This statement seriously requires qualification. This is exactly
| what contributes to unreasonable regulation and confused users.
| alexfromapex wrote:
| I think independent groups would be much better advocates for
| privacy technology than Google, which has a huge conflict of
| interest.
| godelmachine wrote:
| While we are discussing this, I just wish HN readers would shed
| more light on a practice which I diligently follow.
|
| Whenever I visit any website, courtesy of the GDPR laws, we are
| asked to consent to the terms and cookies. I make it a point to
| disable all cookies (barring the strictly necessary ones),
| partners and also the "Legitimate Interest" section, where I
| click "Object all", and then click "Save and exit".
|
| However, on many websites I don't see any option to "reject" or
| "object" to cookies, partners, vendors and especially legitimate
| interest. Particularly concerned about Legitimate Interest since
| the number of vendors there is humongous. A good example of a site
| where we cannot choose would be the BBC[1]. We get an option only
| to read their terms and conditions but no option to reject and
| object.
|
| 1) Can anyone please advise how to reject cookies on such sites
| where they don't have a reject option present?
|
| Also, on my iOS device, in Safari settings, I have set "Block all
| cookies" to on.
|
| 2) How far will blocking all cookies safeguard me from
| unscrupulous cookies? If my blocking all cookies is enabled in
| safari settings and suppose I visit some malicious site and
| accept their cookies, would the owners of the malicious site be
| able to do anything sinister or adversarial to my privacy and
| integrity? Will they be able to breach my security?
|
| Ref. - [1]https://www.bbc.co.uk/
| frongpik wrote:
| Anything short of uBlock Origin/uMatrix with JS disabled won't
| work. Not only do you need to block cookies at the uBO level,
| you also need to block JS, because it can write cookie-like IDs
| to persistent storage such as IndexedDB.
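|
| A minimal sketch of the "cookie-like ID" problem (localStorage
| here, but IndexedDB or cache entries work the same way):
|
|     function getTrackingId(): string {
|       let id = localStorage.getItem("visitor_id");
|       if (!id) {
|         id = crypto.randomUUID();
|         // Persists independently of cookies and cookie settings.
|         localStorage.setItem("visitor_id", id);
|       }
|       return id; // sent to the tracker on every page view
|     }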
| godelmachine wrote:
| Thanks for your suggestion.
|
| But wouldn't blocking JS make all websites dysfunctional for
| me?
|
| Also, how do I get rid of persistent storage like IndexedDB, as
| you have highlighted?
|
| Does clearing the cache and cookies from the browser help?
| etxm wrote:
| Reading this article, which is targeted at advertisers, feels
| like how I imagine a cow would feel looking in the window of a
| butcher shop.
| m_eiman wrote:
| "Hi early 2000s computer user, let's make a deal:
|
| I get full access to everything you do online, and get to do
| anything I want with the information. Perhaps I'll use it to
| maybe target ads slightly better in some cases, and put myself
| into every value chain you're involved with so I can get a cut at
| every step.
|
| Oh, and I'll do my best to move all computing online, so that
| 'everything you do online' equals 'everything you do with a
| computer'.
|
| In exchange you'll get a web browser that is at times more
| performant than the others. Hell, I'll even throw in a free email
| account (where I can gather all the best bits of info)!
|
| It's a pretty good deal, don't you think?"
| judge2020 wrote:
| "if it means I don't have to pay $20/year for more than 2mb of
| hotmail storage, i'm all for it!"
|
| https://www.pcworld.com/article/116657/article.html#drr-cont...
| (the 250 MB free increase was suspected to be a response to Gmail)
| fixmycode wrote:
| I'm sorry, but I fail to see the point of this. You can choose to
| disable cookies; will you be able to disable this new thing? If
| so, what's the big deal about it, other than the same principle
| with a different name, probably to avoid some EU legislation?
|
| Advertisers and trackers have been doing the same thing this
| thing is supposed to do for years. And where will they implement
| it? The only way would be at the application level, so every
| browser now also has to implement internal tracking services to
| aggregate all the data into their FLoCs, and then come back to
| the user to spice up their requests? Come on...
|
| I'll keep supporting efforts to make the Internet a more privacy
| focused place. Advertisers have been buying TV ads for decades
| and my TV hasn't asked me what I want to share with it, yet.
| kag0 wrote:
| This is related to my main question about this. What if
| browsers just... don't implement this?
|
| What makes Google think that Apple or Mozilla are going to add
| this to their browsers?
| tpoacher wrote:
| Federated means no "theoretical" access to the data. It doesn't
| mean no "practical" access.
| kmeisthax wrote:
| FLoC is an engineering solution to a political problem.
|
| The problem with targeted advertising isn't the use of cookies,
| the problem with targeted advertising is the targeting. It
| doesn't matter if you're using fancy machine-learning and on-
| device targeting to avoid technically collecting targeting data.
| People don't like seeing their web history funnel into their
| advertising.
| tomjen3 wrote:
| The problem is that most ads are for shitty products you don't
| need and those crowd out the ads for the few non-shitty
| products that make your life better.
| maria_weber23 wrote:
| > FLoC is an engineering solution to a political problem.
|
| Yes.
|
| > the problem with targeted advertising is the targeting
|
| Is it? The problem with advertising is the advertising. I
| don't like to see ads at all, but one thing I know for sure:
| I'd take targeted ads any time over random ads.
|
| > People don't like seeing their web history funnel into their
| advertising.
|
| No. This is the problem YOU have with it. Most of the non-tech
| people I know have no idea what you are even talking about.
| [deleted]
| gwbas1c wrote:
| You must have no shame.
|
| My wife buys diapers online, then ads for diapers show up on
| my computer.
|
| I go shopping for underwear on one device, and then when
| reading a technical forum with co-workers on a different
| device, there's ads with people just wearing underwear.
|
| The tracking is extremely excessive.
| IfOnlyYouKnew wrote:
| That's just the algorithm telling you to be more involved
| in childcare.
| jokethrowaway wrote:
| uBlock is free
| criddell wrote:
| I don't think the choice is between personally targeted ads
| and random ads though. There are middle options.
|
| For example, if I search for "how to do a Subaru oil change"
| it's the perfect opportunity for the search engine to show me
| ads related to motor oil, Subarus, car maintenance videos,
| etc... If I opt in to sharing my location with the search
| engine they could also show me ads from local repair shops
| and car dealers.
|
| Later on when I'm reading an article about dog training, I
| don't want to see ads about fixing my car, show me ads
| related to dogs. Use the context of the page for targeting.
| ssivark wrote:
| Sure, but what about showing you ads for WhatsApp when
| you're searching for Signal?
|
| Where does contextual advertising/influencing switch from
| being helpful to being gaslighting?
| renewiltord wrote:
| Sure, search does really well for contextual because there
| is lots of intent. But most other contextual is not like
| that: people aren't looking to buy something.
|
| Your normal usage isn't dog training or Subaru oil changes.
| It's idle nonsense like whether Kim Kardashian is angry
| with Courtney Cox or whatever. There's not enough to sell
| you that's contextual. People have an idealized vision of
| what they spend time on. It's nothing like this productive
| stuff you're talking about.
| criddell wrote:
| > There's not enough to sell you that's contextual.
|
| Next time you are in a waiting room flip through the
| magazines that are sitting there. The ads are targeted to
| the likely readers of those magazines.
| renewiltord wrote:
| No volume. Hard for small guy to use.
|
| Online targeted ads are way better.
|
| Not to be annoying, but advertising is serious money. If
| you think you can do good contextual advertising you will
| become rich very easily. Anyone will. It's hard for me to
| believe that no one is doing this supposedly easy and
| effective thing well since all the incentives are there.
| criddell wrote:
| > Online targeted ads are way better.
|
| Sure, but there are a lot of ethical problems with them.
|
| Imagine starting a service that paired individual
| shoppers with a passive handler to follow them around a
| shopping mall to build dossiers including:
|       * everything they pick up and look at
|       * what they eat in the food court
|       * the clothing styles and sizes they try on
|       * what their transportation to the mall was
|       * their race, gender, age, and apparent ethnicity
|       * etc...
|
| How long would you put up with that?
| dvdkon wrote:
| Can we really be sure that current user-targeted
| advertising is better than contextual advertising? It
| seems to me that nobody wants to step out of the safe
| model of targeted advertising, since it does work well
| enough, and so there haven't been big enough attempts at
| contextual advertising to really say one approach is
| better. As an anecdotal test, I went to cnx-software.com
| and disabled my adblocker. The site has roughly 13 ads, 3
| served by Google, the rest custom. Google tried to sell
| me toothpaste, while the custom ads are for things I'm
| actually tangentially interested in, like SoMs, embedded
| devices and assembly services. This kind of advertising
| obviously has a large overhead for the site admins right
| now, but I could definitely imagine an AdSense-like
| service that would distribute ads based on processing the
| site's contents.
| jokethrowaway wrote:
| Privacy advocates need an excuse to advocate for privacy and
| impose more regulations for the Common Good.
|
| Then we end up with crap like the cookie banner - which
| completely ruined the internet for me.
|
| There is not a day I don't have to close one of those and
| browser extensions to block them are nowhere as good as
| blocking ads. Not to mention, accepting all the cookies is
| one click, while rejecting them requires a minute of thinking
| and clicking through popups and menus.
|
| The fact that 99% of people don't care goes over their heads.
| If people cared, they would be using DuckDuckGo.
|
| Funnily enough, I use DuckDuckGo and I think it's a great
| Google replacement - but I don't care about being tracked, I
| just appreciate the features (especially code snippets in
| search)
| bluesign wrote:
| But the problem is we are not getting targeted ads. We are
| getting semi-targeted ads.
|
| If one advertiser pays 10 USD per conversion with a 1% chance
| of converting you, and another pays, let's say, 0.10 USD with a
| 90% chance, the ad network will show you the 1% one, because
| its expected revenue (0.10 USD) beats the other's (0.09 USD),
| even though the other ad is a 90-times better match for you.
| interestica wrote:
| We live in the timeline where "Alphabet to replace cookies" is a
| legit headline.
|
| What is the public understanding/perception of cookies? The past
| couple of years since the implementation of the GDPR has probably
| been the biggest and weirdest public education campaign (done
| entirely through brief pseudo-consent popups).
| dessant wrote:
| This proposal coupled with phasing out third-party cookies
| inconveniences competitors, while allowing Google to continue
| gobbling up user data without disruption, because their tracking
| capabilities are way past needing any cookies, or this new cohort
| API.
| alisonkisk wrote:
| The article is about a _non_-tracking capability to collect
| less user data.
|
| Inconveniencing trackers is good for privacy, not bad.
|
| Anyway, mods, here's a much better article that has more than
| one line of vague content: https://github.com/WICG/floc
| gnud wrote:
| This looks to me like a way for Google to drastically increase
| their reach and track even more data about their "users".
|
| They want the browser to "discover" the users interests
| automatically during browsing. For a page to be excluded from
| this, the page author would have to set a new policy header.
|
| And then your browser reports these interests to whatever
| tracker (for example Google) asks for the information.
|
| Suddenly Google can learn about what you're browsing even if
| GA is blocked.
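|
| The exclusion mechanism mentioned above is a response header. As
| an illustration (this is the form later Chrome FLoC trials used;
| the exact syntax was still being discussed at the time):
|
|     import { createServer } from "node:http";
|
|     createServer((_req, res) => {
|       // Ask the browser not to include this page when computing
|       // the user's interest cohort.
|       res.setHeader("Permissions-Policy", "interest-cohort=()");
|       res.end("<p>excluded from cohort computation</p>");
|     }).listen(8080);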
| dessant wrote:
| Google only cares about privacy, when it disproportionately
| hurts the competition. Meanwhile they are using their
| leverage to infect web standards with a tracking proposal,
| and they market it as a privacy win.
|
| Features used _exclusively_ for tracking have no place among
| web standards, cohort-based or otherwise.
| tyingq wrote:
| I think that's a good observation. That they don't need cookies
| or this proposal. So, anything that performs less well than
| cookies hurts them less than their competitors.
| up2isomorphism wrote:
| Google has already lost any trust that it remotely respects
| anybody's privacy, if it ever had any. I would rather spend
| time on some other privacy proposals.
| throw14082020 wrote:
| > The Sandbox isn't about your privacy. It's about Google's
| bottom line. At the end of the day, Google is an advertising
| company that happens to make a browser.
|
| It's worse than that. Google is an advertising company that makes
| a browser (63.38% of browsers globally) and mobile operating
| system (72.48% of phones globally) to vertically integrate,
| controlling your privacy choices. They're also trying their hand
| at PCs (ChromeOS, 1.72% globally). They invent technology across
| the stack, providing software for free or paid, and open sourcing
| some to commoditise the technology and to starve competition. I'd
| be interested to see how many people use Gmail.
|
| https://gs.statcounter.com/browser-market-share
| https://gs.statcounter.com/os-market-share/mobile/worldwide
| https://gs.statcounter.com/os-market-share/desktop/worldwide...
| dilap wrote:
| "Serial killer may have found a life-friendly substitute to
| knives."
|
| No but seriously, does being in a group of "thousands" of people
| really preserve privacy particularly well? It seems quite likely
| that with groups that small, membership itself could be
| considered privacy-compromising, e.g., a group of people that all
| have some medical condition.
|
| At the most fundamental level, I feel like if you know which
| advertisements are targeted to me, and those advertisements are
| well-targeted, then my privacy has been invaded.
|
| It seems to me there is a fundamental conflict between good
| targeted ads and protection of privacy.
| f430 wrote:
| > The company said Monday that tests of FLoC to reach audiences
| show that advertisers can expect to see at least 95% of the
| conversions per dollar spent on ads when compared to cookie-
| based advertising. FLoC uses machine learning algorithms to
| analyze user data and then create a group of thousands of
| people based off of the sites that an individual visits. The
| data gathered locally from the browser is never shared.
| Instead, the data from the much wider cohort of thousands of
| people is shared, and that is then used to target ads.
|
| If I were an advertiser I would have serious doubts about this,
| especially given that ad spend on popular platforms
| has had no impact on many firms' bottom lines.
|
| I guess this is a response to all the pushback and dwindling
| PPC revenues from increasingly wary advertisers who have
| quite possibly been duped into transferring their cash to
| Google & others over the decade.
| joshuamorton wrote:
| What? No. It's a response to potential bans on same site
| cookie access. The thing you quote says that this is worse
| than conventional targeted ads.
| dataminded wrote:
| Google isn't sharing your individual data but they are still
| collecting it and storing it. This feels like a non-improvement
| to me.
| ignoramous wrote:
| > _It seems to me there is a fundamental conflict between good
| targeted ads and protection of privacy._
|
| True, but it is too much to ask Google to throw "the baby out
| with the bath water", as it were. For all their faults, I am
| encouraged that Android and Chrome, if no other product team at
| Google, are pioneering Federated Learning [0], Differential
| Privacy [1], and now are pushing ahead with Privacy Sandbox
| [2]. I wholeheartedly agree it simply isn't enough, but it is
| substantially better than what's in-use right now.
|
| Like everyone else though, I am worried for the same reason I
| dislike _AMP_ (accelerated mobile pages) despite it bringing
| noticeably better user-experience for many: Google has this
| nasty tendency to make things seem more "open", "benevolent",
| and "private" than they really are.
|
| [0] https://federated.withgoogle.com/#learn
|
| [1] https://www.chromium.org/developers/design-documents/rappor
|
| [2] https://news.ycombinator.com/item?id=20767891
| Super_Jambo wrote:
| > True, but it is too much to ask Google to throw "the baby
| out with the bath water", as it were.
|
| No it isn't, the 'baby' of targeted adverts is a net
| negative for society. We'd be net better off if most googlers
| just retired and spent their days digging holes and filling
| them in again. It is not "too much to ask" that people stop
| messing up society even if it's making them billions.
| fooker wrote:
| You mean, _you_ would be better off. The 3 billion people
| who cannot afford to pay $200 per month for various
| software will not be better off.
| zffr wrote:
| This just sounds like Google is building an API for browser
| fingerprinting. Advertisers send Google data, and get back a
| fingerprint of a user.
|
| This is only "privacy friendly" because Google limits the
| accuracy of the fingerprint provided to advertisers by bucketing
| users into cohort groups. These groups are supposed to be large
| enough to prevent advertisers from identifying individual users.
|
| Google would still retain the ability to uniquely identify
| individual users.
| cpeterso wrote:
| Chrome's "Privacy Sandbox" mentioned in the article will limit
| JavaScript's access to APIs that expose fingerprinting entropy.
| Thus publishers will feel pressure to use Google's advertising
| services and Chrome's FLoC because other ad networks won't
| monetize as well since they can't use third-party cookies or
| fingerprinting in Chrome.
| jahewson wrote:
| I doubt it. Safari already blocks 3rd party cookies, so this
| situation already exists. Every ad network will implement
| FLoC, it's an open standard - they'd be crazy not to.
| zffr wrote:
| Great point. This would give Google's advertising services an
| unfair advantage over its competitors.
|
| This really only seems beneficial to Google, not to other
| advertisers, or to end-users.
| jefftk wrote:
| _> Google would still retain the ability to uniquely identify
| individual users._
|
| In the proposal, the non-clustered data does not leave the
| user's device: https://github.com/WICG/floc
|
| (Disclosure: I work for Google, speaking only for myself)
| ocdtrekkie wrote:
| In addition to the whole fox guarding the henhouse issue, this
| doesn't address the primary harms of user tracking: That it's
| just bad for society that people are targeted and advertised to
| on this level, as it fosters filter bubbles and encourages
| unhealthy behaviors.
|
| Tracking a group of 1,000 people to serve them bad political ads
| meaningfully better than targeting 1,000 individuals with bad
| political ads.
|
| Targeted advertising needs to be treated like unfair gambling
| practices. Banned across the board, and the industry that remains
| needs to be heavily regulated and forced to be completely
| transparent about the process.
| qwertox wrote:
| I have no issues with targeted advertising as long as the ad's
| topic is deduced from the context of the page I'm visiting.
|
| Cookies aren't required for that.
|
| If I buy an MTB magazine, I do expect to see some ads from this
| or that bike-producing company (even though I'd prefer them not
| to be there).
| maweki wrote:
| > Targeted advertising needs to be [...] banned across the
| board
|
| You say that like it's an absolute that is enforceable.
| Advertising has been targeted for as long as advertising has
| existed. Advertisers have been choosing radio or billboard slots
| for well over 100 years (well, radio for 100; print and
| billboards probably for centuries), using data or educated
| guesses to reach a target demographic. Advertisers now likewise
| choose on which sites (or not) their ads should appear, in order
| to reach their target demographic. Of course, they can choose by
| a few more criteria now. How ubiquitous would a specific ad have
| to be for it not to count as "targeted" advertising?
|
| I think we need legislation, but it's not black and white.
| There's a huge grey area that spans all of advertising history.
|
| Edit: just to be clear, choosing whether to advertise in a
| newspaper (and which one), on the radio (and which channel), or
| on Facebook, is already targeting a desired audience.
| grishka wrote:
| > You say that like an absolute that is enforceable.
|
| It is possible to make it technically impossible. Not
| allowing third-party cookies is one way. This alone would
| only allow targeting by the coarse location derived from the
| IP and by the user agent string.
| ocdtrekkie wrote:
| I suppose I should clarify: _User_ targeting should be banned
| across the board. _Content_ targeting should not: Feel free
| to put ads next to particular news articles, sites, or TV
| shows.
| tomaszs wrote:
| Cookies work without any third-party business involved. The
| proposed solution won't work the same way. It will work only when
| using a third-party business's servers.
|
| It is not a replacement. It is a proposal to replace a free,
| standardized and open World Wide Web feature with a commercial
| service.
| chovybizzass wrote:
| Netscape should have named them "herpes" instead of "cookies".
| st3ve445678 wrote:
| So if you just use Firefox or Safari instead, this method won't
| work on you?
| foxhop wrote:
| Additionally, cookies are not bad; 3rd-party cookies are not even
| necessarily bad. Tracking people is bad.
| foolinaround wrote:
| Extensions will spring up that will pollute the local storage to
| help in anonymity, increase the noise and reduce the real value.
| masswerk wrote:
| Besides usual privacy concerns:
|
| Dear advertisers, I do not want to be herded into a bubble
| (designed by you or anyone else); I actually like to know the
| world around me.
|
| (And this is even more valid for the things I'm not that familiar
| with anyway. How would I learn about those segments of reality,
| if not from the advertisements that you would prefer not to
| show me?)
| chopin24 wrote:
| Let's dispense with this fiction, once and for all, that
| "targeted" or "customized" advertisements were ever for the
| users' benefit. They are, and always have been, for the benefit
| of Google and the advertisers. Google wants you to believe that
| advertisements are an inevitable, unavoidable feature of the
| web, despite the fact that they didn't want to be in this
| business when they started the company. They even go so far as
| to tug at your heartstrings and say how "hard" they work to
| make sure their product is "safe, unobtrusive, and as relevant
| as possible," implying that capturing your attention to part
| you from your money effectively is the ideal outcome. [0]
|
| This is gaslighting. Interest-based advertising on the web is
| not an immutable feature, a naturally occurring phenomenon.
| It's a scourge invented to further surveillance capitalism and
| it must be abolished.
|
| All this is to say, I'd change your letter to say:
|
| Dear Advertisers:
|
| Stop tracking me or I'll block you entirely at every turn. Your
| business model does not concern me. My attention is not for
| sale. Change, or be regulated out of business.
|
| [0]https://policies.google.com/technologies/ads?hl=en-US
| gerash wrote:
| Whoah, folks like you use free ad supported services but just
| don't like the ad part and don't want to pay either.
|
| Interest based advertising is simply optimizing ads for
| conversion rate. Slow down with all the philosophy.
| chopin24 wrote:
| I don't consent to being tracked, and I don't accept that
| advertising corporations can dictate the terms of how I
| spend my attention. I don't use ad-supported services -- I
| block ads everywhere, and pay for what is valuable to me.
| [deleted]
| st3ve445678 wrote:
| But this technology only works if you use Chrome as your browser,
| correct?
| bigsteve90 wrote:
| Good news for Google shareholders: looks like Google is returning
| to its roots, creating nightmarish ad-tech tools to further
| their goal of turning the world into a digital panopticon. Bonus
| points for simultaneously crowding out competitors and further
| solidifying their ubiquity and monopoly.
| danShumway wrote:
| It's possible to imagine an alternative system to FLoC that was
| actually privacy respecting.
|
| Say we had an Open, standardized, human-readable list of
| categories/groups that people could opt into (rather than a bunch
| of on-the-fly groupings determined by an AI). We could give users
| the ability to choose 0-X of those categories that they want to
| associate with. We could even let them choose on a site-by-site
| basis, so they could decide how ads would be targeted (or if they
| would be targeted at all) on _parts_ of the web.
|
| We could build UIs that helped them with that. We could have easy
| ways to opt into or out of categories. We could allow them to
| turn on category suggestions, so with their permission if a user
| visited a site about a specific kind of product, we could show a
| one-click option in the browser to add themselves to an
| associated category and see ads for similar products.
|
| We could allow them to group sites together and say things like,
| "I want news sites that I visit to know that I'm looking to buy a
| specific brand of car, but I don't want any of the car dealership
| sites that I'm looking at to know what brand I want."
|
| For users that don't want that level of detail, we could still
| have a 'smart' system that consumers could run (clientside) that
| looked at the websites they visited, or even more personal data,
| and auto-placed them in categories without them needing to think
| about the system at all. They'd just need to select an option to
| let the browser handle all of their categories for them.
|
| But importantly, all of this would be based on consent. And
| instead of offering users a single choice to opt out, they would
| have an entire spectrum of choices that allowed them to decide
| how they presented themselves online, what specific data they
| shared, and who they shared it with.
|
| If users genuinely benefit from targeted ads, then they'll opt
| into the system and pick categories that are relevant to them and
| send them to sites. If they think Google's data collection is
| accurate, then they'll turn on the smart system in Chrome that
| locally categorizes them. But at any point, for any site, they
| could choose to turn off the data entirely, or to add themselves
| to a specific category, or to remove themselves from a specific
| category. In human-understandable terms, they would know exactly
| what data they were transmitting to websites.
|
| ----
|
| For all that Google says they're working on data privacy, very
| few of their proposals, even their good proposals, approach
| privacy from an angle of giving users more control over their
| identities. Google is still stuck in a world where they think of
| data collection as something that has to happen without the user's
| knowledge, without the user's ability to easily inspect what's
| going on, without the user's ability to form multiple identities
| or even to just opt-into the system at all.
|
| What I want is control over my data. And what Google (and
| companies like them) keep on saying is, "we'll be somewhat more
| responsible with your data, but only if we keep control of it."
|
| And this represents a general attitude that comes up in so many
| modern tech products, from Youtube, to social feeds, to modern UI
| design, to device security. These companies are like a
| controlling, overbearing parent. People want agency over their
| ads/recommendations/feeds/etc, but the companies think the
| problem is that they're just not good enough at controlling all
| of that for us. It's a way of thinking about UX/product/process
| that's divorced from user consent and agency as ideals that we
| should strive towards.
| jahewson wrote:
| > We could allow them to group sites together and say things
| like, "I want news sites that I visit to know that I'm looking
| to buy a specific brand of car, but I don't want any of the car
| dealership sites that I'm looking at to know what brand I
| want."
|
| But this isn't how ads work. Most ads are served by an ad
| network, such as doubleclick, which does realtime bidding on
| the ad space. The exchange of cookies is between you and the ad
| network, not you and the publisher hosting the ad or you and
| the advertiser who placed the ad.
|
| Otherwise you make some great points and it seems that FLoC
| could well provide such tools for the user, because their
| profile is now rich and client-side instead of being stored on
| some tracker server and keyed by an opaque cookie id.
| cookiengineer wrote:
| What this means is not that google has found a privacy-friendly
| alternative.
|
| It means that Google has found out that among 1000 people, your
| browsing criteria with HTTP headers alone is unique enough to
| identify you with 95% accuracy, which is actually even more
| frightening.
| Ashanmaril wrote:
| Can Google tell me who these people are so we can start hanging
| out? These dudes sound sick
| brabel wrote:
| If you want to protect yourself against that, use Firefox with
| most of the recommendations from here:
| https://www.privacytools.io/browsers/#about_config
|
| This seems to work pretty well for me (Google got really
| confused, thinks I am on Windows now - I am not).
|
| Just don't turn off media.gmp-widevinecdm if you want to keep
| watching DRM-protected content (e.g. Netflix)!
| cookiengineer wrote:
| This is wrong advice. Firefox's unique handling of asset
| loading, including how ETag and Accept headers are processed
| (e.g. loading a PNG as a <script src> and as an <img src>),
| makes Firefox always uniquely identifiable.
| mike_d wrote:
| Are you saying that you can identify Firefox among a group
| of different browsers, or me among a group of different
| users?
|
| If it is the latter, please file a bug against Firefox to
| get it fixed.
| telesilla wrote:
| Would an extension that sets random headers be a solution to
| blurring identity?
| cookiengineer wrote:
| > Would an extension that sets random headers be a solution
| to blurring identity?
|
| Nope, because the order in which a browser engine loads assets
| is different in Chrome vs. Firefox vs. Edgium, too. Combine that
| with Firefox's messed-up Accept header and you'll have identified
| the Tor Browser users, for sure. "@supports" in CSS is
| additionally an effectively unblockable way to track users, as it
| varies uniquely per browser version as features of CSS get
| implemented and/or get fixed.
|
| Usually traffic analysis for a client (with a specific ETag
| header) is enough to uniquely find out whether it's the exact
| same machine, hence that's what the header is made for.
|
| The approach behind my browser tries to actively modify the
| contents of said malicious HTTP headers and to rewrite the
| HTML, CSS and other assets in order to force-cache everything
| and to offload as much traffic as possible to surrounding
| peers. [1] But it's far from production-ready.
|
| There's also a frightening amount of CSS features that can be
| used to track users very easily. @supports, @media, and a
| combination of <link media=""> and "srcset" attributes in a
| quick prototype was enough to track every client with around
| 98.3% accuracy, and I decided not to release the
| fingerprint.css project due to concerns about how it might be
| abused in the wild.
|
| Especially with unicode behaviour inside the CSS files
| themselves. CSS ident-tokens [2] are specified as "non-ASCII"
| so they can be emojis, too. And those have varying support
| across all Browser versions due to the ICU library being
| embedded in them (and being absolutely unique in every single
| subminor release I've tested so far).
|
| [1] https://github.com/tholian-network/stealth
|
| [2] https://www.w3.org/TR/css-syntax-3/#tokenization
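|
| The above describes pure-CSS probes; a rough JS analogue of the
| same idea (feature support as fingerprint bits), as a sketch
| only:
|
|     // Each probe contributes one bit; the pattern narrows down
|     // the exact browser build. A CSS-only version does the same
|     // with @supports/@media blocks that each request a different
|     // background URL.
|     const probes: Array<[string, boolean]> = [
|       ["grid", CSS.supports("display", "grid")],
|       ["subgrid", CSS.supports("grid-template-rows", "subgrid")],
|       ["backdrop", CSS.supports("backdrop-filter", "blur(2px)")],
|       ["hover", matchMedia("(hover: hover)").matches],
|       ["p3", matchMedia("(color-gamut: p3)").matches],
|     ];
|
|     const bits = probes.map(([, ok]) => (ok ? "1" : "0")).join("");
|     console.log(bits); // e.g. "11010", combined with other signals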
| sitkack wrote:
| That is a great question and no, mixing in noise makes it
| harder but doesn't make it impossible.
|
| Mixing in noise feels like a solution because it is hard for
| a layperson to see how a signal can be extracted, proof by
| "difficulty to me". If you mix in noise that changes averages
| then you are removing signal. If you add in random noise,
| each individual measurement deviates, but the limit of the
| average will be the same value as before mixing.
| k_ wrote:
| It might just be some easy to ignore noise.
|
| Worst thing is, unless this is used by a very large chunk of
| the population, it would even be another tool to identify
| you.
| whiw wrote:
| > it would even be another tool to identify you
|
| How exactly? It seems like a difficult problem for a
| website to me.
| k_ wrote:
| They can use "headers include random things" as a filter
| like they do with other headers presence/absence/value
| already. I don't think defining "random things" would be
| too hard for them.
| whiw wrote:
| 1. Noise that looks like valid signal is the most
| difficult kind to remove.
|
| 2. They would have to track the noise over page requests
| to know that it was noise. The saving of and correlation
| of the saved state would be a pain.
| jvzr wrote:
| Random would be even more unique. The solution (to this
| particular problem) is for a majority of users to send the
| exact same headers, regardless of the reality.
|
| Some browser vendors have started to remove the specific
| version from the User-Agent string, for instance. The Tor
| Browser window opens at a fixed default size specifically to
| make that value (browser width & height) the same across all
| its users and improve their privacy (by making that value
| useless in finding uniqueness).
|
| Hope that makes sense, sorry if I mis-explained some things
| quicklime wrote:
| Yes, random values would be more unique, but if those
| values are changed on each request, wouldn't that make me
| harder to track?
|
| For example suppose my user agent string (and canvas
| fingerprint, accept header, etc) was different on every
| request. Would this be enough to stop ad networks from
| correlating each of my very-unique requests, and prevent
| them from tracking me across different pages?
| mywittyname wrote:
| This has been known for years though. There have been a few
| sites out there over the years which would tell you how
| identifiable you are based entirely on what headers your
| browser sends.
| Triv888 wrote:
| Maybe Google was already using these methods but didn't make it
| public until cookies got a bad enough reputation.
| b3kart wrote:
| Interesting: could you point me to one of those sites?
| geektips wrote:
| https://coveryourtracks.eff.org
| josefx wrote:
| At least for me WebGL seems to be privacy cancer. Can't
| say I am surprised.
| dang wrote:
| Recent and related: https://news.ycombinator.com/item?id=25813601
| flerchin wrote:
| It's not clear to me why we need third-party cookies to be a
| technology that browsers support. Just axe them. No replacement.
| t0mas88 wrote:
| Because Google makes billions on selling ads on other sites
| through their DV360 / Google AdX products. Those ads need
| third-party cookies for targeting, otherwise the price drops by
| a factor of 5.
|
| Google also happens to make the browser with by far the largest
| market share. So they're not going to axe third-party cookies
| as long as it drops their revenue 5x.
| grishka wrote:
| Yeah. I don't understand why it's taking so much time and
| effort and debate when all it would really take is literally a
| single line of code to change the default value of the setting
| that blocks or allows third-party cookies. I'd bet most users
| won't even notice the difference.
|
| The original RFC that introduced cookies specifically said that
| third-party cookies aren't permitted. Then Netscape broke it.
| Then everyone else did. It's about time browsers become spec-
| compliant.
| vorticalbox wrote:
| Is this not how Brave ads work, with a local profile that fetches
| ads your profile says you should be interested in?
| [deleted]
| PedroBatista wrote:
| Just the same stairs, but now with only one giant step, and Google
| is the one with giant legs.
|
| Also, there's no "privacy-friendly" tracking technology; it's an
| oxymoron and slick marketing/corporate strategy (that works).
| alisonkisk wrote:
| Read this explainer: https://github.com/WICG/floc
| chopin24 wrote:
| I'm happy that HN has accepted a change in the article's title,
| which appears as "Google says it may have found a privacy-
| friendly substitute to cookies." This terminology -- "found" --
| has been used by Google and others to imply that their capture of
| behavioral "exhaust" is somehow a natural phenomenon, rather than
| a conscious, deliberate, profit-driven choice. Google didn't
| "find" a substitute. They are _developing it_ because they are
| getting pushback from users and companies who object to their
| tracking methods and they're desperate to find something that
| convinces users they've Really Changed This Time.
|
| Don't fall for it. Break up with Google. They are abusive.
| crazygringo wrote:
| I'm very attentive to wording but I think you're reading too
| much into this one.
|
| In programming and in business you talk about "finding" a
| solution to a problem all the time. In this case, the problem
| is how to improve privacy without advertising revenue dropping
| off a cliff. And it's not like the solution is staring you in
| the face -- it takes iteration and testing for it to be
| "found".
|
| So I don't think Google is being disingenuous here. Nothing is
| being implied as a somehow natural phenomenon. Business in
| general is about "finding" satisfactory solutions to problems
| day in and day out. "Find" and "develop" are essentially
| synonymous and interchangeable here.
| chopin24 wrote:
| I'm not comfortable assuming the best intentions from a $1.2T
| company whose business model relies on tracking my behavior.
| They burned that bridge after the war-driving "Wi-Spy"
| scandal, which went on from 2007-2010. [0].
|
| It's clear that Google sees a threat to their business model,
| and they'll use any PR-friendly language they can to convince
| people that they're addressing the user concerns. Just like
| they did in 2010, when they ascribed their willful
| malfeasance to a "rogue engineer" who they then put in charge
| of StreetView.
|
| If I'm "reading too much into it" it's because we
| collectively haven't been reading enough into it for the past
| 15 or so years, and in that time our Overton window has
| shifted too far.
|
| [0]https://www.theregister.com/2019/07/23/google_wispy_payout
| /
| buttersbrian wrote:
| You, myself, everyone -- we're jaded. Got it.
|
| Not much concerned with descriptions, so what's your
| prescription:
|
| What do you believe to be satisfactory wording in this
| case?
| spinningslate wrote:
| this. I'm fine with GP's use of "finding" a solution. Yes,
| they found one; finding solutions is a central process in
| software.
|
| But I'm absolutely not fine with the wording. When
| confronted with the phrase "Google says it may have found a
| privacy-friendly substitute to cookies", I'd wager most
| people would think "this improves my privacy in a general
| way".
|
| It doesn't. What it actually means is: Google will make it
| more difficult for others to track you, but Google will
| remain committed to tracking everything about you that it
| possibly can. Except silently, and without your ability to
| opt out using ad blockers etc.
|
| Net result: even less control over your privacy, and Google
| further entrenches its monopoly to boot.
|
| That's not an improvement in privacy.
| Hasu wrote:
| > That's not an improvement in privacy.
|
| Okay, that's obviously not true from your own comment,
| you also say:
|
| > Google will make it more difficult for others to track
| you
|
| So if Google can still track me at the same level, but
| others can no longer track me, that IS an improvement in
| privacy. Not from Google, but from everyone else.
|
| I get and agree with your point about how it's an
| advantage for Google because they can make their tracking
| harder to avoid and lock others out, but I find it hard
| to sell "This is not an improvement in privacy" as being
| unequivocally true.
| RcrdBrt wrote:
| Google will track you more since adblockers will do
| nothing against this new practice.
|
| So, again, this is not an improvement for users' privacy.
| Hasu wrote:
| This does not address my point at all, which is that if
| it locks out OTHER parties that currently abuse third
| party cookies from tracking, there's a substantial
| improvement for user privacy from those parties.
|
| Not to mention that while adblockers currently do not do
| anything against this practice, this does not mean that
| adblockers can never come up with anything to block this
| tracking.
| jand wrote:
| > So if Google can still track me at the same level, but
| others can no longer track me, that IS an improvement in
| privacy.
|
| This is only true if Google does not sell the gathered
| information to others. If they sell the data, the net
| privacy gain is nil.
| Hasu wrote:
| Google doesn't sell your personal information to anyone.
| They sell ads that are targeted based on your personal
| information. Selling your information directly would
| entirely negate their market advantage in advertising, it
| would be suicide for Google. Facebook has been known to
| do this, Google has not.
| chopin24 wrote:
| Not saying you are one such person, or that you are doing
| so intentionally, but this argument is a clever sleight
| of hand employed by surveillance capitalists and their
| apologists to deflect attention away from the real issue:
| that thousands of well-paid, highly intelligent engineers
| devote 40+ hours a week to coming up with ways to
| influence your behavior.
|
| "Selling personal data" -- as if your particular affinity
| for left handed baseball gloves were of special interest
| to large corporations -- is a red herring. Let's stop
| perpetuating it.
| ceph_ wrote:
| But there are companies that sell personal data. Google
| is not one of them. The phone companies sell your
| location. There are regular articles about companies
| buying up chrome extensions to harvest/sell browsing
| data. Etc
| subaquamille wrote:
| OP also says > but Google will remain committed to
| tracking everything about you that it possibly can.
| Except silently, and without your ability to opt out
| using ad blockers etc.
|
| Also, others may no longer track you at the moment, but
| they definitely _will_ in the future.
|
| So... Google has breached a bit deeper into our privacy and
| paved the way for others to follow them. I can't help
| seeing them as evil.
| btilly wrote:
| You are reading way too much into that incident.
|
| At the time, "Engineer put in feature not asked for."
|
| Later, "Upon full examination, engineer put description of
| feature in piece of paper shoved in front of busy manager,
| and told selected co-workers what he had done." (None of
| whom, when the shit hit the fan, should be expected to
| stick their necks out.)
|
| Neither version suggests that the feature was something
| reflective of corporate policy, or would have had support
| from higher ups if they knew about it. Also, said engineer
| turns out to be a very good programmer. Which explains the
| company's decision to try to keep him and correct his
| behavior rather than immediately firing him.
| chopin24 wrote:
| That Google's management is so unaware of their data
| collection software that they allowed engineers to drive
| around spying on people for three years does not inspire
| trust. Incompetence or malice is beside the point.
| Google did not take responsibility for the error, and in
| fact stonewalled Congress for more than a year when it
| was investigated.
| btilly wrote:
| What do you expect them to do? Randomly sample the
| internal data formats? Or verify that the official
| outputs look right?
|
| Google specializes in automation at scale, not lovingly
| handcrafted data.
| titzer wrote:
| > ...without advertising revenue dropping off a cliff.
|
| Ah, the 'ole "right to my business model" attitude. Not
| blaming you in particular for this, but it's pervasive.
| Needless to say, I thoroughly and utterly disagree that any
| business has a right to _any_ business model, particularly
| one that robs me of my time, attention, and computational
| resources.
| xtiansimon wrote:
| Haha! Yes to both. Replies!
| frogpelt wrote:
| You said "they are desperate to find something".
|
| They said they "found" something. I see similarities.
| local_dev wrote:
| Agreed. When I saw this title and read the article, the first
| thing I thought was "Ok, how do I block this/opt out". It
| sounds like this is only in Chrome though, for now. One hopes
| that FF will continue to be privacy focused and not add
| anything like this.
| suddenexample wrote:
| Company: Supports measures that lead to additional privacy
| because they realize how important it is to their users
|
| HN: Don't be tricked, Google is evil.
|
| Like I understand WHY there's a hate boner for Google. I just
| don't understand why people think it's bad for them to
| acknowledge the preferences of their users and make decisions
| accordingly.
| ogre_codes wrote:
| > I just don't understand why people think it's bad for them
| to acknowledge the preferences of their users and make
| decisions accordingly.
|
| Aside from that. Why on earth should we _trust_ Google will
| limit their collection to this one method? They've lied over
| and over about what they collect. Been caught multiple times
| breaking laws to collect information only to say "Oops, it
| was a rogue engineer". They've been caught bypassing the no-
| tracking flag in browsers. They've been caught abusing
| location data after users disabled it.
|
| As the saying goes: Burn me once, shame on you. Burn me 750
| times, why in the fuck am I still using Google?
| ocdtrekkie wrote:
| Because the preferences of their users would be _to stop
| tracking their users_. You don't need a replacement, you
| just need to get rid of it.
|
| There's a significant amount of gaslighting to claiming that
| you value user privacy whilst developing new ways to track
| users.
| buttersbrian wrote:
| Are you sure? I think there's lots of consumers that find
| the tradeoff okay - in fact I hear people state as much all
| the time. Maybe they don't know the extent of the tracking,
| or they don't care, but the opinion exists and it's not
| negligible.
|
| If companies could anonymously track users, and still
| maintain the marketing backbone of the internet I think
| most people would be fine with it -- in fact, prefer it.
| ogre_codes wrote:
| > Are you sure?
|
| If users are given the option in clear terms, most users
| will turn off tracking. Facebook knows this, it's why
| they are so pissed off at Apple and have taken such an
| aggressive public stance against Apple. Google knows it,
| it's why they haven't published an update to any of the
| iPhone apps since Apple started requiring their apps
| report what end user data they collect.
|
| Google and Facebook are sure... not sure anyone else is
| more qualified on this.
|
| > If companies could anonymously track users, and still
| maintain the marketing backbone of the internet I think
| most people would be fine with it -- in fact, prefer it.
|
| If this were true, why doesn't Google, Facebook, and
| others give us straight-forward ways to opt out? If
| people would prefer it, why exactly is Facebook trying so
| damned hard to prevent Apple from giving people a simple
| opt out?
| buttersbrian wrote:
| > If this were true, why doesn't Google, Facebook, and
| others give us straight-forward ways to opt out?
|
| I don't think anyone expected them to just flip the
| switch and do that without a reasonable (maybe only to
| them?) alternative. I can say that the idea of using
| 'cohorts', as discussed in this FLoC approach, is, from
| what I can tell, positive progress. Is it far enough?
| Perhaps not.
|
| > why exactly is Facebook trying so damned hard to
| prevent Apple from giving people a simple opt out?
|
| Good question. I am not aware of that issue.
|
| I also question Apple, as they take payment from Google to
| the tune of billions of dollars for search, push 'beacons',
| etc., while promoting themselves as a bastion of privacy
| and security. None of it is as simple as it seems.
| ogre_codes wrote:
| > I don't think anyone expected them to just flip the
| switch and do that without a reasonable (maybe to just
| them?)
|
| No more than anyone expects a heroin addict to stop cold
| turkey. The problem is Google isn't stopping or giving
| people the option to opt out, they are just changing
| tactics slightly.
|
| This also doesn't really talk about how this data gets
| integrated into the rest of the profile Google has built
| and will continue to build on users (without their
| permission) based on their search history, mapping,
| email, etc.
|
| Google isn't
| danShumway wrote:
| Would they prefer it enough to opt into that tracking and
| targeting?
|
| If users had to go to a setting to turn on targeted ads,
| what percentage of them do you think would do it? I
| suspect it would be pretty low. I wonder if most people
| would even notice that the setting had been turned off?
|
| We use the opt out model all the time to justify why
| users don't actually care about tracking -- we say that
| they'd opt out if they did care. But I feel like we all
| mostly know that an opt in system would also not see much
| use (that's the reason why ad networks are so opposed to
| them), and I don't know why we don't consider that to be
| evidence that consumers probably don't value targeted ads
| very much at all.
| buttersbrian wrote:
| > Would they prefer it enough to opt into that tracking
| and targeting?
|
| I believe a sizable portion would. They like the
| targeted offers and ads. Maybe because they enjoy the
| feel of something being catered to them, maybe because
| they are addicted to shopping/consumerists. IDK.
|
| > If users had to go to a setting to turn on targeted
| ads, what percentage of them do you think would do it? I
| suspect it would be pretty low. I wonder if most people
| would even notice that the setting had been turned off?
|
| I think this is a really good question. The power of opt-
| in vs. opt-out, as you noted.
|
| However, I don't know if we can conclude whether they
| value it or not solely from their willingness to opt-in.
| We really have to account for how the ability to opt in
| is exposed. If we showed it on every site (akin to the
| cookie-accept craze of today), we'd see a lot of people
| opt in. If it were hidden in Chrome's settings, far fewer
| would, just because that's mentally off limits for many,
| and easily forgettable.
|
| I totally agree with you on the somewhat sinister
| motivation of opt-out over opt-in patterns.
| josefx wrote:
| > I think there's lots of consumers that find the
| tradeoff okay
|
| How often do consumers even get asked? My webmail
| provider seems to have no issue offering both paid and
| ad-supported plans. Other services just pulled the paid
| plan out from under my feet. WhatsApp, with its new terms
| and conditions, once had a small yearly fee; Facebook
| dropped it. User choice? Certainly not mine.
|
| > If companies could anonymously track users
|
| That is like trying to identify a suspect using a smiley
| face. If they track you it isn't anonymous.
|
| > and still maintain the marketing backbone of the
| internet I think most people would be fine with it
|
| Why do we need targeted ads? Websites usually have topics
| they are focused on, is it wrong to show car ads on a
| page for car enthusiasts? On a news story showing a newly
| released car?
| buttersbrian wrote:
| > How often do consumers even get asked?
|
| Purely anecdotal that I am drawing from -- I've had this
| discussion with quite a few non-tech folks over the last
| few years privacy/tracking has hit the zeitgeist.
|
| Many dismissively state something like, "I know. Don't
| care. Means stuff's free, right?", or "I'm not doing
| anything wrong, I don't care".
|
| > That is like trying to identify a suspect using a
| smiley face. If they track you it isn't anonymous
|
| By that I mean regulations around what they track,
| identifiable data, not being able to explicitly say
| User2021 === Josefx on the system. I think this is why
| Google is going with the 'cohorts' in their FloC
| approach.
|
| > Why do we need targeted ads?
|
| Good question. "Need", probably not. But if I am on
| facebook, and ads are going to happen, do I want highest
| bidder ads like "Find Hot Milks in your Area Now"
| interspersed between my feed's family baby photos or an
| add for "World's best Uncle" T-Shirts? There's a happy
| medium somewhere.
| PoignardAzur wrote:
| Don't you need at least _some_ tracking to avoid fraudulent
| clicks?
|
| I'm as pro-privacy as they come, but until someone comes
| along with the incentive or dedication to build an alternate
| payment ecosystem out of nowhere, ads _are_ what the web is
| built on.
| hhjj wrote:
| I may begin to trust Google when they provide a single switch
| to disable ALL tracking on an Android phone (or, better,
| disable it all by default). That's my preference, and it
| isn't respected.
| bmcahren wrote:
| If you read into the federated technology they've been
| deploying I'm fairly comfortable saying I agree with their
| decisions.
|
| Let's take a look at the "Now Playing" architecture available
| on Pixel devices.
|
| At first glance, a critic thinks "You're crazy for giving
| Google permission to have your microphone always on and
| listening for songs you're hearing, privacy this, privacy that".
|
| If you read into it, you'll be comforted to know they've built
| a model that generates signatures client-side, which can be
| compared on-device against a list of known-song signatures.
| Then, as far as I understand, they are able to take signatures
| which contain no discernible audio data and use those to
| discover new audio trends.
|
| > On Pixel 4 and later phones, the counts of songs recognized
| are aggregated using a privacy-preserving technology called
| federated analytics. This will be used to improve Now Playing's
| song database so it will recognize what's playing more often.
| Google can never see what songs you listen to, just the most
| popular songs in different regions.
|
| Privacy-preserving, user-beneficial, and useful for advertising
| targeting if you haven't opted out of interest based ads.
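|
| A simplified mental model of the on-device matching step, as
| I understand it (my own sketch in Python; the real system
| uses learned audio embeddings, so names and shapes here are
| made up):
|
|       import numpy as np
|
|       def match_on_device(clip_vec, db_vecs, db_titles,
|                           thresh=0.9):
|           # Compare a locally computed signature against the
|           # downloaded database using cosine similarity. The
|           # raw audio never has to leave the device; at most
|           # the matched title does.
|           sims = db_vecs @ clip_vec / (
|               np.linalg.norm(db_vecs, axis=1)
|               * np.linalg.norm(clip_vec))
|           best = int(np.argmax(sims))
|           if sims[best] < thresh:
|               return None
|           return db_titles[best]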
| jrochkind1 wrote:
| As far as "now playing",what they say the software does may
| be quite clever in privacy preserving ways.
|
| But once you have given them access to your microphone, you
| have to trust that their software does what they say it does,
| without mistakes or bugs (whether in design or
| implementation) or accidental security vulnerabilities
| (possibly maliciously introduced by the NSA or who knows).
|
| If you do not give them access to your microphone (assuming
| the OS access controls themselves work; but that's a much
| smaller attack surface), you do not need to understand or
| trust anything.
| pseudosavant wrote:
| And this is from a company that forgot to tell people that
| the flashy smart thermostat they bought last year has had a
| mic in it the whole time.
|
| They are a company that will only pay attention to privacy
| when forced to by an existential threat. It just isn't in
| their company DNA to care about user privacy. They aren't
| the customers.
| hertzrat wrote:
| What do you mean a microphone in a thermostat? You mean
| nest?
| deadmutex wrote:
| I know Ecobee Thermostat has a mic built in... but not
| Nest Thermostat.
|
| So, OP is likely mistaken in their comment.
| deadmutex wrote:
| > that the flashy smart thermostat they bought last year
| has had a mic in it the whole time.
|
| Note to readers: This is false.
|
| EDIT: If you're downvoting, please provide evidence.
| There is a lot of misinformation out there, and OPs post
| increases it.
|
| EDIT #2: Here is Rishi Chandra, GM of Nest: "Putting a
| microphone on a thermostat, I actually don't think makes
| any sense"
| tempest_ wrote:
| https://www.cnet.com/news/google-calls-nests-hidden-
| micropho...
|
| It was in their security hub which is perhaps better or
| worse than the thermostat depending on your view.
| deadmutex wrote:
| My understanding is that security hub announced glass
| break detection from day 1. And that feature uses a
| microphone to listen to glass breaks... so I wasn't
| surprised. But, I guess that's not obvious to everyone,
| so they could've put it on the box.
|
| And, I just didn't want HN readers to think there was a mic on
| the thermostat, so I was correcting that.
| frenchy wrote:
| Having a microphone to detect broken glass is very much
| not obvious. As someone completely unfamiliar with the
| problem space, I would have assumed the normal
| solution was something along the lines of: run a
| current through the glass and check the voltage "drop".
| deadmutex wrote:
| I could see this if you connected the device to the
| system that monitored this. But since you never do that
| in the install process, I am not sure why you'd assume
| that's how the system would work.
| PhantomGremlin wrote:
| _you have to trust_
|
| Remember when Google sent hundreds if not thousands of cars
| all around the world and 'accidentally' hoovered up massive
| amounts of information?
|
| I'm sure it was all an innocent mistake. Google are
| certainly worthy of our trust! /s
|
| https://www.theguardian.com/technology/2010/may/15/google-
| ad...
| danShumway wrote:
| > If you read into the federated technology they've been
| deploying I'm fairly comfortable saying I agree with their
| decisions.
|
| I don't, not in this case. The only thing that FLoC seems to
| change is how data is aggregated and how buckets are
| determined. But fundamentally, the idea of taking users,
| putting them into a box based on their normal browsing habits
| behind the scenes, and then broadcasting that box and
| associated data to every website they visit -- that's just
| not a private model.
|
| What Google doesn't seem to understand (or chooses not to
| understand) is that the end result of bucketing users and
| sharing data about them behind the scenes while they browse
| _is the part_ that many people object to. So Google keeps on
| trying to come up with systems that allow them to serve
| different content to people and to collect demographic info
| based on variables and processes outside of users' control
| -- but to somehow do it in a way that is magically not a
| problem.
|
| But it's like trying to create a 'nice' mugging. It's not
| just the methods I'm opposed to, it's also the end goal.
|
| FLoC still doesn't give users control over how they present
| themselves on the web. And part of privacy -- part of the
| reason I care about privacy in the first place -- is because
| people should have control over how they present themselves
| on the web. There are tools Google could build if they wanted
| to go in that direction, but FLoC remains an opaque system
| that runs in the background that collects data about you and
| sends it to every website that you visit. That's not a
| private system, regardless of how the data is collected. It's
| not designed to be transparent, it's not designed around user
| consent.
|
| Honestly, it shouldn't even be an opt-in/opt-out system. Why
| can't I choose what buckets I belong to? Google isn't
| thinking deeply about user choice, they're not even being
| remotely imaginative about how they could give users more
| power over what ads are shown to them. They're still stuck in
| a mindset of "this needs to happen behind the scenes outside
| of your control where you don't know what we think about you.
| And we'll let you opt out of the entire system purely because
| we're forced to. But nothing else!"
| Kronopath wrote:
| This is actually a really good point. A lot of privacy-
| related things people complain about are actually related
| to how you present yourself, how your identity is seen by
| the computer system you're interacting with.
|
| That's been on my mind a lot lately, so much that I wrote a
| thing about it: https://kronopath.net/blog/segmented-
| identity-as-necessary-f...
| Justsignedup wrote:
| This is a great point. What is "found"? Like the point is "I
| don't want 3rd party websites to be AT ALL AWARE that I visited
| another site". How can you find an alternative? I don't want an
| alternative, I want it gone.
|
| If Google does this, and other browsers don't, it would mean
| that just using another browser like Firefox will effectively
| hide you. Hoping for that.
| topspin wrote:
| This is what EFF says about this scheme:
|
| "A flock name would essentially be a behavioral credit score: a
| tattoo on your digital forehead that gives a succinct summary of
| who you are, what you like, where you go, what you buy, and with
| whom you associate."
|
| https://www.eff.org/deeplinks/2019/08/dont-play-googles-priv...
|
| BTW, Chrome users have been part of this system for nearly a year
| now.
| danlugo92 wrote:
| https://adssettings.google.com/
| jrochkind1 wrote:
| > BTW, Chrome users have been part of this system for nearly a
| year now.
|
| The OP says "The API exists as a browser extension within
| Google Chrome," which made me think it was a separate browser
| extension that I would not have if I had not chosen to install it.
|
| Are you saying it's built into Chrome instead? Cite? And if so
| is there any way to disable it?
| qwertox wrote:
| This sounds like the YouTube-algorithm applied to the
| advertising web.
|
| One false click and you're damned for weeks.
| spideymans wrote:
| I must not be the only one who watches YouTube in private
| browsing mode, so one wrong video won't totally ruin the
| recommendations.
| wutbrodo wrote:
| Just click Not Interested a couple times and the faulty
| recommendations go away, right?
| fartcannon wrote:
| You can use YouTube logged out. I have an RSS reader for all
| my favourite channels.
| numpad0 wrote:
| You can use the web logged out but you can't escape Google
| trying to convince me I'm from India.
| kevincox wrote:
| Although the RSS feed is very basic: no descriptions or
| thumbnails. Also, premieres show up before they are available,
| among other inconveniences.
| recursive wrote:
| But you can't use Youtube premium logged out.
| TameAntelope wrote:
| EFF tends to... sensationalize things more than I'm comfortable
| with.
| jjcon wrote:
| I wonder why that is, because I feel like they didn't use to
| be as hyperbolic or dramatic.
| jascii wrote:
| I suspect part of that might be a reaction to us (as in the
| population in general) getting used to privacy violations.
| It takes a bit more drama to get our attention in a world
| where we are all willing to voluntarily carry a personal
| tracking device 24/7...
| notatoad wrote:
| EFF has been taken over by privacy zealots. They used to be
| more focused on what their name says: freedom. That is,
| fighting censorship and regulation of the internet.
|
| And the privacy folks love their hyperbole
| throwaway287391 wrote:
| > what their name says: freedom.
|
| Not necessarily disagreeing with your actual point, but
| neither of the F's in EFF stands for "freedom".
| cpmsmith wrote:
| It's hardly a new idea to count privacy as part of
| freedom.
| notatoad wrote:
| Yeah, but privacy used to be a part of the mission, with
| the understanding that privacy was a good thing but not
| _the only thing_. It seems like over the years the focus
| has shifted to privacy being the ultimate goal, and every
| other aspect of the mission happens in the name of
| privacy. And there's no acceptance of the idea that some
| well-informed people are willing to make a decision to
| sacrifice some privacy in exchange for convenience or
| cost, anybody who considers sacrificing some of their own
| privacy is treated as a moron who needs to be protected
| from themselves.
| cpmsmith wrote:
| They've definitely zeroed in on it. I think it is
| understandable, though, from their position--privacy
| seems most acutely at-risk, at least stateside, and the
| vast majority of people affected aren't among those well-
| informed.
| TameAntelope wrote:
| I've always chalked it up to fundraising strategy, but I
| admit my opinion is reactive, I haven't done any research.
| syshum wrote:
| EFF tends to be far more timid than what I am comfortable
| with.
| protomyth wrote:
| The EFF needs to get to the point that politicians
| seriously consider fighting them. It's the only real way to
| effect change in the US. As we have seen multiple times,
| the other folks keep introducing bills over and over again.
| That needs to be a hard stop and the only way to get there
| is fear because logic just doesn't work for tech.
| dimitrios1 wrote:
| And the only way to get politicians to seriously consider
| fighting for you is:
|
| - give them money
|
| - offer them a political advantage over their opponents
|
| - build up enough grassroots support among their
| constituents that not supporting your positions would be
| effectively career suicide
|
| The best organizations utilize all three.
| hyperdimension wrote:
| I completely agree with you, but...
|
| > because logic just doesn't work for tech
|
| is more than a little ironic to me. I suppose it's like
| people saying you can't 'out-logic' a judge through a
| technicality, or that laws are not meant to be Perfect
| Laws of Logic; they're designed to be interpreted by
| judges.
|
| I admire the EFF though, and I support what they advocate
| for. I was going to say I'd like them to be more
| pragmatic, but I suppose their hardline opinions are most
| of the reason they exist as an advocacy organization.
| protomyth wrote:
| Logic is fine for technical folks and a lot of the general
| public; it doesn't work very well in the political arena,
| where waving the bloody shirt is the norm. Logic is a
| poor weapon in an emotional debate and doesn't work worth
| a damn.
| hyperdimension wrote:
| True. I just found that quote ironic is all.
|
| To your actual point, though, I think you've totally
| nailed it. I'm not sure if emotion and politics are
| permanently inextricable, but from my limited,
| unfortunate experience, it seems as though any kind of
| logical argument doesn't or can't (!) sway _anyone_
| (myself, of course, included!)
| sitkack wrote:
| So you give the EFF what kind of TameAntelope score? 0.3 ?
| TameAntelope wrote:
| I love the EFF! I just don't like this one specific aspect
| of the articles I see written from them.
|
| It's hardly a big deal, I almost regret commenting about
| it.
| sitkack wrote:
| I give you a 0.9 sitkack score (quite good actually) for
| the honest response. :)
| hetspookjee wrote:
| Is a sitkack score a thing or am I missing out on a joke?
| pdpi wrote:
| Their usernames
| Shebanator wrote:
| I donate money to the EFF regularly, but their click-
| baity hysteria annoys me too.
| inopinatus wrote:
| Google openly chose the collective noun for sheep as the
| name of a technology for labelling humans; I don't think the
| EFF's response should be the primary source of discomfort
| here.
| azornathogron wrote:
| This made me laugh, and you're not wrong exactly, but I
| will note that "flock" is also used for _birds_ , and birds
| are frequently used as a metaphor for freedom just as much
| as sheep are used as a metaphor for mindless group-
| following behaviour.
|
| Either way, I wouldn't read too much into it.
| jefftk wrote:
| These proposals have generally been named after bird
| things:
|
| https://github.com/michaelkleber/pigin
| https://github.com/WICG/turtledove
| https://github.com/WICG/sparrow
| https://github.com/WICG/turtledove/blob/master/TERN.md
| https://github.com/prebid/identity-
| gatekeeper/blob/master/pr...
|
| I believe "FLoC" comes from the phrase "birds of a
| feather flock together", which is a decent metaphor for
| how the proposal works
|
| (Disclosure: I work for Google, speaking only for myself)
| safog wrote:
| You should really investigate your biases if the first
| thought that comes to your mind is that Google chose flock
| because it thinks customers are sheep.
| fgonzag wrote:
| Of course Google does not think its customers are sheep.
| Google thinks its users are sheep. Chrome users are not
| the customers.
| inopinatus wrote:
| My first thought was birds, so perhaps I might turn that
| around and recommend a reciprocal investigation. Because
| only on further reflection did I see the darker side, and
| realise the likes of Google do not get to make the "oh-
| we-didn't-realise-that" argument.
| warkdarrior wrote:
| The nice thing about this new Google technology is that
| it makes it easier to fleece a flock of users, while
| making the users think they have privacy.
| dehrmann wrote:
| They're not quite as aggressive as Greenpeace, but they have
| the same mainstream credibility problem.
| [deleted]
| Hnsuz wrote:
| The whole of Google should unfind itself
| qwerty456127 wrote:
| I don't believe Google because it could have eliminated all the
| ways to track people without their consent long ago if it wanted.
| Almost everybody uses Chrome/Blink and agrees to everything they
| decide. They can define and deprecate almost whatever browser
| APIs and behaviors they want. But they don't, because they are
| the single biggest actor making use of these ways. E.g. it is
| known that Google's captcha doesn't simply tell them you are a
| human; it tells them which particular human you are.
| voicedYoda wrote:
| Can you explain, or provide reference points for the captcha
| knowing "which particular human" i am?
| SquareWheel wrote:
| It's utterly untrue. reCaptcha gives you a score of
| "humanness", and you can decide to allow or deny an action.
| There's no way to get an ID from it.
| [deleted]
| frongpik wrote:
| sudo apt install chromium-browser
| craftinator wrote:
| "In a landmark decision, Google has decided to continue keeping
| the "Don't be evil" principal off of their Company Principles
| list"
| yalogin wrote:
| They did not find a replacement for cookies. Cookies have become
| too toxic and are harming them, so they found a way to not store
| anything on the device and yet still be able to target users.
| This means it will be impossible to stop them from profiling and
| targeting users, as users don't control anything.
|
| They are using privacy-preserving techniques, and even if we
| assume they are doing it well, it just means that we will never
| get rid of the profiling, and paying to get privacy will not
| happen with Google services.
| jahewson wrote:
| If you read the 4th bullet point in the article you'd see that
| the data is actually stored on the device. Users will go from
| seeing only cookies, which are opaque ids, to their full
| behavioral profiles - basically, the opposite of what you just
| said.
| gnud wrote:
| This mainly looks like a push to remove even more user control
| over tracking. The spec says that the browser can return 'random'
| data, but I suspect that Chrome won't let you do that, or at
| least not for long.
|
| Instead of the user disabling third-party cookies, every single
| page author would have to set a new HTTP header to have their
| page excluded from the machine learning.
|
| It also looks like a massive GDPR pitfall. Say you track
| conversions to a campaign, and track what "cohorts" a user
| entered your sales pipeline through. The moment you connect this
| data to the customer, it's personal data IMHO. If you operate in
| Europe, the user should be able to retrieve/delete the data, and
| request it changed if they say it's wrong.
| nousermane wrote:
| While on the topic of third-party cookies - is there any
| legitimate use for those at all (outside of semi-covert user
| tracking)?
|
| I understand how first-party cookies are useful - you take a
| stateless protocol (HTTP) and make it aware of "sessions". And
| those in turn are a nifty block to build upon -
| login/authentication, "shopping cart", whatever...
|
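| (Concretely, that first-party mechanism is just something like
| this server-side; a minimal Python 3.8+ sketch using only the
| standard library:)
|
|       from http.cookies import SimpleCookie
|
|       # Server issues a session id scoped to its own site.
|       c = SimpleCookie()
|       c["session"] = "abc123"
|       c["session"]["httponly"] = True
|       c["session"]["samesite"] = "Lax"
|       print(c.output())
|       # Set-Cookie: session=abc123; HttpOnly; SameSite=Lax
|       # The browser then replays "Cookie: session=abc123" on
|       # every later request to the same site.
|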
| But letting one website save state that is only accessible
| to a chosen different site - what's the use for that?
| acdha wrote:
| Things like single sign-on or social networking where you toss
| some JavaScript from api.example.com on your page and it does
| things like automatically log you in or display messages you
| might have.
|
| I see this as a tragedy of the commons problem: it's kind of
| nice to have, say, a counter of unread Disqus messages but the
| relative value of that compared to the use by tracking
| companies is hard to ignore.
| tootie wrote:
| Oauth and SAML use callbacks, not cookies.
| acdha wrote:
| I didn't specify those protocols for a reason. While that's
| technically true at the API level, I was referring to
| things like the "Sign-On with <service>" widgets - you can
| make a completely static version of that which doesn't use
| cookies but there is a nice UI improvement if the button
| can load and say things like "Login as @tootie" or
| "@tootie, you have 5 DMs" anywhere you see it.
|
| Instead, I think we're going to recognize that this is too
| broad to be secured and either come up with ways to scope
| it down (e.g. requiring the third party to show some sort
| of opt-in prompt) or see that entire market category
| replaced with browser-controlled alternatives, which isn't
| great for companies other than Apple, Google, and maybe
| Microsoft, but does have the appeal of not requiring trust
| in an entity the user isn't already trusting.
| iooi wrote:
| That's if you're implementing SSO on your own, which most
| folks don't do. Most SaaS using SSO in their apps will use
| Auth0 or Okta. Dropping third-party cookies has
| implications for these integrations:
|
| https://support.okta.com/help/s/article/FAQ-How-Blocking-
| Thi...
| mfer wrote:
| Part of this is about the middle men. The NY Times cut off ad
| exchanges in EU and found it didn't kill their ad business [1].
| One thing it did do was cut out the middle companies doing the
| brokering. Google has made A LOT of money just being a middle
| company. Like a car dealership.
|
| The proposed system deals with large groups and machine learning.
| It requires a browser add-on or changes to the browser. This is
| not approachable for startups, small businesses, or those who are
| independent. It's targeted at Google and further helps solidify
| their position.
|
| As people try to cut Google off from constantly monitoring them,
| Google is looking for ways to work around being cut off and keep
| the data flowing, branding and marketing that work to make
| people want it.
|
| [1] https://digiday.com/media/gumgumtest-new-york-times-gdpr-
| cut...
| wombatpm wrote:
| >Part of this is about the middle men. The NY Times cut off ad
| exchanges in EU and found it didn't kill their ad business [1].
| One thing it did do was cut out the middle companies doing the
| brokering. Google has made A LOT of money just being a middle
| company. Like a car dealership.
|
| So NYT rediscovers publishing? Once upon a time, publishers had
| teams of salespeople who had relationships with companies
| needing to advertise and coordinated their ads with the
| publishing schedule. Publishers then gave that advantage away
| to sell ads for a fraction of their current price on a per-view
| basis and have been crying ever since.
| mfer wrote:
| Publishers have been seeing their ad revenue decrease. This
| is why they complain.
|
| Companies like Google make billions on ads. An overwhelming
| majority of Google's income is from ads. It isn't just that
| Google served new markets or that publishers outsourced the
| work.
|
| Ad systems use a bidding system. Google controls both sides
| of the bidding system. The system is setup in a way where
| Google has benefited more than others.
|
| It reminds me of record labels and producers. They make the
| lion's share of the money on record sales for most albums and
| music. The artists typically get a small share. Some artists
| walk away with a middling income while selling millions of
| albums, with the label/producers making millions.
|
| The lower income to publishers has caused them to run more
| shock-and-awe type articles that aren't good for us. They hire
| more inexperienced people and pay them less, so there is less
| mentoring. Overall it means the quality of the published stuff
| has gone downhill.
| Medicineguy wrote:
| The problem is not the option to place cookies per se. The issue
| is its misuse which aims to de-anonymize users (in order to place
| ads). I don't see how saving the user data somewhere else (in a
| browser add-on or in the browser natively) is helping here.
|
| EDIT: The official description [https://github.com/WICG/floc]
| does a better job of explaining the point. They try to cluster
| (="cohort") users' interests and exchange that with the ad
| service. This could maybe help increase transparency and
| authority over your data, as it's saved locally. But I don't see
| a way to limit access to a user's cohorts (they even say that
| themselves, see link above). Everybody could access my interests
| - not just Google and other ad services. And of course, if you
| have 1000 categories and some meta information (region based on
| IP address etc.), you will be able to track down individual
| users with pretty good accuracy.
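|
| Rough arithmetic on that last point, with every number
| invented purely for illustration:
|
|       users = 3_000_000_000        # browsers on the web
|       cohorts = 1_000              # interest categories
|       anon_set = users / cohorts   # ~3M share your cohort
|       anon_set /= 10_000           # IP narrows it to a city
|       anon_set /= 50               # OS + browser version
|       print(int(anon_set))         # 6 -- a handful left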
| cestith wrote:
| Rather than giving the advertiser a list of my interests, it'd
| be nice if the advertiser gave me a list of keywords for the
| ads it might show next and my browser requests the ad for me. A
| default browser could then be configured to learn with a thumbs
| up / thumbs down / never show me again type of Bayesian
| training. Or a non-mainstream browser could request random ads.
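|
| A toy sketch of the kind of on-device learning I mean (purely
| hypothetical; nothing any browser ships today):
|
|       from collections import Counter
|
|       up, down = Counter(), Counter()
|
|       def feedback(keywords, liked):
|           # Record a thumbs up/down against each keyword.
|           (up if liked else down).update(keywords)
|
|       def score(keywords):
|           # Laplace-smoothed "liked" ratio per keyword.
|           return sum((up[k] + 1) / (up[k] + down[k] + 2)
|                      for k in keywords) / len(keywords)
|
|       feedback(["camping", "tents"], liked=True)
|       feedback(["crypto", "trading"], liked=False)
|       ads = {"Tent sale": ["camping", "tents"],
|              "Buy coins": ["crypto", "trading"]}
|       print(max(ads, key=lambda a: score(ads[a])))  # Tent sale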
| doytch wrote:
| But most people would never up/down the ad, which means the
| ad would be targeted more randomly, which means it wouldn't
| be as effective, which means the website/content owner
| wouldn't get as much money for displaying it.
|
| I don't think that solution works in the current environment,
| unfortunately.
| bluesign wrote:
| If I clicked the ad, thumbs up, if not thumbs down. With
| appropriate weights this can work.
|
| But... ad networks will never implement this, because the
| priority order there is:
|
| 1) ad network 2) advertiser 3) publisher 4) user
|
| This bumps user from 4th place to 1st place
| mike_d wrote:
| They would never implement it because there is no valid
| signal here. The vast majority of users would just thumbs-
| down every ad they see because they believe that will
| result in fewer ads.
|
| Go check out the messaging around Ad Choices and how
| poorly it ended up working.
| jahewson wrote:
| > if you have 1000 categories and some meta information (region
| based on IP address etc.), you will be able to track down
| individual users with pretty good accuracy.
|
| Looking at the corresponding TURTLEDOVE proposal, it's sending
| only a handful of the known categories to any given ad network
| at any given time. Floc also claims that:
|
| > The collection of cohorts will be analyzed to ensure that
| cohorts are of sufficient size
| btown wrote:
| Browser fingerprinting is already pretty good if you can run
| arbitrary JS on a site. Add access to a FLOC, even a FLOC with
| 10k people, and you're basically at a place that's _worse_ than
| third-party cookies were, because at least third-party cookies
| could be blocked. Ad networks are already using fingerprinting
| and this will be seen as a blessing to them.
| alkonaut wrote:
| If browsers would drop some edge-case capabilities, such as
| rendering to canvas and reading the data back, it would be
| much more difficult. Browser JS environments just expose way,
| way too much entropy from the user's system.
| cpeterso wrote:
| Plus, if each cohort is a "group of [merely] thousands of
| people [among the worldwide internet population]", the
| advertiser could probably narrow your identity pretty well
| using passive fingerprinting of cohort(s) + IP address + Chrome
| version + OS + OS version and maybe HTTP headers for language,
| locale, and time zone, though those are probably strongly
| correlated with the client IP address.
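|
| The back-of-the-envelope version of that, with entropy values
| guessed for illustration (in the spirit of EFF's Panopticlick,
| not measured):
|
|       import math
|       bits = {
|           "cohort of ~5k people": math.log2(3e9 / 5e3),
|           "IP-derived region": 12,
|           "browser + OS version": 6,
|           "language + time zone": 4,
|       }
|       # Ignoring correlations, which would lower the total.
|       print(round(sum(bits.values()), 1))
|       # ~41 bits; ~33 bits suffice to single out one person
|       # among 8 billion.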
| jefftk wrote:
| Combining those would definitely be a problem.
| https://www.chromium.org/Home/chromium-privacy/privacy-
| sandb... describes removing/limiting those fingerprinting
| vectors, including IP.
|
| (Disclosure: I work for Google, speaking only for myself.)
| fumar wrote:
| > Browsers would need a way to form clusters that are both
| useful and private
|
| > The browser uses machine learning algorithms to develop a
| cohort based on the sites that an individual visits.
|
| How would FLoC audience targeting work in non-Chrome
| browsers? DV360 users deliver ads on all browsers, no?
| jahewson wrote:
| According to the specs, the requests are made without user
| agent headers, leaving only IP address. Targeting ads based
| on IP address isn't particularly valuable to ad networks if
| they can't correlate it with anything other than the
| sandboxed cohort data.
| mike_d wrote:
| If you give me a demographic group (age, sex, income, etc)
| of a thousand people, and give me the IP address I can
| uniquely identify the individual within that group using
| outside data sources like Experian.
| jefftk wrote:
| _> and give me the IP address_
|
| The Chrome proposal is that it won't:
| https://github.com/bslassey/ip-blindness
| mike_d wrote:
| What insane ramblings is this? Every site will be forced
| to use an approved CDN? Adding forced MitM to every
| connection is the opposite of what we should be trying to
| implement.
| jefftk wrote:
| If you want to prevent fingerprinting, you need to look
| at where the identifying bits are coming from. (ex:
| https://coveryourtracks.eff.org/) The IP address provides
| enough bits to uniquely identify many users, and when
| combined with just a few more bits, to identify almost
| anyone.
|
| TOR is one solution here, which you could potentially
| also describe as "adding forced MitM to every
| connection". The proposals in
| https://github.com/bslassey/ip-
| blindness/blob/master/near_pa... and
| https://github.com/bslassey/ip-
| blindness/blob/master/willful... have different tradeoffs
| than TOR, with the "TOR is painfully slow" problem being
| a big one.
|
| If you have better ideas, though, I would be very
| interested in reading them!
| 74B5 wrote:
| From the GitHub page:
|
| > Browsers would need a way to form clusters that are both
| useful and private
|
| > The browser uses machine learning algorithms to develop a
| cohort based on the sites that an individual visits.
|
| To me it sounds like just another layer of indirection with
| Google right in the center of it. Even if this method works
| well enough from an advertising perspective, I expect there
| will soon be adversarial models to deanonymize users.
___________________________________________________________________
(page generated 2021-01-25 23:01 UTC)