[HN Gopher] LinkedIn does not use European users' data for train...
___________________________________________________________________
LinkedIn does not use European users' data for training its AI
Author : robertclaus
Score : 26 points
  Date   : 2024-09-22 21:08 UTC (1 hour ago)
(HTM) web link (www.techradar.com)
(TXT) w3m dump (www.techradar.com)
| JSDevOps wrote:
  | How fucked in the head do you have to be to train ANY AI on
  | LinkedIn data?
| bqmjjx0kac wrote:
| I think it depends on what your objective is. Like if you want
| to simulate the LinkedIn experience, it's natural to train on
| LinkedIn data.
| artursapek wrote:
| maybe someone needs an AI that generates smug self-
| congratulatory word salad
| yarg wrote:
| No, but the ability to probabilistically detect it could be
| used to build useful filtering and prioritisation functions.
| yarg wrote:
| It could be used for all sorts of things: You
| could use it to indicate the fit between employees and
| companies; You could use it to detect lies and
| exaggerations in CVs; You could use it to estimate when
| employees are likely to be considering seeking new
| opportunities.
|
| There's significant potential for business relevant
| applications.
| bastawhiz wrote:
| I'm looking forward to getting fired because my employer
| thinks I'm considering new opportunities because they used
| LinkedIn AI
| chrsw wrote:
| Don't worry, people are hard at work making sure these AI
| systems are "aligned".
| yarg wrote:
| That's cool, but when decent companies think that
| legitimately useful people are considering resigning they
| often don't fire but instead offer a raise.
| mystified5016 wrote:
| Lol
| playingalong wrote:
| Well, not a general purpose one, but...
|
  | knowing LI doesn't have any NSFW content, is full of marketing
  | content (both corporate marketing and self-boasting
  | individuals), tends to prefer positive signals (all projects
  | succeed, etc.), is mostly in English, and so on...
|
| ... There is clearly a market for text generation in this
| language bubble. Think of all the internal and external
| communication in your $BigCorp. That has immediate use not only
| in marketing, but also HR, recruitment, company policing, etc.
| Your next company town hall can be prepared and led (!) by this
| thing.
  | bastawhiz wrote:
  | Half of LinkedIn posts are already written by ChatGPT; you
  | don't need a model trained on the output of another model.
| gedy wrote:
  | Training is one thing, but that feature to have AI help you
  | with your LI posts seems psychotic. Generic generated slop for
  | what, exactly?
| amarcheschi wrote:
| Yup, I noticed this a few days ago in some subreddit like
| /r/assholedesign, I think a few months ago we had a similar
| feature on instagram and perhaps fb, I don't know if it's still
| active in EU on those meta products
| slowmovintarget wrote:
| Which implies that elsewhere...
|
| There are some consumer protections that I really do wish we
| imported into the U.S., especially food safety and chemical
| usage. Too much regulatory capture for that, though.
| zaptrem wrote:
| Why do you care if someone trains an AI on content you have
| chosen to post publicly (LinkedIn profile/posts)? I'd
| understand if it was your DMs or something but this stuff is no
| secret.
| mystified5016 wrote:
| Yeah, everyone should be allowed to cut down trees on public
| property for firewood, or dump their trash in public parks!
|
| If they didn't want these resources to be exploited, they
| shouldn't make them publicly accessible!
| zaptrem wrote:
| Cutting down a tree implies the tree is no longer there and
| cannot be used by others. In this case, the content is
| still there, unaltered.
| hn_throwaway_99 wrote:
| This is probably the worst analogy I've read this year.
| ManBeardPc wrote:
  | Posting images, articles and other content doesn't grant
  | everyone the right to use it for every purpose. Especially
  | not to republish it partially under the excuse that a machine
  | is doing it. It's just not the same as someone getting
  | inspired by it or citing it.
|
  | Doing something automatically is a whole other matter from a
  | person doing it. Police watching a protest is fine; police
  | filming it or documenting all participants via face
  | recognition is forbidden (at least here).
| omoikane wrote:
| > elsewhere
|
  | In other places, LinkedIn silently opted users into AI
  | training. From a few days ago:
|
| https://news.ycombinator.com/item?id=41582951 - LinkedIn
| scraped user data for training before updating its terms of
| service
|
| https://news.ycombinator.com/item?id=41584929 - LinkedIn
| silently opts users into generative AI data scraping by default
  | ilrwbwrkhv wrote:
  | LinkedIn is the bottom of the barrel of the labour pool. I
  | wonder why anyone would even train on their data.
| Kim_Bruning wrote:
| Of course this then later leads to: "Linkedin AI has non-European
| bias"
|
| I'm of two minds.
| cyanydeez wrote:
  | Imagine an AI that's only allowed to mine the data of people
  | too stupid to elect representatives who protect their privacy.
| yazzku wrote:
  | Fuck LinkedIn. They should have been sued for their
  | exploitation of people's identities long ago.
| bastard_op wrote:
| Just imagine if we had the same privacy protections as the EU in
| the US.
___________________________________________________________________
(page generated 2024-09-22 23:00 UTC)