Post AXtQ47FKw4RThQd7bc by NovemberMan@mastodon.sdf.org
(DIR) Post #AXrbsiekpZU1NovnIe by meredithw@wandering.shop
2023-07-19T15:41:53Z
2 likes, 4 repeats
An MIT student asked AI to make her headshot more ‘professional.’ It gave her lighter skin and blue eyes.
“I was like, ‘Wow, does this thing think I should become white to become more professional?’” said Wang, who is Asian American.
@Wolven have you seen this?
https://www.bostonglobe.com/2023/07/19/business/an-mit-student-asked-ai-make-her-headshot-more-professional-it-gave-her-lighter-skin-blue-eyes/
(DIR) Post #AXrbsjTRn625v2eHJY by freemo@qoto.org
2023-07-19T20:46:13Z
0 likes, 0 repeats
@meredithw It means she now looks closer to what the "average" professional photo looks like. @Wolven
(DIR) Post #AXriCi5fFCQT91QwqW by moffintosh@berserker.town
2023-07-19T21:57:06Z
0 likes, 0 repeats
@meredithw @Wolven The AI wasn't fed enough Asian people with the "professional" tag during training
(DIR) Post #AXrxs4GHDQxh5nZnZg by theothersimo@mastodon.social
2023-07-20T00:52:37Z
0 likes, 0 repeats
@freemo @meredithw @Wolven and when HR asks for resumes of qualified professional applicants you think the same thing won’t happen? At best, you’ll get people whose resumes superficially look just like those of current employees, most likely it will reject every non-Anglo-Saxon name.
(DIR) Post #AXry6skqHSs3kQwAfQ by freemo@qoto.org
2023-07-20T00:55:18Z
0 likes, 0 repeats
@theothersimo Huh? What do you mean by "same thing might happen"... how would HR turn a person's picture into a new picture? I think you're talking about racial discrimination, sure, and there is plenty of that, but how is that the same as what the AI did?
@meredithw @Wolven
(DIR) Post #AXrz5qnMBEtNKrBYNk by theothersimo@mastodon.social
2023-07-20T01:06:20Z
0 likes, 0 repeats
@freemo @meredithw @Wolven doing with resumes what this AI did with pictures. Do I really need to explain it to you?
(DIR) Post #AXrz9pMdRVHS3VaCWW by freemo@qoto.org
2023-07-20T01:07:01Z
0 likes, 0 repeats
@theothersimo We both seem to agree on the racism element, so no need to be rude. But yes, I don't really follow, explaining would be appreciated. But you don't have to, let's keep it civil thanks.
@meredithw @Wolven
(DIR) Post #AXrzWX4DdX9ry7nNIW by theothersimo@mastodon.social
2023-07-20T01:11:09Z
0 likes, 0 repeats
@freemo @meredithw @Wolven either you can see why systems that score [stereotypically white things] as objectively more professional than [stereotypically Asian things] will make discrimination worse, or you don’t. I don’t think anything I can say will help you understand.
(DIR) Post #AXrzbz60qxQWLZEa1I by freemo@qoto.org
2023-07-20T01:12:07Z
0 likes, 0 repeats
@theothersimo Actually, what you just said now explained it, thank you... Again, could have done without the disgraceful attitude, but thanks anyway, I understand what you meant now.
@meredithw @Wolven
(DIR) Post #AXtHUDoAGLLnzZIHOi by blogdiva@mastodon.social
2023-07-19T20:50:05Z
0 likes, 0 repeats
@meredithw @Wolven if cryptofascist libertarians, Musk-and-or-your-billionaire-loving wankers, embarrassed millionaires, billionaires-in-waiting, love-the-police-state-for-thee-not-for-me fuckwads are developing the software then that's what you get
(DIR) Post #AXtHUEYxSMmUKhBeKm by Denian@chaos.social
2023-07-19T23:43:14Z
0 likes, 0 repeats
@blogdiva @meredithw @Wolven It's not a software issue, it's a training data issue. There's a difference. Nobody actively "programs" an AI to be racist, but people rarely, if ever, remember to actively escape the demographic distribution they see every day when they look for training data. And in some cases, it's actually impossible, because genetic outliers like albinos, for example, just ARE too rare to be well-represented in training image sets.
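Denian's training-data point can be illustrated with a toy sketch. The dataset, group names, and 90/10 split below are invented for illustration only, not from any real model: nothing in the code "prefers" one group, yet a system that drifts toward the statistically dominant example for a tag inherits whatever skew the scraped data carried.

```python
from collections import Counter

# Hypothetical stand-in for a scraped training set: (demographic, tag) pairs.
# The imbalance lives entirely in the data, not in any line of logic.
training_set = (
    [("group_a", "professional")] * 90
    + [("group_b", "professional")] * 10
)

def modal_example(dataset, tag):
    """Return the demographic most often co-occurring with a given tag."""
    counts = Counter(demo for demo, t in dataset if t == tag)
    return counts.most_common(1)[0][0]

# A model asked to make an image "more professional" is pulled toward
# the dominant example for that tag in its training data.
print(modal_example(training_set, "professional"))  # → group_a
```

Rebalancing would mean changing `training_set`, not the counting logic, which is exactly the distinction being argued here.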
(DIR) Post #AXtHUFald7p3Wb2bnE by blogdiva@mastodon.social
2023-07-20T00:10:19Z
0 likes, 0 repeats
@Denian @meredithw @Wolven "Nobody actively "programs" an AI to be racist"
we understand that. what silver-spooned techbros don't is that it's not enough to be not-racist. you have to be ANTI-RACIST. but that's antithetical to who they are. they do insist they are superior. even their not-so-unmelanined cannon fodder coming from southern hemisphere cultures metastasized by castes & caciquismos, they believe their adjacency to the unmelanined makes them our betters. and that oozes into the code
(DIR) Post #AXtHUGGb7bHbcKc0zg by Denian@chaos.social
2023-07-20T00:24:36Z
0 likes, 0 repeats
@blogdiva @meredithw @Wolven Again - "the code" has nothing to do with it. It's all in the training data. And you can attempt to balance that as much as you want, nobody's perfect. Even you would forget SOME group, or forget to get everybody from enough angles, on bad hair days, or in front of orange backgrounds... The perfect dataset just doesn't exist. And it's worse because balancing them is a VERY new type of job - everybody's still learning.
(DIR) Post #AXtHUGQsVNVS8DQE1A by Denian@chaos.social
2023-07-19T23:43:43Z
0 likes, 0 repeats
@blogdiva @meredithw @Wolven Long story short, you don't have to be an asshole to do things wrong.
(DIR) Post #AXtHUGwQc4k9i4BQC8 by blogdiva@mastodon.social
2023-07-20T00:33:54Z
0 likes, 0 repeats
@Denian @meredithw @Wolven code is a human language. coding is a creative human process. it isn't neutral in its syntaxes; even less in its ontologies. J Simon, J Klima, M Napier all are coders (java, c#, c++) and part of the 1990s-2000s netart movement. all had/have day jobs on fintech, adtech. they played with stalkerware & ai models before it was cool. find what they've said in interviews or written about the aesthetics & subjectivity of code. whitney museum had a whole show about it
(DIR) Post #AXtHUHidipJA7ajvLE by Denian@chaos.social
2023-07-20T00:42:00Z
0 likes, 0 repeats
@blogdiva And yet form and function are separate, even more so in Machine learning. Even if every variable or function name the developers used was a racist slur, "the algorithms" still wouldn't know how to differentiate skin colors. They break input data down to beautifully abstract mathematics without meaning, match inputs to outputs statistically, extrapolate, and know nothing. You can add filters before or after, but not IN the AI because it's too abstract.
(DIR) Post #AXtHUIOTDIliDKJKXg by SallyStrange@strangeobject.space
2023-07-20T10:57:42Z
0 likes, 0 repeats
@Denian @blogdiva omg no you cannot write code without encoding human biases into it. Stop pretending like that's possible. There will never be a time when a human writes code so perfect that the only possible issue is the future input
(DIR) Post #AXtHUJ1SsJxcAGYTK4 by Denian@chaos.social
2023-07-20T11:43:19Z
1 likes, 0 repeats
@SallyStrange @blogdiva I invite you to actually learn at least the basics of machine learning - there's a great free course on Udemy, for example - before saying something like that. It's mathematics in one of its purest forms, if not THE purest form. Outside of an objective function to grade how similar a generated result is to the expected result during training, everything is abstract as hell - it's NOT POSSIBLE to inject biases into it.
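The "objective function" Denian refers to can be sketched with mean squared error, one common choice for grading generated output against a target during training (the pixel values below are made up). The point being made in the thread is that the function only compares numbers; any demographic skew enters through which targets the data supplies, not through the arithmetic.

```python
def mse(generated, target):
    """Mean squared error between two flattened images.

    Sees only numbers: it has no concept of what the pixels depict.
    """
    assert len(generated) == len(target)
    return sum((g - t) ** 2 for g, t in zip(generated, target)) / len(generated)

# Two tiny "images" as raw pixel intensities (illustrative values).
out = [0.2, 0.5, 0.9]
ref = [0.2, 0.4, 1.0]
score = mse(out, ref)  # lower means "closer to the training target"
```

Training minimizes this score against the training targets, so the model converges on whatever those targets look like, which is where the two sides of this argument actually meet.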
(DIR) Post #AXtHUKxHgpny9scA5I by Denian@chaos.social
2023-07-20T11:46:08Z
0 likes, 0 repeats
@SallyStrange @blogdiva And before you go and say that it must be the objective function then - again, please go and learn how those actually WORK. Your prejudices are helping nobody here.
(DIR) Post #AXtHUNuCik7jJCCFCS by Denian@chaos.social
2023-07-20T12:16:48Z
0 likes, 0 repeats
@SallyStrange @blogdiva Sorry for the misinformation - it's Coursera, not Udemy, and they have replaced the old single course with a three-part course series by the same person. I still recommend it: https://www.coursera.org/specializations/machine-learning-introduction
(DIR) Post #AXtHXSwcRUXnnznMyu by SallyStrange@strangeobject.space
2023-07-20T12:48:40Z
0 likes, 0 repeats
@Denian @blogdiva You're a real piece of work
(DIR) Post #AXtHXThlcCG4ADr1TE by Denian@chaos.social
2023-07-20T13:08:53Z
0 likes, 0 repeats
@SallyStrange @blogdiva I can live with that if it stops even one person from spouting bullshit about a topic they know nothing about.
(DIR) Post #AXtHXVuFQlQizVi1Ca by SallyStrange@strangeobject.space
2023-07-20T13:11:54Z
0 likes, 0 repeats
@Denian @blogdiva I know about humans and biases. Clearly you do not.
(DIR) Post #AXtHXWdGjNRVF8lyNM by Denian@chaos.social
2023-07-20T13:12:20Z
1 likes, 0 repeats
@SallyStrange @blogdiva Yes. You're showing A LOT of bias.
(DIR) Post #AXtHjxABunIX0E68au by Moon@shitposter.club
2023-07-20T16:09:59.045790Z
1 likes, 0 repeats
@meredithw > does this thing think
no.
(DIR) Post #AXtHz92e2AyuBboPc8 by thatguyoverthere@shitposter.club
2023-07-20T16:12:45.890688Z
0 likes, 0 repeats
@Moon @meredithw the ai generated photo looks like someone I used to work with. weird
(DIR) Post #AXtQ47FKw4RThQd7bc by NovemberMan@mastodon.sdf.org
2023-07-19T18:31:12Z
0 likes, 0 repeats
@meredithw @Wolven this is not ok.
(DIR) Post #AXtQ48Iv0EtwypJUpM by Nesano@detroitriotcity.com
2023-07-20T17:43:17.907119Z
1 likes, 1 repeats
@NovemberMan @meredithw @Wolven It's perfectly fine and you're a faggot.
(DIR) Post #AXtRtz2gmbFnGf0i48 by mmu_man@m.g3l.org
2023-07-20T18:03:48Z
0 likes, 0 repeats
@meredithw @Wolven AI… which should also be called Mass Bias Reproduction Weapons.
(DIR) Post #AXtTLohKw7lpFPrJ5s by katrinatransfem@mastodon.social
2023-07-20T18:20:04Z
0 likes, 0 repeats
@freemo @theothersimo @meredithw @Wolven They will ask the AI if the person looks professional, and it will only say yes to white people.
(DIR) Post #AXtTkt4RodcI36chrk by freemo@qoto.org
2023-07-20T18:24:33Z
0 likes, 0 repeats
@katrinatransfem Well, a few things.
There is a big difference between asking if someone is within the range of professionalism vs asking to make an image more professional. When asked to make an image more professional it will naturally make the image as close to the "average" image as possible; this means making it white.
This isn't the same as the AI thinking white is a requirement for professionalism, as there may very well be outliers that are non-white and still considered professional.
What it boils down to is that when changing the image it had no idea that clothing and hairstyles (things we can change) should be preferred over eye color or skin color, so it didn't bother to prioritize.
That said, if they were to create an AI specifically designed to check professionalism, one would hope it would design in the proper considerations that account for these aspects; it wouldn't be hard to do.
@theothersimo @meredithw @Wolven
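freemo's "move toward the average image" claim can be sketched as nudging a feature vector toward the centroid of the tagged training examples. The two features, the example vectors, and the `strength` parameter below are all hypothetical; the sketch only shows that a centroid-seeking edit treats changeable and unchangeable attributes identically.

```python
def mean_vector(vectors):
    """Component-wise mean of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nudge_toward(point, target, strength=0.5):
    """Move a feature vector partway toward a target vector."""
    return [p + strength * (t - p) for p, t in zip(point, target)]

# Hypothetical features: index 0 = attire formality (changeable),
# index 1 = skin tone (not). If the tagged set skews one way on
# feature 1, its centroid does too.
professional_examples = [[0.9, 0.1], [0.8, 0.2], [0.95, 0.15]]
centroid = mean_vector(professional_examples)

input_photo = [0.3, 0.8]
edited = nudge_toward(input_photo, centroid)
# Both features drift toward the centroid: nothing tells the edit that
# skin tone should be off-limits while attire is fair game.
```

Designing in the "proper considerations" freemo mentions would amount to constraining which feature dimensions the nudge is allowed to touch.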
(DIR) Post #AXtU0jMpf1oGaUmFrE by katrinatransfem@mastodon.social
2023-07-20T18:27:29Z
0 likes, 0 repeats
@freemo @theothersimo @meredithw @Wolven You might hope that, but there are already AIs that assess for professionalism, and they don't design in these proper considerations.
(DIR) Post #AXtU4qFtzamb6qUoxU by freemo@qoto.org
2023-07-20T18:28:12Z
0 likes, 0 repeats
@katrinatransfem Then that is quite concerning. Can you point me to some links on AI specifically designed for detecting professionalism, and any reference to how they don't account for this?
@theothersimo @meredithw @Wolven
(DIR) Post #AXtUGoPoexb2sSqDQW by katrinatransfem@mastodon.social
2023-07-20T18:30:20Z
0 likes, 1 repeats
@freemo @theothersimo @meredithw @Wolven
https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G
https://www.euronews.com/2023/05/29/if-were-not-careful-ai-recruitment-could-institutionalise-discrimination
https://www.forbes.com/sites/madelinehalpert/2022/10/09/ai-powered-job-recruitment-tools-may-not-improve-hiring-diversity-experts-argue/
https://www.newamerica.org/oti/blog/ai-discrimination-in-hiring-and-what-we-can-do-about-it/
And there's a *lot* more where these came from.
(DIR) Post #AXtUSaHZ22kUWGavui by freemo@qoto.org
2023-07-20T18:32:28Z
0 likes, 0 repeats
@katrinatransfem Thanks so much, I will review.