Post 9zMaLt200BjLSHdhTs by zzz@eldritch.cafe
(DIR) Post #9vh0y2ksEUz3ABUFfM by triz@skull.website
2020-06-02T23:20:48Z
5 likes, 5 repeats
just a note to the programmers on here: if your work involves ai, neural networks, or other expensive algorithms designed to replace human judgement, consider how much it's truly meant to be more equitable and/or efficient and whether it meets that goal. if it doesn't and you're just automating existing judgement at great expense, the purpose of including a neural network is to obfuscate culpability. this is not theoretical
(DIR) Post #9vh22L95tmRpJUQd1s by moonman@shitposter.club
2020-06-03T03:17:40.688432Z
1 likes, 0 repeats
@triz it frequently isn't automating human judgment at all, it's just finding patterns you would prefer to ignore. I never told TensorFlow "mimic the American brain and its biases"
(DIR) Post #9vh2Z1nCsELcqamGpc by ArdanianRight@freespeechextremist.com
2020-06-03T03:23:35.683162Z
1 likes, 0 repeats
@moonman @triz People who go on about bias in AI never seem to question their own biases.
(DIR) Post #9vh398tfwAtfJ21YGW by moonman@shitposter.club
2020-06-03T03:30:06.163781Z
3 likes, 0 repeats
@ArdanianRight @triz the main bias in ai is thinking your raw data is politically neutral. so yeah you can accidentally code human bias into a trained ai. but in practice I think these people are just mad that their sacred cows are getting skewered.
(DIR) Post #9vh7jUbV6nnQs16yau by clacke@libranet.de
2020-06-03T04:19:34Z
0 likes, 0 repeats
@moonman @triz I think we've seen a decent list of examples where the training data is based on human judgement and the machine learning reinforces it without a way to trace the reasoning.
(DIR) Post #9vh7jUq2ElQFb5uaFU by moonman@shitposter.club
2020-06-03T04:21:27.669241Z
0 likes, 0 repeats
@clacke @triz i mentioned this later
(DIR) Post #9vmcPfpI9wITKQQ45g by wigglytuffitout@elekk.xyz
2020-06-02T23:33:19Z
0 likes, 1 repeats
@triz also: if you are training it from human behavior and you haven't taken into account how that human behavior is flawed (such as being racist, with institutional racism issues like 'applicants with non-white names rejected more often' or 'redlining means no black people in the nice neighborhoods'), you're not making some big elegant solution. you're just fucking up with extra steps. which means you're bad at your job and your programming is shit and you should feel bad for being bad.
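[editor's note: the failure mode described above can be sketched in a few lines. This is a toy Python example with entirely made-up data and hypothetical names: a "model" that simply memorizes historical hiring decisions reproduces whatever bias those decisions contained, instead of correcting it.]

```python
from collections import defaultdict

# Hypothetical historical decisions: (name_group, qualified, hired).
# The applicants in groups "a" and "b" are equally qualified, but
# group "b" was hired less often in the historical record.
history = [
    ("a", True, True), ("a", True, True), ("a", True, True), ("a", False, False),
    ("b", True, False), ("b", True, True), ("b", True, False), ("b", False, False),
]

def train(rows):
    """'Learn' the historical hire rate for each (group, qualified) pair."""
    counts = defaultdict(lambda: [0, 0])  # key -> [times hired, total seen]
    for group, qualified, hired in rows:
        counts[(group, qualified)][0] += hired
        counts[(group, qualified)][1] += 1
    return {key: hired / total for key, (hired, total) in counts.items()}

model = train(history)

# Equally qualified applicants get different predicted hire rates,
# purely because the training labels were biased:
print(model[("a", True)])  # 1.0
print(model[("b", True)])  # ~0.33
```

Real ML models are more sophisticated than a lookup table, but the mechanism is the same: if the labels encode biased judgements, fitting them faithfully reproduces the bias.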
(DIR) Post #9vmcQLZovqsQobVIo4 by activationfxn@todon.nl
2020-06-02T23:35:45Z
0 likes, 1 repeats
@triz someone on here said a large amount of AI is just "intention laundering" for systemic racism and i'll never forget it
(DIR) Post #9wIbiI9GEGRS3phj1M by apLundell@octodon.social
2020-06-21T03:31:29Z
1 likes, 0 repeats
@triz Ah, neural networks: What if we invented a way to give computers a primitive sense of intuition and then deployed it extensively in situations where society has discovered that it's disastrous to make decisions by intuition?
(DIR) Post #9zMZYalL8UaCbeLFEe by hierarchon@inherently.digital
2020-06-03T10:11:51Z
1 likes, 0 repeats
@triz i actually used to work on, like, one of the few morally neutral applications for ML! i don’t want to say what it is, but it’s a domain where each problem generally has an objective answer, ML techniques blow non-ML techniques out of the water, and ML algorithms can solve the problem far faster than a human ever could with comparable accuracy.... i wound up quitting anyway because i was worried about what my project could be used for.
(DIR) Post #9zMaLt200BjLSHdhTs by zzz@eldritch.cafe
2020-06-03T14:51:14Z
0 likes, 0 repeats
@triz The problem is there are several notions of equitable that are themselves in conflict and impossible to reconcile without robust discussions involving non-computer scientists. https://www.youtube.com/watch?v=jIXIuYdnyyk I've made the focus of my ML work to enhance rather than replace human judgement, with the understanding that only humans are equipped to make the necessary tradeoffs, and even then not all humans.
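[editor's note: the conflict between notions of equitable can be made concrete with a toy numeric sketch. All numbers below are made up, and this illustration is the editor's, not the linked talk's: when two groups have different base rates of the outcome being predicted, no classifier can equalize both selection rates and error rates across the groups.]

```python
# Hypothetical base rates of the true outcome in two groups:
base = {"a": 0.6, "b": 0.3}

# A perfectly accurate classifier selects exactly the true positives,
# so its per-group selection rate equals the group's base rate:
selection_rate = dict(base)
print(selection_rate)  # {'a': 0.6, 'b': 0.3} -> demographic parity fails

# To force equal selection rates instead (say 0.45 in both groups),
# the classifier must skip some true positives in group "a" and select
# some true negatives in group "b" -- so now the ERROR rates differ:
target = 0.45
fnr_a = (base["a"] - target) / base["a"]        # false-negative rate in "a"
fpr_b = (target - base["b"]) / (1 - base["b"])  # false-positive rate in "b"
print(round(fnr_a, 2), round(fpr_b, 2))  # 0.25 0.21
```

Either fairness criterion can be satisfied alone; satisfying both at once is arithmetically impossible when base rates differ, which is exactly the kind of tradeoff that can't be settled by the engineer alone.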
(DIR) Post #9zMaLtAVUYXHsfcUk4 by publius@mastodon.sdf.org
2020-09-20T22:01:36Z
0 likes, 0 repeats
@zzz @triz I have believed for a long time that engineers need a background in the liberal arts, in order to ask questions like "to what end?" or "sez who?"
(DIR) Post #9zMarwMb9Fqf3ZYhxA by zzz@eldritch.cafe
2020-09-20T22:07:24Z
0 likes, 0 repeats
@publius @triz While this is true, I think many projects would also benefit from getting input from non-engineer stakeholders.
(DIR) Post #9zMbreuE9g5groiaYK by publius@mastodon.sdf.org
2020-09-20T22:18:35Z
0 likes, 0 repeats
@zzz @triz Undoubtedly. But that requires things like identifying who those folks are. Being able to put your work in context helps ease, & make sense of, the process of doing all that.
(DIR) Post #9zNonbmjJGliajZPai by zzz@eldritch.cafe
2020-09-21T12:18:11Z
0 likes, 0 repeats
@publius @triz I think you underestimate the very real incentives within an organisation for engineers and management to avoid ethical discussions. What you describe is what needs to happen, but the incentives will need to be enforced from outside the organisation.
(DIR) Post #9zOCKtKhNzbPW8IZ7I by publius@mastodon.sdf.org
2020-09-21T16:41:57Z
0 likes, 0 repeats
@zzz @triz No, I don't underestimate them. They are formidable. But I think that building up the engineers to think more about their role in society, to demand ethical conduct from the people with the money, is very important ― even if we do find a way to change our society so that managers & financiers aren't mostly actual or functional sociopaths.
(DIR) Post #9zOF0x7DucWhJtBBYW by zzz@eldritch.cafe
2020-09-21T17:11:55Z
0 likes, 0 repeats
@publius I want to shift the focus from something so centred on the engineers. Yes, we should demand more from them. But what I'm saying is: hey, before selling software to the police, maybe talk to community leaders about how they feel about that. The stakeholders won't always be people in your company. They won't always be the people with the money. There is a role to play for community organisations, and for asking that they be consulted before the city buys some tech.
(DIR) Post #9zOIWkmZbLqgCNNNC4 by zzz@eldritch.cafe
2020-09-21T17:19:36Z
0 likes, 0 repeats
@publius Putting the onus on engineers also has a paternalistic air, as it implies only engineers know how best to prevent a technology from harming people. I have fairly hard evidence that engineers are bad at anticipating the harm some technology can do, particularly if the people harmed don't look like them. That's why I recommend external organisations (which should include engineers) lead the charge, and legislation be used to make them official stakeholders.
(DIR) Post #9zOIWkvR4OwCdrWS0W by publius@mastodon.sdf.org
2020-09-21T17:51:18Z
0 likes, 0 repeats
@zzz That's not how I see it at all. Engineers have a tendency to think about the things they are creating, or preparing to create. Other people don't have as much reason to think about those things, or may not be aware of them at all. Even when they do, they usually have such poor or partial information that they can easily end up making judgements which are not congruent with reality. Thus, the onus does fall first & foremost on those who have some clear idea of what is going on.(cont'd)
(DIR) Post #9zOIgcIG1BEqlZtnuK by publius@mastodon.sdf.org
2020-09-21T17:53:06Z
0 likes, 0 repeats
@zzz Once we admit, which the engineer is usually quite willing to do, that he bears responsibility for what he brings into the world, everyone can agree that we will all be better off if he is equipped with better mental tools for approaching that part of his work. Everyone, that is, except the sociopaths ― which is a good indication that this is part of the work of freezing them out!
(DIR) Post #9zOJI7m5vza17Ojxho by zzz@eldritch.cafe
2020-09-21T17:59:52Z
0 likes, 0 repeats
@publius I don't oppose engineers having the tools to make sound ethical decisions. I'm saying that the knowledge necessary to do so will not always be available to the engineer, and again there are real incentives to keep engineers from making ethical decisions. Without systemic changes, it simply doesn't matter how good your engineer is. They will not prevail. They will not identify who their technology harms. They will not have the political power to tell their manager to fuck off.
(DIR) Post #9zOLLIA7yBL470cTjs by publius@mastodon.sdf.org
2020-09-21T18:22:51Z
0 likes, 0 repeats
@zzz What I am talking about is better equipping people for burdens they are already bearing. Obviously that is not enough, but without it, I don't expect anything else to work very well. "Telling managers to fuck off" is a big deal, but very rarely is your manager the person driving a project. People who control economic resources have decided, for whatever reason, that they want this. What external mechanisms such as you suggest are best for is dealing with that.