[HN Gopher] Sentenced by Algorithm
___________________________________________________________________
Sentenced by Algorithm
Author : prostoalex
Score : 25 points
Date : 2021-08-02 02:59 UTC (20 hours ago)
(HTM) web link (www.nybooks.com)
(TXT) w3m dump (www.nybooks.com)
| Nasrudith wrote:
| Really, my thoughts on the topic remain as always: stop letting
| the people in power pass the buck to the algorithm. That is the
| entire purpose of those algorithms as implemented in the real
| world, as opposed to how some pie-in-the-sky theorist imagines
| them: letting those in power escape responsibility.
|
| I will also note that algorithmic sentencing existed before
| computers, in the form of sentencing guidelines.
| vinsci wrote:
| Actually, the headline would be more technically correct if
| written as "sentenced by algorithm executing on a backdoored,
| remotely controlled computer", as that is the case today.
|
| Enjoy your dystopia of greed, fraud, and injustice.
| [deleted]
| onos wrote:
| It seems the appropriate way to judge this approach is by
| comparison with the baseline referenced at the start: humans
| making "black box" judgement calls that are subject to the
| individual judge's biases. If the computer programmed decision
| making were (already is?) transparent, then we could critique it
| and work to ensure something resembling fairness is encoded.
| ljm wrote:
| How would you define 'fair'? I've a few examples, gathered from
| being alive and hearing what other people think for a few
| decades.
|
| 1. It's fair to punish someone else if you let another off the
| hook
|
| 2. It's fair to treat someone more harshly if you don't like
| what they've done
|
| 3. It's fair to turn the other cheek because they're a friend
|
| 4. It's fair to treat someone differently because of where they
| came from or their skin colour
|
| 5. It's fair to give someone a pass because they did you a
| favour
|
| 6. It's fair to be unfair because you owe someone a favour
|
| 7. It's unfair to be a victim who doesn't get justice
|
| 8. It's unfair to be an innocent person who is prosecuted for
| someone else's crime
|
| 9. It's unfair to not get what you want
|
| 10. It's fair to get what you want
|
| You will find a variation of all of these examples across the
| world right now.
|
| The point is, fairness is loaded with bias and therefore any
| algorithm that tries to deal with 'fairness' is going to
| inherit the bias of those who designed it.
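|
| To make that concrete, here's a minimal, purely hypothetical
| sketch in Python (the factors and weights are invented for
| illustration, not taken from any real tool): even a completely
| transparent scoring function encodes its designers' choices in
| every factor and weight it uses.
|
|     # Hypothetical example only -- not any real sentencing tool.
|     # Every factor and weight below is a design choice; changing
|     # them changes who the "algorithm" treats harshly.
|     WEIGHTS = {
|         "prior_convictions": 2.0,   # why 2.0 rather than 0.5?
|         "age_under_25": 1.5,        # youth used as a proxy for risk
|         "stable_employment": -1.0,  # rewards a class-correlated factor
|     }
|
|     def risk_score(defendant: dict) -> float:
|         # Transparent weighted sum, but only as "fair" as the
|         # weights its authors chose.
|         return sum(WEIGHTS[k] * float(defendant.get(k, 0))
|                    for k in WEIGHTS)
|
|     # A young defendant with one prior scores 2.0 + 1.5 = 3.5.
|     print(risk_score({"prior_convictions": 1, "age_under_25": 1}))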
| nickthemagicman wrote:
| You make really good points. Life is bizarre and human
| culture is extremely contradictory and difficult to quantify.
|
| And what's worse, our definition of 'fair' may be advancing,
| just as culture is constantly advancing.
|
| African Americans may have been treated poorly by the
| algorithm back in the 1700s, for example, when they were
| considered less than a full person by the legal system. And
| maybe in the future, drug offenses will be considered not a
| big deal.
|
| They call the Constitution a living document.
|
| This algorithm may have to be a living algorithm.
|
| What if it were open source and constantly updated and
| reviewed to reach consensus?
|
| There would probably also have to be human appeal processes.
|
| I definitely think this is intriguing as a first-level
| sentencing determination though.
| ljm wrote:
| > African Americans may have been treated poorly by the
| algorithm back in the 1700s
|
| Fucking hell, man.
| geephroh wrote:
| "Computer programmed decision making" is anything but
| transparent currently. Just look at the example of Northpointe,
| Inc.'s COMPAS ("Correctional Offender Management Profiling for
| Alternative Sanctions") tool, which purports to measure
| recidivism risk. Even under threat of lawsuit, Northpointe
| refused to reveal the underlying source because it is a
| "protected trade secret."[1]
|
| Are judges biased? Of course, and that is not acceptable.
| However, the answer is not to surrender our system of justice
| to unaccountable commercial actors.
|
| 1. https://www.uclalawreview.org/injustice-ex-machina-
| predictiv...
___________________________________________________________________
(page generated 2021-08-02 23:00 UTC)