Post AlDocLGrw2AgI0Fz9M by TealDeer@www.minds.com
2024-08-22T08:54:02+00:00
1 likes, 0 repeats
I have 200 products.
100 of product A
100 of product B
On X day I sell 20 of product A
On X day I sell 20 of product B
I take inventory of my products.
I expect to have 80 of product A
I expect to have 80 of product B
I find I have 80 of product A
I find I have 60 of product B
20 units of product B are unaccounted for.
The first inventory of 100 units was correct. There have been no breakages, accidents, fires, etc. that could account for the loss of 20 units of product B.
The conclusion is that 20 units of product B have been stolen.
If I have any ability to reduce theft of my inventory, I should apply that measure to product B, because product B has shown a higher likelihood of being stolen.
This is so simple that a basic computer algorithm can do it.
The "problem" arises when product B has been designated as something "belonging" to non-white people. If that's the case, you're now a racist for stopping them from stealing from you.
Things like this are why dangerously overpaid "experts" tell you computers are "racist" and "white supremacist" when they show you the plain facts of reality, like where crime happens and thus where crime-preventative measures should be implemented: tagging, surveillance, police patrols, etc.
We literally have to code impartial algorithms with racial bias to "correct for" the reality that crime is not evenly distributed.
---
Maybe they commit more crimes.
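The inventory arithmetic in the post can be sketched as a minimal reconciliation check. This is only an illustration of the numbers given above; the function name `shrinkage` and the dictionary layout are assumptions, not anything from the original post:

```python
# Minimal inventory-reconciliation sketch using the counts from the post.
# Expected stock = starting stock minus recorded sales; any shortfall
# beyond that is unexplained shrinkage.

def shrinkage(start, sold, counted):
    """Return the number of units unaccounted for after sales."""
    expected = start - sold
    return expected - counted

# The post's hypothetical counts:
# Product A: 100 start, 20 sold, 80 counted
# Product B: 100 start, 20 sold, 60 counted
inventory = {
    "A": {"start": 100, "sold": 20, "counted": 80},
    "B": {"start": 100, "sold": 20, "counted": 60},
}

missing = {name: shrinkage(**rec) for name, rec in inventory.items()}
print(missing)  # {'A': 0, 'B': 20}

# Flag the product with the largest unexplained loss for extra measures.
worst = max(missing, key=missing.get)
print(worst)  # B
```

Product A reconciles exactly, while product B shows 20 missing units, so any anti-theft measure would be directed at product B.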
Post #AlDocM6yoHr4tcdbNI by fal1026@poa.st
2024-08-22T10:09:56.539381Z
0 likes, 1 repeats
@TealDeer I'm imagining going to grade school again, and this is the exact lesson taught in non-shithole locations.