We will find you: How a flawed algorithm terrorized welfare recipients [1]

By Mark Frauenfelder / June 25, 2025

An automated welfare fraud detection system in Australia falsely accused hundreds of thousands of innocent people of theft, leading to multiple suicides and costing taxpayers $1.8 billion in settlements.

As reported in Don Moynihan's analysis, from 2016 to 2020 the "Robodebt" system used faulty algorithms to identify supposed welfare overpayments. Many vulnerable Australians received notices demanding thousands of dollars in repayment, with little explanation or recourse.
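The core flaw, as later coverage of the scheme documented, was income averaging: the system divided a person's annual income, as reported to the tax office, evenly across the year's 26 fortnights, then treated any gap between that smoothed figure and the fortnightly income the person had reported to the welfare agency as an overpayment. The Python sketch below is a minimal illustration of that logic, not the government's actual code; the worker, the dollar figures, and the function names are all hypothetical.

    FORTNIGHTS_PER_YEAR = 26

    def averaged_income(annual_income: float) -> float:
        """The flawed step: smear a year's income evenly across all 26
        fortnights, as if earnings had been uniform all year."""
        return annual_income / FORTNIGHTS_PER_YEAR

    # Hypothetical casual worker: earned $2,000 per fortnight for the
    # first half of the year, then lost the job and correctly reported
    # $0 income while receiving benefits for the second half.
    actual_fortnightly = [2000.0] * 13 + [0.0] * 13
    annual_total = sum(actual_fortnightly)   # $26,000 for the year

    smeared = averaged_income(annual_total)  # $1,000 per fortnight

    # The matching logic compares the smeared figure against what the
    # person reported in each fortnight on benefits, and reads any gap
    # as evidence of under-reported income.
    for fortnight, reported in enumerate(actual_fortnightly[13:], start=14):
        discrepancy = smeared - reported     # $1,000 of phantom income
        print(f"Fortnight {fortnight}: reported ${reported:,.0f}, "
              f"averaged ${smeared:,.0f} -> flagged ${discrepancy:,.0f}")

Run against this worker's record, the loop flags $1,000 of phantom income in every one of the 13 fortnights they were legitimately on benefits, even though they reported every dollar they earned. That is the shape of the error that turned accurate records into demands for thousands of dollars in repayment.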
The burden of proof was placed on welfare recipients, who had to navigate a user-hostile online system to contest claims. Those directed to the website found a document, written in legalistic language, that reiterated the notice letter and provided more information about the policy. They were asked to scroll through it in the browser and click "Next" at the bottom of the screen for additional steps. Many clicked "Next" without carefully reading the entire document, eager to learn what options they might have for contesting the charge. But at the end of the screen was text explaining that by clicking "Next" the person admitted to the overpayment and consented to a repayment plan. Those who were confused or anxious, or who did not understand the agreement, and clicked "Next" were then unable to contest the debt. In other words, people who read online explanations the way people typically do, skimming quickly and moving on, inadvertently admitted guilt.

The program's aggressive tone was exemplified by Human Services Minister Alan Tudge, who declared on television: "We will find you, we will track you down, and you will have to repay those debts, and you may end up in prison."

Panicked recipients of the notices took out high-interest loans, sold possessions, and faced aggressive debt collectors. The psychological impact included "stress, trauma, depression, suicidal ideation, and in at least two cases, were associated with suicide," Moynihan writes.

Despite early warnings from frontline staff and advocacy groups, government officials pressed ahead, focused on projected savings rather than accuracy. A caseworker described the moral burden of enforcing the system: "doing myself damage by continuing to work within an unfair system of oppression that I thought was designed to get people rather than support them, which was continuing to injure people every day, and which transgressed my own moral and ethical values."

The politicians who cooked up the ill-fated program shifted the blame onto the public servants who were forced to use it: after the scandal could no longer be ignored, senior public servants were reprimanded and referred for disciplinary action by the Australian Public Service Commission for failing to convey the risks of the scheme to their political superiors.

Moynihan concludes: "The public really, really does not like governments using automated systems to punish them."

Previously:
• Algorithms in college admissions could make discrimination worse
• Fired by an algorithm, and no one can figure out why
• Should I use an algorithm here? EFF's 5-point checklist

---

[1] https://boingboing.net/2025/06/25/we-will-find-you-how-a-flawed-algorithm-terrorized-welfare-recipients.html