Posts by karadoc@aus.social
 (DIR) Post #AjOz3xyOC7ugBqhmHQ by karadoc@aus.social
       2024-06-28T23:44:34Z
       
       0 likes, 0 repeats
       
        @ZachWeinersmith Robots like that would be transformative; but unless there had already been an even greater transformation of our economic systems, that would certainly mean increasing inequality.

        Just imagine who is buying and using the AI services you've described. Will it be the people who are talking about cost of living pressures? Probably not. Probably it's a luxury for the rich, ultimately paid for by the labour of the poor who are still just trying to buy food and somewhere to live.

        I don't see how anything good can come from this until there is *major* economic reform.
       
 (DIR) Post #AjXBbM60HT1pHCdLSy by karadoc@aus.social
       2024-07-02T22:42:45Z
       
       0 likes, 0 repeats
       
        @ZachWeinersmith Well yeah. That's why people thought to try artificial neural networks in the first place - to roughly mimic the neural networks of our own brains.

        Of course, we shouldn't just expect magic to appear, though. Just like you can't get a functional human by just sticking a bunch of brains together, we probably can't get a functional AI by just sticking artificial neural networks together. (But that's why people are doing systematic experimentation with how layers interact, how to pre-process inputs, etc.)

        I'd personally find the whole thing extremely interesting and exciting if we weren't staring down the barrel of a dystopian nightmare. Money corrupts everything. :(
       
 (DIR) Post #AjioGiQs4YEDQE6QG8 by karadoc@aus.social
       2024-07-08T13:17:33Z
       
       0 likes, 0 repeats
       
        @protonprivacy I wouldn't expect any AI-focused company to move to homomorphic computing. They can profit from personal data - and so they're unlikely to spend extra resources to prevent themselves from collecting it. And from the users' side, even if told "it's encrypted", it's still difficult to know who can and cannot decrypt it anyway. The whole thing is complex and opaque.

        So I'd say that local computation is the only near-term solution for that kind of privacy.

        And for the *other* privacy problem, we seem to have no solution at all - i.e. the problem of personal data being used for training regardless of the source or the person's wishes. Leading to things like this:

        https://arstechnica.com/tech-policy/2024/07/ai-trains-on-kids-photos-even-when-parents-use-strict-privacy-settings/
       
 (DIR) Post #Arq9lhKjld79Xo6aBs by karadoc@aus.social
       2025-03-08T08:36:05Z
       
       0 likes, 0 repeats
       
       @Daojoan apparently the word "Luigi" also gets flagged, in any context.
       
 (DIR) Post #Az6zwh1bQvt0BwhEgq by karadoc@aus.social
       2025-10-11T21:20:27Z
       
       0 likes, 0 repeats
       
        So, another day, another leak of 70,000 people's government IDs - from Discord this time.

        It seems to me that websites shouldn't be *allowed* to collect personal information unless it is absolutely necessary (e.g. an address so that they can deliver a package). But instead we seem to be moving in the opposite direction, with governments around the world demanding that various websites collect ID for age verification. This is bad.

        https://arstechnica.com/security/2025/10/discord-says-hackers-stole-government-ids-of-70000-users/

        #it #security