Post 9lyjt8dTNAt6VMiTA0 by jerry@infosec.exchange
Post #9lxFwzqOoGmgc7j5LE by leip4Ier@infosec.exchange
2019-08-16T20:27:56Z
0 likes, 0 repeats
#gdpr requires a company to delete *all* data about a user upon request, right? so.. if a company is feeding all users' behavioral data into a neural network. does it have to machine unlearn what it learned based on data of the user who wants to delete their account?
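The question above is what the literature calls "machine unlearning". A minimal sketch of the only approach with a hard guarantee — exact unlearning, i.e. deleting the user's records and retraining from scratch — is below. All names and data are hypothetical, and the "model" is a deliberately trivial stand-in (a mean) for a real learned model.

```python
# Exact unlearning sketch: the only guaranteed way to remove a user's
# influence from a trained model is to retrain without their records.
# Hypothetical data; the "model" here is just a mean over a feature.

def train(records):
    """'Train' a toy model: the mean of a numeric feature over all records."""
    values = [r["value"] for r in records]
    return sum(values) / len(values)

def forget_user(records, user_id):
    """Erase one user's records, then retrain from scratch."""
    remaining = [r for r in records if r["user"] != user_id]
    return remaining, train(remaining)

records = [
    {"user": "alice", "value": 10.0},
    {"user": "bob",   "value": 20.0},
    {"user": "alice", "value": 60.0},
]

model = train(records)                     # 30.0 - influenced by alice's data
records, model = forget_user(records, "alice")
print(model)                               # 20.0 - alice's influence is gone
```

For a real neural network, retraining is expensive, which is exactly why approximate unlearning schemes exist — and why proving that a deployed model no longer "contains" a given user's data is so hard.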
Post #9lxM2bge8dz39bDsXY by b3cft@infosec.exchange
2019-08-16T21:36:10Z
0 likes, 0 repeats
@leip4Ier interesting point but I doubt it. The PII *should* be gone, but legislators have a hard enough time understanding basic tech, let alone the intricacies of ML. As you generally can't interrogate an ML model on how it actually works, I doubt you could prove it either way.
Post #9lxlpFghDX13y5XSMq by jerry@infosec.exchange
2019-08-17T02:25:04Z
0 likes, 0 repeats
@leip4Ier this is a problem I’ve worked on. The answer is, of course, “it depends”. If your personal data are anonymized, then no, your right to be forgotten doesn’t apply, as that’s not considered personal data any longer. (Note: EU Parliament seems oblivious to the limitations of anonymization, but much of the law highlights their collective lack of technical sophistication.) So, there are cases where your data SHOULD be deleted, but it’ll be hard to determine where and where not. 1/3
Post #9lxmFWWPFdhzhuFjzU by jerry@infosec.exchange
2019-08-17T02:29:49Z
0 likes, 0 repeats
@jerry @leip4Ier on the other hand, some machine learning systems will necessarily transform your data in ways that make it less and less specific to you (depending on the application, of course - some may, in fact, create MORE data about you through inferences). I think this is a blind spot the regulation leaves: as a data subject, you have a right to know what data a firm has about you, and then the right (with certain notable restrictions) to have that data deleted. But you are reliant 2/3
Post #9lxmRUz6n9YEPnO8kC by jerry@infosec.exchange
2019-08-17T02:31:58Z
0 likes, 0 repeats
@jerry @leip4Ier on that firm’s judgement of whether the neural net/ML/AI system contains your personal data or only anonymized personal data. Even an honest, diligent company could get that wrong, though. 3/3
Post #9ly8lm1PM86U2PXLKC by WPalant@infosec.exchange
2019-08-17T06:42:11Z
0 likes, 0 repeats
@leip4Ier Actually, GDPR doesn't require a company to delete all data - it might still keep some necessary data. For example, there might be an outstanding bill, in which case they still need your data. Even after that bill is paid, they might have to keep some data to satisfy audit requirements. But obviously, in the case of machine learning they will argue that it isn't personal data - it cannot be associated with any person any more. Unless https://xkcd.com/2169/ applies...
Post #9lyS0jzh4iqZDpaMQi by leip4Ier@infosec.exchange
2019-08-17T10:17:47Z
0 likes, 0 repeats
@WPalant @jerry yeah, i was thinking about such cases. it depends on what their ML system is like, and sometimes they themselves may not know. or they could just say whatever they want, i guess, as no one has a reliable way to find out what a neural network stores inside... ugh, i kind of like the general spirit of gdpr, and many of those rants about how bad it is say more about the company trying to implement it than about the law itself. but i guess it's very, very far from perfect and/or unambiguous.
Post #9lyS5ZCSdPihC0rHLE by leip4Ier@infosec.exchange
2019-08-17T10:18:40Z
0 likes, 0 repeats
@WPalant @jerry but also seems like modern technologies are often inherently orthogonal to privacy, that's sad
Post #9lyScLZQKXT0TfEJFI by leip4Ier@infosec.exchange
2019-08-17T10:24:34Z
0 likes, 0 repeats
@WPalant @jerry but also nothing stops a company from hiding some kinds of information stored about you? the current world order is kinda orthogonal to privacy, too? i mean, even google has that takeout thing, and obviously it doesn't export all the information about you. the exported archive, afaik, doesn't contain anything about what the ads targeting algorithms learned about a person. is there a way to sue them for that, and are there at least theoretical chances of winning that lawsuit?
Post #9lySmRVV72HrmmD12e by leip4Ier@infosec.exchange
2019-08-17T10:26:25Z
0 likes, 0 repeats
@WPalant (i think what they learned isn't inside an ML model, but is rather a list of keywords, etc)
Post #9lyVS6S5z5PaYWbpCq by jerry@infosec.exchange
2019-08-17T10:56:19Z
0 likes, 0 repeats
@leip4Ier @WPalant the fear of a massive fine is supposed to motivate companies to act responsibly and honestly. The GDPR does establish US-style class action lawsuits as a means for data subjects to seek compensation (I have mixed feelings - they seem like a good idea, but in the US, the lawyers are the only winners in any case)
Post #9lyVVgwEmnQmaIQ20e by jerry@infosec.exchange
2019-08-17T10:56:58Z
0 likes, 0 repeats
@leip4Ier @WPalant I would like to say that I am so happy to know someone else that uses the word “orthogonal”!
Post #9lyWP5b4o6WMzNw8QK by jerry@infosec.exchange
2019-08-17T11:06:58Z
0 likes, 0 repeats
@jerry @leip4Ier @WPalant as to whether the world is orthogonal to privacy - that’s certainly true, on its face. But in many contexts, there is a trade-off. Data-collecting apps can help us live healthier lives by motivating us to exercise or identifying a possible medical problem early. Credit scores, built on personal data, enable people to borrow money to make large purchases. And there are many other cases. But for every “good” use case, there are a bunch of bad ones.
Post #9lyd7AM94bhgDSLyAi by WPalant@infosec.exchange
2019-08-17T12:22:12Z
0 likes, 0 repeats
@leip4Ier @jerry In theory, there is a risk associated with hiding data. For example, if you ever use that data, people might notice. Or an audit discovers it. Or hackers steal it. Or a whistleblower tips off the press. And then you might be facing a massive fine. The open question currently is: how serious is that risk? My impression is that the big companies didn't take GDPR too seriously; they want to see how much they can get away with. And currently enforcement is indeed lagging behind.
Post #9lyjt8dTNAt6VMiTA0 by jerry@infosec.exchange
2019-08-17T13:38:04Z
0 likes, 0 repeats
@WPalant @leip4Ier There were over 200,000 notifications to data protection authorities in the first 9 months of the GDPR (both reported breaches and data subject complaints). I would guess it’s probably close to 300,000 now, but I can’t find any more recent data. There’s absolutely no way to investigate and adjudicate potential violations of a law at that scale, considering that each one could turn into a protracted investigation and legal battle. Good time to be a lawyer, I guess.
Post #9lzA95zfYqGVkiMtGq by leip4Ier@infosec.exchange
2019-08-17T18:32:17Z
0 likes, 0 repeats
@jerry @WPalant yeah, these are all somewhat legitimate use cases. you can opt out of some of them, although credit scores are hardly avoidable in some cases, and i hate that. but there's also that thing: laws give so much privacy to corporations (intellectual property protection, NDAs and related), and they use that privacy to violate people's privacy. like, proving that a company is doing bad things is so hard, everything is very opaque and the company won't let anyone see its internal docs...
Post #9lzAEKuFXm2VS1KeRc by leip4Ier@infosec.exchange
2019-08-17T18:33:16Z
0 likes, 0 repeats
@jerry @WPalant it's kinda understandable why governments do it, as individuals pay far less in taxes than those companies, but... it is so wrong, imo.
Post #9lzAG2AACa9rtmaLKK by leip4Ier@infosec.exchange
2019-08-17T18:33:32Z
0 likes, 0 repeats
@jerry @WPalant why though, is it so rare? ^^"
Post #9lzBCTMrDI2F4JO8bw by leip4Ier@infosec.exchange
2019-08-17T18:44:07Z
0 likes, 0 repeats
@WPalant @jerry then "but what if nobody discovers it! let's hope so!" is, it seems, the only thought all the big companies have about this law... maybe also "our lawyers are good enough". and from what i heard, google-scale corporations were mainly talking about how not to get fined, not even about how to look better in the eyes of the public. i guess it says a lot about the court system then: there's a law, there are companies that clearly break the law, and they don't even care much about it
Post #9m0DmPA4iu3FVydBaq by WPalant@infosec.exchange
2019-08-18T06:47:44Z
0 likes, 0 repeats
@leip4Ier @jerry I think it's more of: until somebody can collect sufficient proof and make it through all the court instances, five years will have passed. By that time they will have made more than enough profits to offset the losses. As long as these lawsuits are a big exception, the risk calculation will stay like that.
Post #9m0IDAqcRuyukyYCSO by leip4Ier@infosec.exchange
2019-08-18T07:37:24Z
0 likes, 0 repeats
@WPalant @jerry oh. which also says a lot about how broken the court system is. it's great that you can appeal to a higher court when the situation is difficult. but maybe corporations earning more than $100k/yr shouldn't be allowed to do that, because they do it all the time, even if the case is really obvious.
Post #9m0IOirmJDLLRubEfY by leip4Ier@infosec.exchange
2019-08-18T07:39:29Z
0 likes, 0 repeats
@WPalant @jerry also it should be illegal to take a person to court many times (without winning the cases) just as a form of revenge; it's completely fucked up that that's possible
Post #9m0aZfbKaVJRgNHC3k by WPalant@infosec.exchange
2019-08-18T11:03:09Z
0 likes, 0 repeats
@leip4Ier @jerry At least the German court system doesn't always allow appeals. However, the case is rarely so clear that the court will deny appeal. It's really hard to draw this line between getting a fair trial and misuse of the court system. It's the same with suing the same person multiple times - sometimes it is necessary.