[HN Gopher] Judge denies creating "mass surveillance program" ha...
       ___________________________________________________________________
        
       Judge denies creating "mass surveillance program" harming all
       ChatGPT users
        
       Author : merksittich
       Score  : 156 points
       Date   : 2025-06-23 18:18 UTC (4 hours ago)
        
 (HTM) web link (arstechnica.com)
 (TXT) w3m dump (arstechnica.com)
        
       | merksittich wrote:
       | Previous discussion:
       | 
       | OpenAI slams court order to save all ChatGPT logs, including
       | deleted chats
       | 
       | https://news.ycombinator.com/item?id=44185913
        
       | aydyn wrote:
        | The judge clearly doesn't care about this issue, so arguing
       | before her seems pointless. What is the recourse for OpenAI and
       | users?
        
         | Analemma_ wrote:
         | You don't have any recourse, at least not under American law.
          | This is a textbook third-party doctrine case: American law
          | and precedent are unambiguous that once you voluntarily give
          | your data to a third party-- e.g. when you send it to
          | OpenAI-- it's
         | not yours anymore and you have no reasonable expectation of
         | privacy about it. Probably people are going to respond to this
         | with a bunch of exceptions, but those exceptions all have to be
         | enumerated and granted specifically with new laws; they don't
         | exist by default, and don't exist for OpenAI.
         | 
         | Like it or not, the judge's ruling sits comfortably within the
         | framework of US law as it exists at present: since there's no
         | reasonable expectation of privacy for chat logs sent to OpenAI,
         | there's nothing to weigh against the competing interest of the
         | active NYT case.
        
           | anon7000 wrote:
           | Yep. This is why we need constitutional amendments or more
            | foundational laws around privacy that change this default.
           | Which _should_ be a bipartisan issue, if money had less
           | influence in politics.
        
             | AnthonyMouse wrote:
             | This is the perverse incentives one rather than the money
             | one. The judges want to order people to do things and the
             | judges are the ones who decide if the judges ordering
             | people to do things is constitutional.
             | 
             | To prevent that you need Congress to tell them no, but that
             | creates a sort of priority inversion: The machinery
             | designed to stop the government from doing something bad
             | unless there is consensus is then enabling government
             | overreach unless there is consensus to stop it. It's kind
             | of a design flaw. You want checks and balances to stop the
             | government from doing bad things, not enable them.
        
           | impossiblefork wrote:
           | OpenAI is the actual counterparty here though and not a third
           | party. Presumably their contracts with their users are still
           | enforceable.
           | 
           | Furthermore, if the third party doctrine is upheld in its
           | most naive form, then this would breach the EU-US Data
           | Privacy Framework. The US must ensure equivalent privacy
           | protections to those under the GDPR in order for the
           | agreement to be valid. The agreement also explicitly forbids
           | transferring information to third parties without informing
           | those whose information is transferred.
        
             | yxhuvud wrote:
             | Well, I don't think anyone is expecting the framework to
                | work this time either, after earlier tries have been
             | invalidated. It is just panicked politicians trying to kick
             | the can to avoid the fallout that happens when it can't be
             | kicked anymore.
        
               | impossiblefork wrote:
               | Yes, and I suppose the courts can't care that much about
               | executive orders. Even so, one would think that they had
               | some sense and wouldn't stress things that the
               | politicians have built.
        
               | freejazz wrote:
               | 3rd party doctrine in the US is actual law... so I'm not
               | sure what's confusing about that. The president has no
               | power to change discovery law. That's congress. Why would
               | a judge abrogate US law like that?
        
             | mrweasel wrote:
             | They probably do already, but won't this ruling force
             | OpenAI to operate separate services for the US and EU? The
             | US users must accept that their logs are stored
             | indefinitely, while an EU user is entitled to have theirs
              | deleted.
        
           | like_any_other wrote:
           | > once you voluntarily give your data to a third party-- e.g.
           | when you sent it to OpenAI-- it's not yours anymore and you
           | have no reasonable expectation of privacy about it.
           | 
           | The 3rd party doctrine is worse than that - the data you gave
           | is not only not yours anymore, it is not _theirs_ either, but
            | the government's. They're forced to act as a government
           | informant, _without any warrant requirements_. They can _say_
           | "we will do our very best to keep your data confidential",
           | and contractually bind themselves to do so, but hilariously,
           | in the Supreme Court's wise and knowledgeable legal view,
           | this does not create an "expectation of privacy", despite
           | whatever vaults and encryption and careful employee vetting
           | and armed guards standing between your data and unauthorized
           | parties.
        
             | kopecs wrote:
             | I don't think it is accurate to say that the data _becomes_
              | the government's or they have to act as an informant (I
             | think that implies a bit more of an active requirement than
             | responding to a subpoena), but I agree with the gist.
        
           | comex wrote:
           | The third-party doctrine has been weakened by the Supreme
           | Court recently, in United States v. Jones and Carpenter v.
           | United States. Those are court decisions, not new laws passed
           | by Congress. See also this quote:
           | 
           | https://en.wikipedia.org/wiki/Third-
           | party_doctrine#:~:text=w...
           | 
           | If OpenAI doesn't succeed at oral argument, then in theory
           | they could try for an appeal either under the collateral
           | order doctrine or seeking a writ of mandamus, but apparently
           | these rarely succeed, especially in discovery disputes.
        
             | otterley wrote:
             | Justice Sotomayor's concurrence in _U.S. v. Jones_ is not
            | binding precedent, so I wouldn't characterize it as
             | weakening the third-party doctrine yet.
        
           | AnthonyMouse wrote:
           | > You don't have any recourse, at least not under American
           | law.
           | 
           | Implying that the recourse is to change the law.
           | 
           | Those precedents are also fairly insane and not even
           | consistent with one another. For example, the government
           | needs a warrant to read your mail in the possession of the
           | Post Office -- not only a third party but actually part of
           | the government -- but not the digital equivalent of this when
           | you transfer some of your documents via Google or Microsoft?
           | 
           | This case is also _not_ the traditional third party doctrine
           | case. Typically you would have e.g. your private project
           | files on Github or something which Github is retaining for
           | reasons independent of any court order and then the court
           | orders them to provide them to the court. In this case the
            | judge is ordering them to retain third party data they
            | wouldn't have otherwise kept. It's not clear what the limiting
           | principle there would be -- could they order Microsoft to
            | retain any of the data on everyone's PC that _isn't_ in the
           | cloud, because their system updater gives them arbitrary code
           | execution on every Windows machine? Could they order your
           | home landlord to make copies of the files in your apartment
           | without a warrant because they have a key to the door?
        
         | mvdtnz wrote:
         | "OpenAI user" is not an inherent trait. Just use another
         | product, make it OpenAI's problem.
        
         | dylan604 wrote:
         | appealing whatever ruling this judge makes?
        
           | otterley wrote:
           | You can't appeal a case you're not a party to.
        
             | baobun wrote:
             | It's a direct answer to the question what recourse OpenAI
             | has.
             | 
             | Users should stop sending information that shouldn't be
             | public to US cloud giants like OpenAI.
        
               | otterley wrote:
               | Do you really think a European court wouldn't similarly
               | force a provider to preserve records in response to being
               | accused of destroying records pertinent to a legal
               | dispute?
        
               | baobun wrote:
               | Fundamentally, on-prem or just foregoing is the safest
               | way, yes. If one still uses these remote services it's
               | also important to be prudent about exactly what data you
               | share with them when doing so[0]. Note I did not say
               | "Send your sensitive data to these countries instead".
               | 
               | The laws still look completely different in US and EU
               | though. EU has stronger protections and directives on
               | privacy and weaker supremacy of IP owners. I do not
               | believe lawyers in any copyright case would get access to
               | user data in a case like this. There is also a gap in the
               | capabilities and prevalence of govt to force individual
               | companies or even employees to insert and maintain secret
               | backdoors with gag orders outside of court (though parts
               | of the EU seem to be working hard to close that gap
               | recently...).
               | 
               | [0]: Using it to derive baking recipes is not the same as
               | using it to directly draft personal letters. Using it
               | over VPN with pseudonym account info is not the same as
               | using it from your home IP registered to your personal
               | email with all your personals filled out and your credit
               | card linked. Running a coding agent straight on your
               | workstation is different to sandboxing it yourself to
               | ensure it can only access what it needs.
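The data-minimization point in [0] can be made concrete. Below is a minimal Python sketch of scrubbing obvious identifiers from a prompt before it leaves your machine; the regex patterns and bracketed labels are my own illustrative assumptions, not a complete PII filter.

```python
# Sketch: strip obvious identifiers from a prompt before sending it to a
# remote LLM. Patterns and labels are illustrative, not exhaustive.
import re

PATTERNS = [
    ("EMAIL", re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")),
    ("CARD", re.compile(r"\b(?:\d[ -]?){13,16}\b")),
]

def scrub(text: str) -> str:
    """Replace each match with a bracketed label so context survives."""
    for label, pattern in PATTERNS:
        text = pattern.sub(f"[{label}]", text)
    return text

print(scrub("Write to jane.doe@example.com, card 4111 1111 1111 1111."))
# → Write to [EMAIL], card [CARD].
```

A real deployment would pair something like this with the network-level measures mentioned above (VPN, pseudonymous account), since redaction alone does nothing about metadata.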
        
         | FpUser wrote:
         | >"What is the recourse for OpenAI and users?"
         | 
         | Start using services of countries who are unlikely to submit
         | data to the US.
        
         | freejazz wrote:
         | Stop giving your information to third parties with the
          | expectation that they keep it private when they won't and
         | cannot. Your banking information is also subject to subpoena...
         | I don't see anyone here complaining about that. Just the hot
         | legal issue of the day that programmers are intent on
         | misunderstanding.
        
       | RcouF1uZ4gsC wrote:
       | ""Proposed Intervenor does not explain how a court's document
       | retention order that directs the preservation, segregation, and
       | retention of certain privately held data by a private company for
       | the limited purposes of litigation is, or could be, a 'nationwide
       | mass surveillance program,'" Wang wrote. "It is not. The
       | judiciary is not a law enforcement agency.""
       | 
       | This is a horrible view of privacy.
       | 
       | This gives unlimited ability for judges to violate the privacy
       | rights of people while stating they are not law enforcement.
       | 
        | For example, if the New York Times sues on the theory that
        | people using a NoScript add-on are bypassing its paywall, can a
        | judge require that the add-on collect and retain all sites
        | visited by all its users, and then say it's ok because the
        | judiciary is not a law
       | enforcement agency?
        
         | Analemma_ wrote:
         | > This gives unlimited ability for judges to violate the
         | privacy rights of people while stating they are not law
         | enforcement.
         | 
         | See my comment above in reply to aydyn: in general, "privacy
         | rights" do not exist in American law, and as such the judge is
         | violating nothing.
         | 
         | People are always surprised to learn this, but it's the truth.
         | There's the Fourth Amendment, but courts have consistently
         | interpreted that very narrowly to mean your personal effects in
         | your possession are secure against seizure specifically by the
         | government. It does not apply to data you give to third-
         | parties, under the third-party doctrine. There are also various
         | laws granting privacy rights in specific domains, but those
         | only apply to the extent of the law in question; there is no
         | constitutional right to privacy and no broad law granting it
         | either.
         | 
         | Until that situation changes, you probably shouldn't use the
         | term "privacy rights" in the context of American law: since
         | those don't really exist, you'll just end up confusing yourself
         | and others.
        
         | lovich wrote:
         | You don't have privacy rights once you hand over your data to a
         | third party.
         | 
         | This isn't a new issue OpenAI is forcing the courts to wrestle
         | with for the first time.
        
       | strictnein wrote:
       | Maybe it's the quotes selected for the article, but it seems like
       | the judge simply doesn't get the objections. And the reasoning is
       | really strange:
       | 
       | "Even if the Court were to entertain such questions, they would
       | only work to unduly delay the resolution of the legal questions
       | actually at issue."
       | 
       | So because the lawsuit pertains to copyright, we can ignore
       | possible constitutional issues because it'll make things take
       | longer?
       | 
       | Also, rejecting something out of hand simply because a lawyer
       | didn't draft it seems really antithetical to what a judge should
       | be doing. There is no requirement for a lawyer to be utilized.
        
         | judofyr wrote:
         | > ... but it seems like the judge simply doesn't get the
         | objections. And the reasoning is really strange
         | 
         | The full order is linked in the article:
         | https://cdn.arstechnica.net/wp-
         | content/uploads/2025/06/NYT-v.... If you read that it becomes
         | more clear: The person who complained here filed a specific
         | "motion to intervene" which has a strict set of requirements.
         | These requirements were not met. IANAL, but it doesn't seem too
         | strange to me here.
         | 
         | > Also, rejecting something out of hand simply because a lawyer
         | didn't draft it seems really antithetical to what a judge
         | should be doing. There is no requirement for a lawyer to be
         | utilized.
         | 
          | This is also mentioned in the order: An individual has the
          | right to represent themselves, but a corporation does not. This
          | was filed by a corporation initially. The judge did exactly
          | what a judge is supposed to do: Interpret the law as written.
        
           | yieldcrv wrote:
            | So the arguments are sound but the procedure wasn't followed
           | so someone else just needs to follow the procedure and get
           | our chats deletable?
        
             | otterley wrote:
             | The judge went further to say the arguments weren't sound,
             | either.
        
               | Aeolun wrote:
               | They went further and said _they_ didn't find the
               | arguments sound.
               | 
               | Personally I find it really hard to see anything sound in
               | the initial order.
        
         | Alive-in-2025 wrote:
         | That appears to be the case to me too. There are so many
         | similar services that people use that could fall victim to this
          | same issue of privacy. We should have extremely strong privacy
          | laws preventing this, somehow blocking over-broad court orders.
          | And we don't. Imagine these cases:
         | 
         | 1. a court order that a dating service saves off all chats and
         | messages between people connecting on the service (instead of
         | just saving off say the chats from a suspected abuser)
         | 
         | 2. saving all text messages going through a cell phone company
         | 
         | 3. how about saving all google docs? Probably billions of these
         | a day are being created.
         | 
         | 4. And how has the govt not tried to put out a legal request
          | that Signal add backdoors and save all text messages (because
         | there will no doubt be nefarious users like our own secretary
         | of defense). I think it would take a very significant reason to
          | succeed against a private organization like Signal.
         | 
         | The power and reach of this makes me wonder if the US govt
         | already has been doing this to normal commercial services
         | (outside of phone calls and texting). I recall reading back in
         | the day they were "tapping" / legally accessing through some
         | security laws phone company trunks. And then we learned about
         | tapping google communications from Edward Snowden.
        
           | chrisweekly wrote:
           | "saving off"?
           | 
           | this is a strange turn of phrase
        
           | salawat wrote:
           | >2. saving all text messages going through a cell phone
           | company
           | 
            | Point of order, phone companies _already do that_. Third
            | Party Doctrine. I don't believe they _should_, but as of
            | right now, that's where we're at.
        
           | delusional wrote:
           | > We should have extremely strong privacy laws preventing
           | this, somehow blocking over-broad court orders
           | 
            | Quick question. Should your perceived "right to privacy"
           | supersede all other laws?
           | 
           | To extrapolate into the real world. Should it be impossible
           | for the police to measure the speed of your vehicle to
           | protect your privacy? Should cameras in stores be unable to
           | record you stealing for fear of violating your privacy?
        
             | gridspy wrote:
             | Your two examples don't map to the concern about data
             | privacy.
             | 
             | Speed cameras only operate on public roads. The camera in
             | the store is operated by the store owner. In both cases one
             | of the parties involved in the transaction (driving,
             | purchasing) is involved in enforcement. It is clear in both
             | cases that these measures protect everyone and they have
             | clear limits also.
             | 
             | Better examples would be police searching your home all the
             | time, whenever they want (This maps to device encryption).
             | 
             | Or store owners surveilling competing stores / forcing
             | people to wear cameras 24/7 "to improve the customer
             | experience" (This maps to what Facebook / Google try to do,
             | or what internet wire tapping does).
        
             | andyferris wrote:
             | I think there's an idea akin to Europe's "right to be
             | forgotten" here.
             | 
             | We can all observe the world in the moment. Police can
             | obtain warrants to wiretap (or the digitial equivalent)
             | suspects in real-time. That's fine!
             | 
             | The objection is that we are ending up with laws and
             | rulings that require a recording of history of everyone by
             | everyone - just so the police can have the convenience of
             | trawling through data everyone reasonably felt was private
             | and shouldn't exist except transiently? Not to mention that
             | perhaps the state should pay for all this commercially
             | unnecessary storage? Our digital laws are so out-of-touch
             | with the police powers voters actually consented to - that
             | mail (physical post) and phone calls shall not be in
             | intercepted except under probable cause (of a particular
             | suspect performing a specific crime) and a judge's consent.
             | Just carry that concept forward.
             | 
             | On a technical level, I feel a "perfect forward secrecy"
             | technique should be sufficient for implementers. A warrant
             | should have a causal (and pinpoint) effect on what is
             | collected by providers. Of course we can also subpoena
             | information that everyone reasonably expected was recorded
             | (i.e. not transient and private). This matches the
             | "physical reality" of yesteryear - the police can't execute
              | a warrant for an unrecorded person-to-person conversation
              | that happened two weeks ago; you need to kindly ask one of
              | the conversants (who have their own rights to privacy /
             | silence, are forgetful, and can always "plead the 5th").
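The "perfect forward secrecy" idea above can be illustrated with a toy ephemeral Diffie-Hellman exchange. This is a sketch only, using a deliberately small well-known prime; real systems use X25519/TLS via vetted libraries, and every name here is invented for the example.

```python
# Toy illustration of forward secrecy via ephemeral Diffie-Hellman.
# The prime 2**127 - 1 is far too small for real use; sketch only.
import hashlib
import secrets

P = 2**127 - 1  # a well-known Mersenne prime, illustrative only
G = 3

def ephemeral_keypair():
    """Fresh private/public pair, generated per session, discarded after."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_keypair()  # e.g. the user
b_priv, b_pub = ephemeral_keypair()  # e.g. the service

# Both sides derive the same session key; the key itself never travels.
a_key = hashlib.sha256(pow(b_pub, a_priv, P).to_bytes(16, "big")).hexdigest()
b_key = hashlib.sha256(pow(a_pub, b_priv, P).to_bytes(16, "big")).hexdigest()
assert a_key == b_key

# Forward secrecy: once the ephemeral private keys are destroyed, a later
# preservation order cannot recover this session's key.
del a_priv, b_priv
```

The causal property the comment asks for falls out naturally: a warrant issued today can compel collection of future sessions, but keys for past sessions no longer exist anywhere.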
        
         | freejazz wrote:
         | >So because the lawsuit pertains to copyright, we can ignore
         | possible constitutional issues because it'll make things take
         | longer?
         | 
         | What constitutional issues do you believe are present?
         | 
         | > There is no requirement for a lawyer to be utilized.
         | 
         | Corporations must be represented by an attorney, by law. So
         | that's not true outright. Second, if someone did file something
         | pro-se, they might get a little leeway. But the business isn't
         | represented pro-se, so why on earth would the judge apply a
          | lower standard appropriate for a pro-se party to a
         | sophisticated law firm, easily one of the largest and best in
         | the country?
         | 
         | When you are struggling to reason around really straightforward
         | issues like that, it does not leave me with confidence about
         | your other judgments regarding the issues present here.
        
           | paulddraper wrote:
           | > What constitutional issues do you believe are present?
           | 
           | 4th Amendment (Search and Seizure)
        
             | kopecs wrote:
             | Do you think the 4th amendment enjoins courts from
             | requiring the preservation of records as part of discovery?
             | The court is just requiring OpenAI to maintain records it
             | already maintains and segregate them. Even if one thinks
             | that _is_ a government seizure, which it isn't---See
             | Burdeau v. McDowell, 256 U.S. 465 (1921); cf. Walter v.
             | United States, 447 U.S. 649, 656 (1980) (discussing the
             | "state agency" requirement)---no search or seizure has even
             | occurred. There's no reasonable expectation of privacy in
             | the records you're sending to OpenAI (you know OpenAI has
             | them!!; See, e.g., Smith v. Maryland, 442 U.S. 735 (1979))
             | and you don't have any possessory interest in the records.
             | See, e.g., United States v. Jacobsen, 466 U.S. 109 (1984).
        
               | alwa wrote:
               | This would help explain why entities with a "zero data
               | retention" agreement are "not affected," then, per
               | OpenAI's statement at the time? Because records aren't
               | created for those queries in the first place, so there's
               | nothing to retain?
        
               | kopecs wrote:
              | AIUI, because if you have a zero data retention agreement
              | you are necessarily not in the class of records at issue
              | (since enterprise customers' records are not affected,
              | again AIUI per plaintiffs' original motion, which might be
              | because they don't think they're relevant for market harm
              | or something).
               | 
               | So I think that this is more so an artefact of the
               | parameters than an outcome of some mechanism of law.
        
           | strictnein wrote:
           | > When you are struggling to reason around really
           | straightforward issues like that, it does not leave me with
           | confidence about your other judgments regarding the issues
           | present here.
           | 
           | Or, perhaps, that's not something known by most. I didn't
           | struggle to understand that, I simply didn't know it. Also,
           | again, the article could have mentioned that, and I started
           | my statement by saying maybe the article was doing a bad job
           | conveying things.
           | 
           | > What constitutional issues do you believe are present?
           | 
           | This method of interrogation of online comments is always
           | interesting to me. Because you seem to want to move the
           | discussion to that of whether or not the issues are valid,
           | which wasn't what I clearly was discussing. When you are
           | struggling to reason around really straightforward issues
           | like that, it does not leave me with confidence about your
           | other judgments regarding the issues present here.
        
             | Aeolun wrote:
             | Now that you've both done it, can we stop with the ad
             | hominem?
        
       | cactusplant7374 wrote:
       | Is there any proof that ChatGPT was deleting chats? I would think
       | they would keep them to use as training data.
        
         | colkassad wrote:
         | There should be a law about what "delete" means for online
         | services. I used to delete old comments on reddit until their
          | AI caught up to my behavior and shadow-banned my 17-year-old
         | account. As soon as that happened, I could see every comment I
         | ever deleted in my history again. The only consolation is that
         | no one but me could see my history anymore.
        
           | gunalx wrote:
           | As long as you are in the eu GDPR gives you the right to be
           | forgotten.
        
             | derwiki wrote:
             | Or CCPA in California
        
               | acka wrote:
               | They're both paper tigers. The CLOUD Act and that massive
               | data center in Utah trump* each of them respectively.
               | What happens in the US stays in the US, delete button or
               | not.
               | 
               | * Deliberately ambiguous.
        
         | ianlevesque wrote:
         | This kind of conspiracy theory comes up a lot on here. Most of
          | these products have the option to allow or deny that, and
          | contrary to the opinions here, those policies are then followed.
         | This whole episode is news because it violates that.
        
           | akimbostrawman wrote:
           | the only "conspiracy" here is acting like ignoring privacy
            | laws and toggles is not the default for these corporations and
           | has been for over a decade.
        
       | unyttigfjelltol wrote:
       | The "judge" here is actually a magistrate whose term expires in
       | less than a year.[1]
       | 
       | Last time I saw such weak decision-making from a magistrate I was
       | pleased to see they were not renewed, and I hope the same for
       | this individual.
       | 
       | [1]
       | https://nysd.uscourts.gov/sites/default/files/2025-06/Public...
        
         | freejazz wrote:
         | I don't think a routine application of 3rd party doctrine
         | should sink a magistrate judge.
        
           | unyttigfjelltol wrote:
           | So in your view, there is no expectation of privacy in
           | anything typed into the Internet. And, if a news
           | organization, or a blogger, or whoever, came up with some
           | colorable argument to discover everything everyone types into
           | the Internet, and sued the right Internet hub-- you think
           | this is totally routine and no one should be annoyed in the
           | least, and moreover no one should be allowed to intervene or
           | move for protective order, because it would be more
           | convenient for the court to just hand over all the Internet
           | to the news or blogger or whoever.
           | 
           | It's precisely that perspective that I think _should_ sink a
           | magistrate, hundreds of times over.
        
         | cvoss wrote:
         | Not sure why scare quotes are warranted here. Your so-called
         | '"judge"' is, in fact, a judge.
         | 
         | https://www.uscourts.gov/about-federal-courts/types-federal-...
        
           | unyttigfjelltol wrote:
           | Magistrate judges are more variable, not subject to Senate
           | confirmation, do not serve for life, render decisions that
           | very often are different in character from those of regular
           | judges-- focusing more on "trees" than "forest". Without
           | consent, their scope of duties is limited and they cannot
           | simply swap in for a district judge. They actually are
           | supervised by a district judge, and appeals, as it were, are
           | directed to that officer not an appellate court.
           | 
           | In a nutshell, I used quotes to indicate how the position was
            | described by the article. These judicial officers are not
           | interchangeable with judges in the federal system, and in my
           | experience this distinction is relevant to both why this
           | person issued the kind of decision they did, and what it
           | means for confidence in the US justice system.
        
       | mac-attack wrote:
       | To me, this is akin to Google saying that they don't want to
       | comply with a court order because it would be a privacy
       | invasion. I feel like OpenAI framed the issue as a privacy
       | conversation, and some news journalists are going along with it
       | without questioning the source and their current privacy policy
       | re: data retention and data-sharing affiliates, vendors, etc.
       | 
       | It takes 30 seconds to save the privacy policy and upload it to
       | an LLM and ask it questions and it quickly becomes clear that
       | their privacy policy allows them to hold onto data indefinitely
       | as is.
        
       | akimbostrawman wrote:
       | fighting microsoft to get the illusion of privacy is the
       | modern-day equivalent of tilting at windmills
        
       | notnullorvoid wrote:
       | Even if OpenAI and other LLM providers were prohibited by law not
       | to retain the data (opposite of this forced retention), no one
       | should trust them to do so.
       | 
       | If you want to input sensitive data into an LLM, do so locally.
        
         | sebastiennight wrote:
         | > prohibited by law not to retain the data
         | 
         | is the same as forced retention. You've got a double negative
         | here that I think you didn't intend.
        
       | HenryBemis wrote:
       | I remember back in the day, when I made the mistake of using
       | Facebook and iPhones, noticing that a) Facebook never actually
       | deleted anything and b) iMessage was also not deleting anything
       | (both were merely hiding it).
       | 
       | This is why in this part of the world we have GDPR, and it
       | would be amazing to see OpenAI receiving penalties of billions
       | of euros, while at the same time a) the EU will receive more
       | money to spend, and b) the US apparatus will grow stronger
       | because it will know everything about everyone (the very few
       | things it didn't already know via the FAANGs).
       | 
       | Lately I have been thinking that "they" play chess with our
       | lives, and we are sleepwalking into either a Brave New World
       | (for the elites) and/or a 1984/Animal Farm world for the rest.
       | For a more pleasant analogy, the humans in WALL-E; for a darker
       | one, the humans in the Matrix.
        
         | derwiki wrote:
         | CCPA also allows for you to have your data deleted
        
       | freejazz wrote:
       | "creating mass surveillance program harming all ChatGPT users"
       | is just taking the lawyers' words out of their mouths at face
       | value. Totally ridiculous. And of course it's going to lead to
       | extreme skepticism from the crowd here when it's put forward
       | that way. Another way to describe this: "legal discovery
       | process during a lawsuit continues on as it normally would in
       | any other case"
        
       | lifeisstillgood wrote:
       | We (various human societies) do need to deal with this new
       | ability to surveil every aspect of our lives. There are clear
       | and obvious benefits in the future - medicine and epidemiology
       | will have enormous reservoirs of data to draw on, entire new
       | fields of mass behavioural psychology will come into being (I
       | call it Massive Open Online Psychology, or MOOP), and we might
       | even find governments able to use minute-by-minute data on
       | their citizens to, you know, provide services to the ones they
       | miss...
       | 
       | But all of this assumes a legal framework we can trust - and I
       | don't think this comes into being piecemeal with judges.
       | 
       | My personal take is that data that would not exist, or would
       | be different, without the existence or activity of a natural
       | human must belong to that human - and that it can only be held
       | in trust without explicit payment to that human if the data is
       | used in the best interests of the human (something something
       | criminal notwithstanding)
       | 
       | Blathering on a bit I know, but I think "in the best interests
       | of the user / citizen" is a really high and valuable bar, and
       | also that a default of "if my activities create or enable the
       | data, it belongs to me" really forces data companies to think.
       | 
       | I'd be interested in some thoughts.
        
         | meowkit wrote:
         | Zero knowledge proofs + blockchain stream payments + IPFS or
         | similar based storage with encryption and incentive mechanisms.
         | 
         | It's still outside the Overton window (especially on HN), but
         | the only way that I've seen where we can get the benefits of
         | big data and maintain privacy is by locking the data to the
         | user and not aggregating it in all these centralized silos that
         | then are incentivized to build black markets around that data.
        
           | warkdarrior wrote:
           | How do you apply ZKPs to ChatGPT queries?
        
             | NoImmatureAdHom wrote:
             | As far as cryptographic solutions go: what would be ideal
             | is homomorphic encryption, where the server can do the
             | calculations on data it can't decrypt (your query) and
             | send you something back that only _you_ can decrypt.
             | Assuming that's unworkable, we could still have anonymity
             | via cryptocurrency payments for tokens (don't do
             | accounts) + IPFS or Tor or similar. You can carry around
             | your query + answer history with you.
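       [Editor's aside] The homomorphic-encryption idea in the comment
       above can be made concrete with a toy sketch. The example below
       uses textbook (unpadded) RSA, which happens to be multiplicatively
       homomorphic: a server can multiply two ciphertexts without ever
       decrypting them, and the client recovers the product of the
       plaintexts. This is only an illustration of the property, not a
       secure scheme and not any vendor's actual design; a real private
       ChatGPT-style query would need fully homomorphic encryption, which
       supports arbitrary computation and far larger parameters.

       ```python
       # Toy demo: textbook RSA is multiplicatively homomorphic.
       # Unpadded RSA is insecure in practice; this only shows that a
       # server can combine ciphertexts it cannot decrypt.

       def egcd(a, b):
           """Extended Euclid: returns (g, x, y) with a*x + b*y == g."""
           if b == 0:
               return a, 1, 0
           g, x, y = egcd(b, a % b)
           return g, y, x - (a // b) * y

       def modinv(a, m):
           """Modular inverse of a mod m (requires gcd(a, m) == 1)."""
           g, x, _ = egcd(a, m)
           assert g == 1
           return x % m

       # Tiny key for readability; real keys use 2048+ bit moduli.
       p, q = 61, 53
       n = p * q                    # public modulus (3233)
       phi = (p - 1) * (q - 1)
       e = 17                       # public exponent
       d = modinv(e, phi)           # private exponent

       def encrypt(m):
           return pow(m, e, n)

       def decrypt(c):
           return pow(c, d, n)

       m1, m2 = 7, 9
       c1, c2 = encrypt(m1), encrypt(m2)

       # The "server" multiplies ciphertexts without knowing m1 or m2...
       c_product = (c1 * c2) % n

       # ...and the client decrypts the product of the plaintexts.
       assert decrypt(c_product) == (m1 * m2) % n
       print(decrypt(c_product))  # 63
       ```

       The key point is the middle step: `c_product` is computed purely
       from ciphertexts, so the party holding only the public key learns
       nothing about the inputs, yet the result decrypts correctly.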
        
       | tptacek wrote:
       | This doesn't seem especially newsworthy. Oral arguments are set
       | for OpenAI itself to oppose the preservation order that has
       | everyone so (understandably) up in arms. Seems unlikely that two
       | motions from random ChatGPT users were going to determine the
       | outcome in advance of that.
        
         | gridspy wrote:
         | Seems that a judge does not understand the impact of asking
         | company X to "retain all data" and is unwilling to rapidly
         | reconsider. Part of what makes this newsworthy is the impact of
         | the initial ruling.
        
       | Akranazon wrote:
       | > Judge denies creating "mass surveillance program" harming all
       | ChatGPT users
       | 
       | What a horribly worded title.
       | 
       | A judge rejected the creation of a mass surveillance program?
       | 
       | A judge denied that creating a mass surveillance program harms
       | all ChatGPT users?
       | 
       | A judge denied that she created a mass surveillance program, and
       | its creation (in the opinion of the columnist) harms all ChatGPT
       | users?
       | 
       | The judge's act of denying resulted in the creation of a mass
       | surveillance program?
       | 
       | The fact that a judge denied what she did harms all ChatGPT
       | users?
       | 
       | (After reading the article, it's apparently the third one.)
        
         | elefanten wrote:
         | The third one is the only correct way to interpret the title.
        
       | xbar wrote:
       | "Judge creates mass surveillance program; denies it."
        
         | o11c wrote:
         | I _really_ wish we had broad laws requiring telling the truth
         | about concrete things (in contracts, by government officials,
         | etc.). But we don't even have any real enforcement for
         | blatant perjury.
        
       | delusional wrote:
       | It's crazy how much I hate every single top level take in this
       | thread.
       | 
       | Real human beings' actual real work is allegedly being abused
       | to commit fraud at a massive scale, robbing those artists of
       | the ability to sustain themselves. Your false perception of
       | intimacy while asking the computer Oracle to write you smut
       | does not trump the fair and just discovery process.
        
       | devmor wrote:
       | I find it really strange how many people are outraged or shocked
       | about this.
       | 
       | I have to assume that they are all simply ignorant of the fact
       | that this exact same preservation of your data happens in
       | every other service you use, other than those that are
       | completely E2EE like Signal chats.
       | 
       | Gmail is preserving your emails and documents. Your cell provider
       | is preserving your texts and call histories. Reddit is preserving
       | your posts and DMs. Xitter is preserving your posts and DMs.
       | 
       | This is not to make a judgement about whether or not this should
       | be considered acceptable, but it is the de facto state of online
       | services.
        
       ___________________________________________________________________
       (page generated 2025-06-23 23:00 UTC)