[HN Gopher] Dramatic growth in mental-health apps has created a risky industry
___________________________________________________________________
Dramatic growth in mental-health apps has created a risky industry
Author : axiomdata316
Score : 24 points
Date : 2021-12-20 22:05 UTC (54 minutes ago)
(HTM) web link (www.economist.com)
(TXT) w3m dump (www.economist.com)
| rbartelme wrote:
| https://archive.md/pog41
| Pigalowda wrote:
| From the article:
|
| "No universal standards for storing "emotional data" exist. John
| Torous of Harvard Medical School, who has reviewed 650 mental-
| health apps, describes their privacy policies as abysmal. Some
| share information with advertisers. "When I first joined
| BetterHelp, I started to see targeted ads with words that I had
| used on the app to describe my personal experiences," reports one
| user"
|
       | I could drum up a dark, fantastical scenario of HR departments
       | buying this data and screening candidates with it. Hopefully this
       | patient data gets HIPAA treatment and ends up better protected.
| ImaCake wrote:
       | The problem is that there is no requirement for these apps to
       | actually _work_. They might work, or they might not, but that
       | doesn't really affect whether scientists get grants. The apps
       | make the funders/government look like they care about mental
       | health, and they are _cheap_ compared to more traditional mental-
       | health and psychiatric therapies. The scientists are desperate
       | for the next round of grant funding and are happy to produce
       | something that makes the government funding them look good.
|
| This kind of thing is pretty common in science. It can be hard to
| see from the outside, but it is so obvious once you have been
| involved in grant writing.
| eganist wrote:
       | Disclosure: I hold an equity position in an end-to-end encrypted,
       | trust-on-first-use communications app. It's not really relevant
       | to this conversation, so there's no value in naming them.
|
       | I'm not terribly surprised at the number of services that have
       | popped up in the last few years. But considering the sensitivity
       | of the data (in many cases, some of our deepest feelings on
       | topics we wouldn't want just anyone to hear about), I'm surprised
       | at the lack of ventures working to distance themselves from the
       | content of the conversations. In this specific niche, protection
       | of data would seem to be a much bigger selling point than in most
       | others.
|
       | I'd imagine any suitably well-funded chat service with end-to-end
       | encryption could fund a subsidiary venture to build a therapy-
       | focused implementation. Especially in the world of therapy,
       | hand-written notes would probably be preferred anyway, so there
       | shouldn't be much data to store beyond directory data for
       | individual users of the service and card-transaction pointers to
       | a third-party processor, plus a handful of other bits that still
       | shouldn't be anywhere near as compromising as the actual
       | _contents of conversation._
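       |
       | To make that concrete, here's a rough sketch of the only records
       | such a server might need to hold (a hypothetical schema; the
       | names and fields are mine, assuming client-generated keys and an
       | external payment processor):
       |
       |     # Hypothetical sketch: the only records an E2E-encrypted
       |     # therapy-chat server persists. Illustrative only, not any
       |     # real service's schema.
       |     from dataclasses import dataclass
       |
       |     @dataclass
       |     class DirectoryEntry:
       |         user_id: str         # opaque identifier, not a name
       |         public_key: bytes    # client-generated; the private
       |                              # key never leaves the device
       |         display_handle: str  # pseudonym chosen by the user
       |
       |     @dataclass
       |     class PaymentPointer:
       |         user_id: str
       |         processor_token: str  # opaque token held by a third-
       |                               # party payment processor
       |
       |     # Notably absent: message contents, transcripts, session
       |     # notes. Ciphertext is relayed but unreadable server-side.
       |
       | Under that model, even a breach or a subpoena surfaces
       | pseudonyms, public keys, and payment tokens rather than
       | conversations.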
|
| Open to discussion on the topic. There's probably an angle I'm
| not seeing.
| rbartelme wrote:
       | I think it might depend on whether or not U.S. insurance
       | companies are involved. If so, HIPAA is a real liability for a
       | company. For example, a former colleague worked at a high-profile
       | private research facility that had its HIPAA firewall breached,
       | and the entire department was fired. Assuming that chat logs on
       | these apps could fall into that same class of data in the U.S.,
       | that's a huge amount of risk/liability for a company.
___________________________________________________________________
(page generated 2021-12-20 23:00 UTC)