[HN Gopher] Tell HN: Otter.ai bot recording meetings without con...
       ___________________________________________________________________
        
       Tell HN: Otter.ai bot recording meetings without consent
        
       I occasionally use Otter.ai to transcribe when I'm multitasking.
       Recently they made an update, which I carefully opted out of, to
       automatically join every meeting through my Google Calendar and
       transcribe it. Screenshots prove I had the feature disabled.  The
       bot proceeded to join two confidential meetings on my behalf and
       record the whole thing, then email every member an absurd,
       inaccurate "outline" after.  I am not much of a privacy person but
       I feel completely abused in this situation. I have opened a support
       ticket with screenshots but there is no response, and according to
       Twitter they are essentially not reviewing tickets from free users
       at the moment.  So just a heads-up to the HN community!  Are there
       other, more privacy oriented transcription services anyone can
       recommend?
        
       Author : arcticfox
       Score  : 473 points
       Date   : 2022-09-07 14:27 UTC (8 hours ago)
        
       | bombcar wrote:
       | The big names provide recording and transcription now - Zoom has
       | it, Teams has it (Teams apparently does it _realtime and live_
       | and it's not bad, great for the deaf!).
       | 
       | I suppose those are part of paid plans, and they trigger the
       | "your shit is being recorded, dude." warnings.
        
         | theguyovrthere wrote:
         | Webex...
        
         | pbreit wrote:
         | That's not the news. The news is that otter.ai did it
         | automatically and even though the user had it disabled.
        
         | croes wrote:
         | Don't they need the permission of all participants to
         | transcribe?
        
           | nvr219 wrote:
           | No you do not - at least not technically! I had to look into
           | this at work. I looked at WebEx, Teams, and Zoom - all three
           | let auto-transcribe just roll. I think on one of them it
           | gives a quick popup saying it's transcribing but no consent
           | required. Contrast with audio/video recording where Zoom lets
           | you either consent or leave the meeting. I asked our legal
           | counsel and they said it's kind of iffy whether live auto-
           | transcription counts as recording - didn't seem like a
           | settled matter.
        
             | smachiz wrote:
             | In Teams at least, it's called transcription, but really
             | it's captioning. It's not recorded unless you also turn on
             | recording.
        
               | bombcar wrote:
               | Can you cut the text out of the caption window before
               | leaving? In the one I saw, it was appearing in chat (I
               | think?).
        
               | smachiz wrote:
               | Maybe? But you can also point a camera at your screen and
               | record the whole thing...or do it in software with
               | screen/audio capture.
               | 
               | The point is the _app_ isn't recording without telling
               | people just to do captioning.
               | 
               | No app prevents bad actors from recording.
        
             | PeterisP wrote:
             | The big difference here is that in such a scenario all of
             | the participants are explicitly using Teams or Zoom, and
             | they have technically been "notified about recording" by
             | the terms and conditions they accepted when they started
             | using Teams or Zoom, which say that Microsoft or Zoom
             | receives the recordings. However, in the bot scenario,
             | none of the other participants are users of otter.ai;
             | they have no relationship with them and so can't grant
             | any permissions.
        
           | TylerE wrote:
            | Depends on the jurisdiction. In many states you only need
            | 1st party consent to record a call. If you're ON the call
            | (as opposed to, say, wiretapping it...) you're your own
            | first party.
        
           | resoluteteeth wrote:
           | I think as long as you notify people and give them the
           | opportunity to disconnect that's generally considered
           | consent, as with 800 numbers where they tell you calls are
           | recorded for quality assurance or whatever.
        
             | noasaservice wrote:
             | The company who predominantly works in that space is called
             | "CallMiner".
        
         | bvm wrote:
         | Zoom does realtime too
        
         | dleslie wrote:
         | Slack even has it.
        
         | extra88 wrote:
         | Zoom's live and post-meeting transcription happens to be done
         | using licensed Otter.ai code.
        
       | klyrs wrote:
       | Depending on where you live, that might violate wiretapping laws.
       | Most of North America requires at least one party consent.
        
         | jwandborg wrote:
          | It might not count as wiretapping at all, since it's about
          | transcription, not audio recording.
         | 
         | If the transcription service's presence in the call isn't
         | hidden from other participants, is it still wiretapping?
        
           | klyrs wrote:
           | I'll be damned, that appears to be a huge bloody loophole.
           | 
           | https://www.legaltranscriptionservice.com/wire-tap-
           | transcrip...
        
         | MrWiffles wrote:
          | I'm not in agreement with this idea, but I wonder if these
          | companies could legally be considered a participant and
          | therefore the one party consenting...
        
           | [deleted]
        
           | bigdict wrote:
           | Interesting thought, that's something that could be buried
           | deep in a EULA.
        
           | dehrmann wrote:
           | It probably counts as one-party consent, but not two.
           | California requires two.
        
             | [deleted]
        
             | wnoise wrote:
             | Although the standard description is "two-party", it's
             | almost always really "all-party".
        
       | _narendra_ wrote:
       | > Are there other, more privacy oriented transcription services
       | anyone can recommend?
       | 
       | Everyone seems to be concerned about otter.ai and their bad
       | practices. Can someone please answer this question as well?
        
       | ricwo wrote:
       | > Are there other, more privacy oriented transcription services
       | anyone can recommend?
       | 
       | Still early days, but we're working on a privacy-first solution
       | (cogram.com). We're looking for beta testers at the moment.
       | 
       | Email us at founders@cogram.com if you're interested!
        
       | rwhitman wrote:
       | I rely on Otter pretty heavily these days. 100% agree that
       | recording and sending the transcript to meeting attendees without
       | their knowledge is a really bad move from several angles.
       | 
       | I work sales calls set up with Calendly and Otter joins them
       | all. These are very technical, so normally it's fantastic -
       | EXCEPT if we start talking early or the prospect doesn't follow
       | the invite and never joins the call, in which case they would
       | get a transcript of my team's chatter. I learned not to allow
       | Otter to join the call until everyone is in attendance.
       | 
       | What's more frustrating is that you can't disable this auto-
       | email "feature" unless you are on a business plan of some sort.
       | But I have a paid plan through the iPhone app and apparently
       | can't convert it to a business plan associated with my company.
       | So there's no good way to disable it.
       | 
       | I get the network effect of referral business but sharing private
       | conversations without consent is not the way to achieve it.
        
       | fxtentacle wrote:
       | I would love to recommend it already, but my self-built privacy
       | oriented real-time transcription service isn't ready for release
       | yet. You can join my email waiting list [1], though.
       | 
       | I tried to find something that would stand up to the scrutiny
       | of a German "Ausschreibung", which is a government call for
       | bids. They
       | require things to be GDPR-compliant by law, and they recently
       | disqualified companies for using on-prem solutions by US Cloud
       | providers with the argument that if it's a US company, the NSA
       | can still force them to disclose GDPR-protected data. I reviewed
       | all offerings I could find on the market. I couldn't find
       | anything good, which is when I decided to build my own.
       | 
       | For what it's worth, I also published a paper on improving German
       | ASR already [4].
       | 
       | In my opinion, the only 100% offline and, hence, 100% private
       | solutions that are ready to use right now are VOSK and Coqui STT.
       | But their recognition quality is atrocious. Like pre-2016 word-
       | error-rate on LibriSpeech / CommonVoice. Then there's
       | Scribosermo, which is based on converted NVIDIA models, so it
       | comes with a lot of rules for how you're allowed to use it; its
       | quality is still worse than all the US clouds, and it's quite
       | slow, with around a 2s delay.
       | 
       | In addition to that, I could only find companies who more or less
       | openly re-sell the cloud AI solutions from Microsoft, Amazon,
       | Google, or Alibaba. But that won't fly for German government bids
       | [2].
       | 
       | I believe your best option is to wait for me ;) to finish my
       | local tool [1] which will have a WebRTC server so that you can
       | just script it however you want :) And your second best option is
       | probably to use HuggingFace Transformers with Facebook's pre-
       | trained base models [3].
       | 
       | [1]
       | https://madmimi.com/signups/f0da3b13840d40ce9e061cafea6280d5...
       | [2]
       | https://gdprhub.eu/index.php?title=VK_Baden-W%C3%BCrttemberg...
       | [3] https://huggingface.co/blog/fine-tune-wav2vec2-english [4]
       | https://paperswithcode.com/paper/tevr-improving-speech-recog...
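       | 
       | For the second option, here is a minimal sketch of offline
       | transcription with HuggingFace Transformers and a pretrained
       | wav2vec2 checkpoint along the lines of [3] (assumes torch,
       | transformers and soundfile are installed and that meeting.wav
       | is 16 kHz mono; untested, adapt as needed):
       | 
       |   # Offline English ASR with a pretrained wav2vec2 model.
       |   import torch
       |   import soundfile as sf
       |   from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
       | 
       |   name = "facebook/wav2vec2-base-960h"
       |   processor = Wav2Vec2Processor.from_pretrained(name)
       |   model = Wav2Vec2ForCTC.from_pretrained(name)
       | 
       |   # Load a 16 kHz mono WAV file (resample beforehand if needed).
       |   speech, rate = sf.read("meeting.wav")
       |   inputs = processor(speech, sampling_rate=rate,
       |                      return_tensors="pt")
       | 
       |   with torch.no_grad():
       |       logits = model(inputs.input_values).logits
       | 
       |   # Greedy CTC decoding: most likely token per frame.
       |   ids = torch.argmax(logits, dim=-1)
       |   print(processor.batch_decode(ids)[0])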
        
       | propogandist wrote:
       | AI bot covertly joins company meetings to record conversations to
       | monetize freemium users.
        
       | fudged71 wrote:
       | I have been a user of Grain for a while, and I appreciate that it
       | auto-joins all of my meetings. However, in a couple of
       | situations it has joined calls with non-tech people who were
       | absolutely confused about "what this is", and I didn't see an
       | easy way to disable it ahead of time for specific meetings or
       | specific participants.
       | 
       | These tools need a lot more UX work considering the sensitivity
       | of the outcomes.
        
       | Bakary wrote:
       | Is there such a thing as a simple audio transcribing system that
       | can turn a recording into a text file? I have this problem where
       | I sometimes forget a sentence someone JUST told me. And often my
       | notes aren't sufficient compared to a verbatim recording because
       | I'm liable to forget important contextual info very rapidly. But
       | dealing with raw audio files is a PITA.
        
         | lathiat wrote:
         | macOS added this to the OS
         | 
         | https://www.maketecheasier.com/enable-live-captions-ios-maco...
         | 
          | Google Meet also has it built into the client.
         | 
         | Edit: previously incorrectly linked to
         | https://support.apple.com/en-au/guide/mac-help/mchlc1cb8d54/...
        
           | martimarkov wrote:
           | That's not transcribing. That's just a feature turning CC on
           | and off in a unified way, no?
        
             | lathiat wrote:
             | You're right that's the wrong link. It's part of macOS
             | Ventura (not yet released)
             | 
             | Details: https://www.maketecheasier.com/enable-live-
             | captions-ios-maco...
        
         | kevmo314 wrote:
         | Turn on closed captioning?
        
           | Bakary wrote:
           | The more uncomfortable truth is that I'd rather be able to
           | record on a separate device because I will get in trouble for
           | it if it comes to light. It is legal in my region, but people
           | will be (very rightfully) creeped out by it and are unlikely
           | to ever buy my short term memory sob story.
           | 
           | OP mentioned multitasking so maybe they had a similar version
           | of this problem. People aren't going to be happily recorded
           | just for the purpose of someone wanting to preserve their own
           | energy during meetings.
        
             | closewith wrote:
             | > It is legal in my region, but people will be (very
             | rightfully) creeped out by it and are unlikely to ever buy
             | my short term memory sob story.
             | 
             | Okay, so maybe it's best if you don't do it then?
        
         | resoluteteeth wrote:
          | The VOSK library is good for transcription, but there seems
          | to be a lack of good, simple command-line frontends for
          | offline transcription of audio files.
          | 
          | mp4grep works and I've been using it, but it has some
          | unnecessary features if this is all you want to do (it's
          | mainly designed to cache the transcriptions and let you
          | search them rather than just write them to a text file);
          | hopefully someone will make a simpler command-line
          | transcription tool.
        
           | password4321 wrote:
           | https://alphacephei.com/vosk/install#usage-examples
           | demonstrates the bare-bones vosk-transcriber sample, and
           | there's also https://www.assemblyai.com/blog/getting-started-
           | with-espnet
           | 
           | I wasn't able to play with
           | https://github.com/o-oconnell/mp4grep on ARM.
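            | 
            | For the plain "just write a text file" case, a bare-bones
            | script along these lines should do it (a sketch assuming
            | the standard vosk Python API, a downloaded model directory
            | named "model", and a 16 kHz mono WAV input; untested):
            | 
            |   # Minimal offline transcription of a WAV file with VOSK.
            |   import json
            |   import wave
            |   from vosk import Model, KaldiRecognizer
            | 
            |   wf = wave.open("recording.wav", "rb")  # 16 kHz mono PCM
            |   rec = KaldiRecognizer(Model("model"), wf.getframerate())
            | 
            |   parts = []
            |   while True:
            |       data = wf.readframes(4000)
            |       if len(data) == 0:
            |           break
            |       if rec.AcceptWaveform(data):
            |           parts.append(json.loads(rec.Result())["text"])
            |   parts.append(json.loads(rec.FinalResult())["text"])
            | 
            |   with open("recording.txt", "w") as out:
            |       out.write(" ".join(parts))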
        
       | andrewmutz wrote:
       | Tangential question: I know that wiretapping laws in many states
       | require everyone to be notified if a recording is being made.
       | Does anyone know if the same applies to a transcription?
        
       | qskousen wrote:
       | My work just started using https://fireflies.ai/ , no affiliation
       | but it seems to work alright.
        
       | garysahota93 wrote:
       | This is why I use Clari Wingman instead.
        
       | [deleted]
        
       | cedricd wrote:
       | I've been liking https://fathom.video a lot. Integrates well with
       | Zoom. I think you can choose which recordings are shared with
       | teams -- and their transcription is pretty good.
        
       | sorokod wrote:
       | "I am not much of a privacy person but I feel completely abused
       | in this situation"
       | 
       | On the plus side you have been given a valuable lesson.
       | 
       | For example: you may be more of a privacy person than you
       | thought, and you may want to be more careful about sharing your
       | data. You may also want to consider whether paying for a
       | service or not makes any difference in the way your data is
       | handled.
        
         | warkdarrior wrote:
         | Since tracking and selling of data, in general, is free revenue
         | on top of any service fees, it seems to me that successful
         | companies will over time trend towards service fees AND
         | tracking/selling personal data.
        
           | sorokod wrote:
            | I agree, except that such a company need not be successful,
            | nor does it need to do this over time.
           | 
           | Every day a company is not selling your data, it is "leaving
           | money on the table" - a mortal sin.
        
         | leaflets2 wrote:
         | And we others here too? :-)
        
       | mrsnowman123 wrote:
        
       | jharohit wrote:
       | We use https://www.avoma.com/ for sales. Very tight controls.
        
       | mikeryan wrote:
       | Otter has had some issues in the past around their practices.
       | It's used a lot by reporters.
       | 
       | https://www.politico.com/news/2022/02/16/my-journey-down-the...
        
       | user3939382 wrote:
       | Is anyone here a lawyer that can explain the ramifications of
       | this in all party consent states?
        
       | closewith wrote:
       | > I am not much of a privacy person but I feel completely abused
       | in this situation.
       | 
       | This is part of the reason why it's important to have strong
       | privacy protections, even if it doesn't feel like privacy issues
       | have impacted you personally. Eventually everyone has their
       | private data abused.
        
       | karanmg wrote:
       | > Are there other, more privacy oriented transcription services
       | anyone can recommend?
       | 
       | Yes. Try https://aliceapp.ai; the iOS app is in the App Store:
       | https://aliceapp.ai/app. I created this specifically to be as
       | privacy conscious as possible. There's a small but strong team
       | of engineers behind it and I'm the primary investor. It's not
       | perfect, but it works pretty well, with many relying on it
       | every day.
       | 
       | From the FAQs: https://aliceapp.ai/faqs
       | 
       | Is Alice secure?
       | 
       | * We don't ask for your name.
       | 
       | * We don't require your real email address, nor your phone
       | number.
       | 
       | * We don't use passwords to log in, to avoid easy or re-usable
       | passwords. The login process is Two-Factor Authentication (2FA)
       | by design.
       | 
       | * We don't track your location.
       | 
       | * We don't ask for access to your contacts.
       | 
       | * We don't ask you to allow push notifications.
       | 
       | * We don't store your credit card information in our database.
       | 
       | * We don't use tools like Google Analytics to track your
       | behavior.
       | 
       | * We don't drop any cookies from Twitter, Facebook, Instagram, et
       | cetera (i.e. any social network) to track browsing habits.
       | 
       | * We aren't on social networks. Our focus is on the product and
       | on communicating directly with you via email, text or phone. We
       | are not into the game of getting "likes".
       | 
       | * We don't prompt you to give us a five-star rating on the App
       | Store.
       | 
       | * We don't annoy you with newsletter signup popups.
       | 
       | * We don't serve you ads.
       | 
       | * Alice is only listening when it's obviously recording.
       | Otherwise the mic is off by default.
        
         | warkdarrior wrote:
         | $3/hour or more!??! That is insane. Anything more than $1/month
         | is a rip-off.
        
           | fxtentacle wrote:
           | Google Cloud is priced at $2.16/h if you want to request
           | privacy.
        
         | [deleted]
        
         | fxtentacle wrote:
         | Can you say anything about the technology? It appears that this
         | app DOES NOT work fully offline? That would imply that it sends
         | all audio recordings to someone's data center...
         | 
         | Who are you sending the audio to?
         | 
         | Which legal jurisdiction are they in?
        
         | [deleted]
        
         | anigbrowl wrote:
         | It's unfortunate you were downvoted. Perhaps it's because
         | you're self-promoting, but you were up front about it and your
         | pitch was relevant and focused on user needs.
        
           | karanmg wrote:
           | Thank you.
           | 
           | It is up to us to do something about making our software more
           | privacy conscious. And management/leaders/founders have to
           | put up the money to support it.
           | 
            | This project is also an experiment to see if it's possible
            | to build a "successful" app without user behavior tracking
            | and without endless A/B testing... and so far it seems
            | totally possible.
        
             | anigbrowl wrote:
              | I forgot to add that I'm not an iPhone user (so I can't
              | test whether it does this already), but if you can set it
              | up to record phone calls it would be an absolute godsend
              | for journalistic purposes.
        
               | karanmg wrote:
               | Would love to add phone call recording, with proper user
               | permissions. Unable to do so with current iOS APIs
               | though. Open to ideas here.
               | 
               | Do have desktop recording on the website to capture audio
               | if the call is on speaker. Some users use that.
               | 
               | Adding another number to the call seems inelegant.
        
       | neilv wrote:
       | I don't know whether this is bumping into a complicated legal
       | area of audio recording consent, but your company might want to
       | ask a lawyer (about any wrongs to the company, any obligations
       | the company has, etc.).
       | 
       | https://en.wikipedia.org/wiki/Telephone_call_recording_laws#...
       | 
       | If you're not the head of the company, you could raise it with a
       | higher level of management, or ask in-house counsel.
        
         | ShakataGaNai wrote:
          | IANAL, but transcriptions of calls don't GENERALLY run into
          | the multi-party-consent state laws, because the concept of a
          | non-human listening to or transcribing a call didn't exist
          | when most of those laws were created. So if the service
          | doesn't keep a copy of the recording (they just transcribe on
          | the fly), then it's probably legal.
          | 
          | That being said, most services do announce themselves in some
          | way to cover the legal grey areas. Plus it's the right thing
          | to do ethically.
        
       | Otter_ai wrote:
       | Hi there! With our new repackaging, Basic users are getting more
       | features, including Otter Assistant, which has the ability to
       | auto-join meetings and auto-share notes connected to users'
       | Google or Outlook Calendars. Users have full control of whether
       | their Otter Assistant joins meetings and shares notes with
       | calendar event guests. For maximum automation, users can turn on
       | this toggle to invite Otter Assistant to join all meetings by
       | default. At any time, you can change this default setting.
       | Users can also adjust individual auto-join and auto-share
       | settings on the calendar on the Otter home page as needed,
       | overriding any default settings. You can find more information
       | on how to use
       | Otter Assistant here - https://help.otter.ai/hc/en-
       | us/articles/4425393298327
        
         | tdeck wrote:
          | I understand people's reaction in downvoting this impersonal
         | canned response, but I think it's worth keeping it visible for
         | other readers to understand how Otter chooses to officially
         | respond to the complaint.
        
         | arcticfox wrote:
         | Holy cow, I'm impressed that you bothered to respond here and
         | yet did not respond to the part where I _very carefully_ opted
         | out of this when it was offered. My coworker was already
         | confused how this happened to her, so I knew what to look for
         | and yet this still happened.
         | 
         | When I opened Otter there was a large splash screen that said
         | "Enable Otter Assistant" in a big blue button - I looked around
         | and found "Skip" in gray in the upper right. My settings show
         | Otter Assistant is off, and I never touched them.
        
         | comboy wrote:
         | Your reaction to the post tells me more about the company than
         | the post itself.
         | 
         | And the sad reality is that while some users from this thread
         | may never touch it, it's likely that nobody at the company will
          | ever suffer any consequences, and the whole incident may be
          | EV+ because more users join after hearing about it than
          | leave.
        
         | fmntf wrote:
          | That's a nice politically correct statement. Now, would you
          | mind answering the OP?
        
         | mattwest wrote:
         | Seems like an insidious default setting.
        
           | elliekelly wrote:
            | OP said they have the setting toggled off. Regardless,
            | whoever decided to make this "feature" opt _out_ rather
            | than opt _in_ made a terrible decision.
        
         | smashah wrote:
         | Sounds like some copywriting/SEO ai wrote this ridiculous
         | response.
        
           | Ueland wrote:
           | Yeah, they are literally just spamming the same everywhere:
           | https://twitter.com/otter_ai/status/1567568010680111104
           | 
           | Talk about dumpster fire
        
       | aaaaaaaaaaab wrote:
       | >I am not much of a privacy person but I feel completely abused
       | in this situation.
       | 
       | Oh, most people aren't either. Until they get burned. Then
       | suddenly privacy becomes a major concern for them.
       | 
       | I feel there's a nugget of wisdom in here for you!
        
       | tchock23 wrote:
       | Does anyone know of a good app for voice journaling?
       | 
       | I use Otter for that purpose exclusively, and between this news
       | and their recent UI changes to make the record/stop button tiny
       | on their iOS app I'd like to seek an alternative.
       | 
       | I use DayOne for written journals and know it offers voice
       | capabilities, but the limits are pretty short. I'd like something
       | more akin to Otter.
        
       | encryptluks2 wrote:
       | Employee installs something on company computer and gives access
       | to company content. Program does something nefarious but employee
       | innocent.
        
         | 6644logan wrote:
          | Um, yes? That's how it works with _literally every piece of
          | software_ that has any permissions locally. What if Google
          | Chrome decided to do the same thing unilaterally?
        
           | chucksmash wrote:
           | Some security conscious organizations will ban all external
           | software and then approve exceptions on a case by case basis
           | as they pop up.
           | 
           | In such a place, someone has already long ago submitted a
           | ticket to have Google Chrome blessed and the employee
           | installing Chrome is following regs.
        
             | arcticfox wrote:
             | Yes, in my case the CEO recommended we use Otter to keep
             | all of the meetings on our dozens of projects straight.
             | 
             | It's not that we didn't want to use it sometimes, it's that
             | I had no desire to use it for these times and it went
             | completely rogue. That could happen with any software, even
             | an OS itself.
        
       | that_guy_iain wrote:
       | > I am not much of a privacy person but I feel completely abused
       | in this situation. I have opened a support ticket with
       | screenshots but there is no response, and according to Twitter
       | they are essentially not reviewing tickets from free users at the
       | moment.
       | 
       | Important thing here is, you're not a customer.
       | 
       | > Are there other, more privacy oriented transcription services
       | anyone can recommend?
       | 
       | Probably not, since this is probably a bug that can affect any
       | of these applications. And privacy-focused apps are a niche,
       | and a niche that isn't that much fun to develop. The nature of
       | these apps leans towards people who aren't privacy focused,
       | since you would be allowing a third party to listen to calls,
       | which is the opposite of privacy. You don't need a more privacy
       | focused app, you just need another app and probably to become a
       | customer so you can get the support you clearly want.
        
         | bogwog wrote:
         | > Important thing here is, you're not a customer.
         | 
          | That's not the important thing at all. The important thing
          | is that Otter.ai is violating the privacy of its users,
          | potentially even breaking laws in some jurisdictions.
          | 
          | If anything, it raises some more red flags that they're
          | ignoring consent for free users, since that could be
          | interpreted as an attempt to monetize them by collecting data
          | (again, without consent).
         | 
         | If you don't want non-paying users, then don't offer a free
         | tier. You don't get a free pass to do whatever you want to a
         | user just because they aren't using your paid plan.
        
           | noasaservice wrote:
           | Yeah, gotta "love" the 'we can break the law cause you didn't
           | pay' HN philosophy.
           | 
            | I would derive great enjoyment from seeing them in court
            | in a 2 party state, telling a judge "well since they
            | weren't really a customer....."
        
             | that_guy_iain wrote:
             | In a 2 party state the user would be the one breaking the
             | law. They installed the software...
             | 
              | If you install recording software, you're liable for
              | what it records. Bug or not bug, the liability is yours.
              | The only way you would get Otter.ai into a courtroom
              | would be a civil case, and then the fact that they're not
              | a paying customer, and therefore have no expectation of a
              | warranty, applies.
              | 
              | Seriously, this thread is "I installed recording software
              | and it worked on things I didn't think it would. This
              | company is terrible." while most of us work in tech and
              | know that bugs exist and have generally written some
              | bugs. Worst case, it's a bug. The solution is to stop
              | using it and move on. Or pay for support and have them
              | fix it.
        
               | ImPostingOnHN wrote:
                | > _If you install recording software, you're liable
                | for what it records_
               | 
               | not if it's recording without your consent
               | 
               | > _Seriously, this thread is "I installed recording
               | software and it worked on things I didn't think it
               | would._
               | 
               | a poor attempt to reframe the thread, which is actually
               | about a company monitoring confidential meetings without
               | consent
               | 
               | if affirmative consent is such a non-issue here, why
               | couldn't the company get it?
        
               | noasaservice wrote:
               | TBH, the pro-business blatantly illegal hot takes here on
               | HN are expected.
               | 
                | Because some company figured out a 'hack' to make more
                | money, it somehow turns from illegal to "legal". And so
                | many users here will defend that.
               | 
               | In this case, it's violating interstate wiretapping laws
               | and state-based 2 party consent laws. But.. they just
               | squint hard and go "Hey we can violate the law and make
               | MORE money".
               | 
               | Uber is exactly that. So is AirBNB. So is Lime/Bird. Just
               | go look at much of the tech companies, and it's "Offline
               | company + way to break law to get money +
               | ONLINE!!!!1!!1!"
        
               | sergefaguet wrote:
               | Change by definition requires breaking established rules
               | and is how societies grow. Normal part of the process.
        
         | 6644logan wrote:
         | > You don't need a more privacy focused app, you just need
         | another app and probably to become a customer so you can get
         | the support you clearly want.
         | 
          | A local, manually triggered transcription service would
          | solve the concern, I think.
         | 
         | > You don't need a more privacy focused app, you just need
         | another app and probably to become a customer so you can get
         | the support you clearly want.
         | 
         | I think there's a difference between free users with a "hey, my
         | app is broken" complaint and "hey, you just did something
         | incredibly wrong" complaint. It would be like if free Dropbox
         | started uploading random files that it wasn't allowed to in the
         | settings.
        
           | that_guy_iain wrote:
           | > I think there's a difference between free users with a
           | "hey, my app is broken" complaint and "hey, you just did
           | something incredibly wrong" complaint. It would be like if
           | free Dropbox started uploading random files that it wasn't
           | allowed to in the settings.
           | 
            | The thing is, you would need to review each of the
            | complaints to know which is which. If a company provides no
            | support and has no real liability because it isn't taking
            | money for the service (depending on the contract), then
            | being a non-customer means that if they do something
            | incredibly wrong, you need to get a new provider.
            | 
            | If you're not a customer, you can't expect support. If you
            | want support, pay $10 and then file the ticket. It just
            | annoys me when people complain about not getting support
            | when they've not paid for anything.
        
             | mackmgg wrote:
             | > because you're not taking money for the service there is
             | no real liability
             | 
             | Just because you're not taking money doesn't mean you can
             | break the law without liability! Even outside of GDPR/CCPA,
              | most states require all parties to consent to recording
              | and every state requires at least one party to consent to
              | recording.
             | 
             | I do agree that this is the kind of service worth paying
             | for if you want privacy and I'm not a lawyer so I can't say
             | that this is actually illegal, but if you're ignoring not
             | just "hey this is incredibly wrong" tickets but "hey this
             | is illegal and I'm warning you before I file an official
             | complaint" that seems like it will end poorly.
        
             | rolph wrote:
             | the whole purpose of taking complaints is to...review each
             | one of them to get a view on what remediation is required
             | of your service.
        
             | 6644logan wrote:
             | > If you're not a customer, you can't expect support. If
             | you want support, pay $10 and then fill the ticket.
             | 
             | I think the support ticket was only mentioned in the sense
             | that a journalist would: "we contacted the company; they
             | did not respond to the allegations."
        
         | [deleted]
        
         | sp332 wrote:
         | How are they going to upsell people to paid tiers if the free
         | product is malfunctioning?
        
           | that_guy_iain wrote:
           | Let's be serious, if the bug exists it'll exist on all
           | versions. So they'll fix it. But they'll be responding to
           | customers and not resource drains.
        
       | mschuster91 wrote:
       | > I have opened a support ticket with screenshots but there is no
       | response, and according to Twitter they are essentially not
       | reviewing tickets from free users at the moment.
       | 
       | Are you living somewhere covered by GDPR (=EU, UK, Nordic
       | states) or in California? If yes, complain to the data
       | protection agency.
        
       | znpy wrote:
       | You can probably get a lawyer and take otter.ai to court to
       | make them pay you good money.
       | 
       | That will surely make them stop, or at least seriously reconsider
       | their strategy.
        
       | tlb wrote:
       | Tangential, but I often wish I had screenshots to prove I'd opted
       | in, or out, or otherwise done the right thing. It shouldn't be
       | too expensive to record and index the screen whenever I'm doing
       | something important online and keep it for a week.
        
         | password4321 wrote:
         | Take more screenshots
         | https://news.ycombinator.com/item?id=32215277 20220724
         | 
         | > _record my screen with OBS_ (https://obsproject.com/)
         | 
         | >> _1080p in 10fps might be enough and it won 't take
         | ridiculous amount of space_ (ffmpeg de-dupe afterward:
         | https://news.ycombinator.com/item?id=32215277#32223240)
         | 
         | --
         | 
         | > _The space requirements can be very low capturing something
         | like writing code_ (ffmpeg low fps:
         | https://news.ycombinator.com/item?id=32215277#32235012)
         | 
         | --
         | 
         | > (Mac shell command & AppleScript:
         | https://news.ycombinator.com/item?id=32215277#32219314)
         | 
         | --
         | 
         | Other OSS recommendations (there are a couple good offline Mac
         | software recommendations as well):
         | 
         | https://getsharex.com - Windows
         | 
         | > _You can also configure sharex to run tesseract ocr locally
         | on the images_
         | 
         | https://github.com/soruly/TimeSnap - Windows, archived
         | 
         | https://github.com/wanderingstan/Lifeslice - Mac
         | 
         | https://flameshot.org
         | 
         | https://tropy.org - organize images
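          | 
          | A rough sketch of the low-fps approach, driving ffmpeg from
          | Python (assumes Linux/X11 with ffmpeg on PATH; the capture
          | input differs on macOS/Windows, e.g. avfoundation or
          | gdigrab):
          | 
          |   # Capture the screen at 10 fps in one-hour chunks so old
          |   # footage can be pruned after a week.
          |   import subprocess
          |   import time
          | 
          |   while True:
          |       out = time.strftime("screen-%Y%m%d-%H%M%S.mkv")
          |       subprocess.run([
          |           "ffmpeg", "-f", "x11grab", "-framerate", "10",
          |           "-video_size", "1920x1080",  # match your display
          |           "-i", ":0.0",                # display to capture
          |           "-c:v", "libx264", "-preset", "veryfast",
          |           "-crf", "30",                # small files
          |           "-t", "3600",                # one-hour chunk
          |           out,
          |       ], check=True)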
        
         | cortesoft wrote:
         | Would a screenshot really prove anything, though? It would be
         | trivial to modify
        
           | KingOfCoders wrote:
            | When I was working at a large enterprise, legal asked
            | technology whether we could create and store a screenshot
            | of every outgoing email as evidence for court.
        
             | dsr_ wrote:
             | Did you explain cryptographic signing to them?
        
               | KingOfCoders wrote:
               | How would you show the content and layout of an email
               | with cryptographic signing?
               | 
                | You could sign the screenshot with a timestamp from an
                | untamperable source (out of my league to get that
                | right, though) to show it hasn't been tampered with.
                | Did you mean that?
        
               | dsr_ wrote:
               | Every day, generate an archive of all the email you
               | received. Encrypt it, sign it, and generate a hash. Put
               | all the hashes on a public page.
               | 
               | Write the procedure up, and have a VP attest that they
               | ordered this done and have audited a random sample to
               | ensure that it was done to spec. Get the attestation
               | notarized. Repeat once a quarter or so. If the VP moves
               | on or dies, make sure the new VP is on board immediately.
               | 
               | No need to have a screenshot, it's email. When a court
               | requires you to show evidence, you bring in the VP, the
               | notarized statement, a copy of the code, and the
               | encrypted and unencrypted archive for that day.
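                | 
                | A rough sketch of the daily step (signing and hashing
                | only, encryption omitted; assumes gpg is configured and
                | the maildir path is hypothetical):
                | 
                |   # Archive one day's mail, sign it, and log a hash
                |   # that can be published.
                |   import hashlib
                |   import subprocess
                |   import tarfile
                |   import time
                | 
                |   day = time.strftime("%Y-%m-%d")
                |   archive = f"mail-{day}.tar.gz"
                | 
                |   # Bundle the day's maildir (hypothetical path).
                |   with tarfile.open(archive, "w:gz") as tar:
                |       tar.add(f"/var/mail/archive/{day}")
                | 
                |   # Detached GPG signature over the archive.
                |   subprocess.run(["gpg", "--detach-sign", archive],
                |                  check=True)
                | 
                |   # SHA-256 hash for the public page.
                |   with open(archive, "rb") as f:
                |       digest = hashlib.sha256(f.read()).hexdigest()
                |   with open("hashes.txt", "a") as log:
                |       log.write(f"{day} {digest}\n")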
        
               | KingOfCoders wrote:
                | I don't know if saving the email with all inline
                | images, downloading all external images, having the
                | mail client render it as it was at the time, packaging
                | this up, etc. is easier than taking a screenshot (like
                | Litmus does), e.g. for offers sent as an image.
        
           | tlb wrote:
           | If it comes down to an I-said/they-said about whether I
           | checked some box, then:
           | 
           | - if it's my memory against their database, I'd probably
           | lose.
           | 
           | - if it's my video capture against their database, it could
           | go either way.
           | 
           | - if it's me and several others' videos against their
           | database, we'd likely win.
           | 
           | But besides the win-lose proposition, it'd be worth something
           | to me to be confident whether I was right or not.
        
             | mynameisvlad wrote:
             | But who are you trying to convince?
             | 
              | If it's them, they can just say "ok, so what" to your
              | footage; whether it exists or comes from multiple people
              | is not entirely relevant, since as a company they can
              | choose to use it or throw it away.
             | 
             | If it's everyone else, why? What benefit does it get you to
             | have others know you're right? You're still in the same
             | position, with your choice discarded.
        
               | tlb wrote:
               | It could be a journalist writing a story about how some
               | company sells your information despite users checking
               | "[x] don't sell my information".
               | 
               | It could be a government consumer protection agency
               | investigating the company, or a court asked to fine the
               | company.
        
         | that_guy_iain wrote:
          | It would be a privacy nightmare. But some companies do it,
          | using Hotjar and other tools.
        
         | failrate wrote:
          | Multiple products exist to take multiple screenshots over a
          | time period, including details on which application was in
          | focus. This might be an approach that would work for you.
        
       | dundalk03 wrote:
       | Yeah, check out Trint.com. Their privacy and security are
       | unparalleled and they don't listen to any conversations.
        
       | csilverman wrote:
       | This is egregious--and potentially illegal--enough that I suspect
       | it's unintentional (bug, bad UX, misconfiguration, etc), but
       | given how accustomed all of us are to the obnoxious, grubby
       | tactics favored by tech companies, I'm not remotely surprised
       | that the immediate assumption is that it was an intentional,
       | malicious decision.
       | 
       | That's the problem with the selfish, user-hostile approach that a
       | lot of businesses seem to think is acceptable. Short-term, maybe
       | it works, but it breeds a deep cynicism and suspicion that no
       | company should want in its customers. When things go wrong--and
       | they will--you will not be given any benefit of the doubt, nor
       | will you deserve it.
        
       | anigbrowl wrote:
       | Sonix.ai does a good job, though I don't know how it is for
       | integration with meetings etc., it's aimed more at audio/video
       | producers.
        
       | toomuchtodo wrote:
       | File a complaint with the FTC.
       | 
       | https://reportfraud.ftc.gov/
       | 
       | If you're in a two party consent state, probably worth the effort
       | to file a complaint with your state's attorney general.
        
       | JAA1337 wrote:
       | Are you serious??!? This feels like an abuse of trust.
        
       | userhacker wrote:
       | I created revoldiv.com. It's privacy focused and login is not
       | required to transcribe. You can record your meeting and upload
       | the video or audio to transcribe it
        
       | latetomato wrote:
       | THANK YOU for posting this. My partner uses Otter, and HIS Otter
       | bot will join my work meetings that he is not even invited to.
       | He's sent me some recordings within the app in the past and we
       | share calendars, but it still doesn't explain why this would
       | happen. We've emailed Otter multiple times since July 2021
       | without any satisfactory response or investigation. You'd think
       | they would take this privacy issue more seriously. Truly absurd.
        
       | xyos wrote:
       | I just deleted my account because of that. They also share
       | conversations with everyone in the meeting by DEFAULT; it
       | wasn't nice receiving a message from a co-worker telling me
       | that he had received an email with the transcription it
       | generated.
        
         | and0 wrote:
          | Yeah, I just signed up to play with this and was surprised
          | that a random "get started" screen sort of forced me into it
          | to continue. Afterward I noticed an incredibly faint "SKIP"
          | text button in the corner, and I disabled it.
        
         | AJRF wrote:
          | Also deleted my account. If anyone from Otter.ai is here -
          | don't do this, reverse this change, email anyone affected
          | since the change and apologise. And also, I'd advise you to
          | lawyer up.
        
         | arikr wrote:
         | Wtf, that's terrible UX.
        
           | yashap wrote:
            | For sure, it's almost certainly a "growth hacking" attempt
            | (they're hoping the people they email sign up to otter.ai),
            | but a pretty terrible one. Every other similar growth
            | hacking attempt I've seen will explicitly ask for consent
            | before sending anything to non-users of the app.
           | 
           | Overall, sounds like a VERY unscrupulous company, that you
           | shouldn't trust with your personal data.
        
           | refulgentis wrote:
            | Even more terrible is using a recording service when
            | you're in a meeting with me, not telling me, then
            | complaining that the real problem is that the service told
            | me.
        
           | kordlessagain wrote:
           | That's not what the UI says, or does. I signed up and
           | actually looked, before posting.
        
         | turtleman1338 wrote:
         | If some of the participants are located in the EU this is a
         | GDPR violation.
        
           | airstrike wrote:
           | If any of the participants are in many US states including
           | California, this is also likely illegal.
           | 
           | https://www.romanolaw.com/2022/05/09/are-recorded-
           | conversati....
        
           | [deleted]
        
           | that_guy_iain wrote:
           | How so?
        
             | Etheryte wrote:
              | If the bot transcribes anything GDPR considers personal
              | data and then emails it out, you're already in breach.
              | Someone simply telling another participant their phone
              | number over the call would result in a violation.
        
               | that_guy_iain wrote:
                | It wouldn't be in breach. Otherwise, you sending an
                | email containing someone's email/phone number via Gmail
                | would put Google in breach. That would be on the
                | software user. If you use software somewhere in a way
                | that breaches GDPR, the software provider is not liable
                | for how you use it.
        
               | pbhjpbhj wrote:
                | If you made a draft email with those details and Gmail
                | then sent that draft without your authority, they too
                | would be in breach, albeit unwittingly. That seems like
                | a reasonable analogy to what happened to the OP,
                | assuming it was a bug and not some 'growth hacking'
                | plan as others have speculated.
        
               | dspillett wrote:
               | If you tell the software to do something that is a
               | breach, or used it when you could reasonably be expected
               | to know it would behave in such a way, then yes you are
               | responsible.
               | 
               | If, for custom software/configuration, you specify
               | software to do something automatically that is a breach,
               | then yes also.
               | 
                | If the software does something that is a breach
                | without your instruction, _especially_ if you
                | explicitly opted out, then the service responsible for
                | the software may be liable instead.
               | 
               | How _enforceable_ this is, is a different discussion...
        
               | handoflixue wrote:
               | If the software is transcribing and sending emails
               | without the user's awareness, much less permission...
               | then the responsibility must rest with the software.
               | Software isn't responsible for my deliberate decisions,
               | but it is responsible for anything it does without
               | consulting me.
        
               | that_guy_iain wrote:
                | That isn't going to fly, is it? "I didn't know this
                | software did this when configured in this way" isn't
                | really something that moves the liability.
        
               | sulam wrote:
               | He specifically _didn't_ configure it in that way.
        
               | that_guy_iain wrote:
                | How many times has a user said they didn't configure
                | something that way, and when you checked, they did and
                | were thinking about a different setting for something
                | else? Realistically, it's actually the most likely
                | thing that happened.
               | 
               | And he did have it configured to send out emails. So that
               | part is true.
        
               | pessimizer wrote:
               | If they're lying or mistaken, let us discuss the lie or
               | mistake, and even the possibility that there could have
                | been a lie or mistake (based on _something_). Just
                | dismissing it based on an obvious counterfactual that
                | you're declaring based on no evidence other than that
                | you think users lie and developers don't make mistakes
                | is a waste of time.
               | 
               | If you have some kind of insight into the specific
               | configuration of otter.ai, or evidence that this specific
               | person has made a mistake (and that all of the other
               | people that have also seen this behavior are also
               | mistaken) that would be constructive. As it is, you're
               | not even carrying water for a company, you're offering
               | water to a company that didn't ask for it by denigrating
               | the people around you.
        
         | darkteflon wrote:
         | Same, deleted my account over this. One of the most comically
         | egregious UX fuckups I've ever seen.
         | 
         | Maya Angelou is apposite: "When people show you who they are,
         | believe them the first time."
        
       | hiidrew wrote:
       | This is unfortunate. I was a fan of the service in grad school,
       | used it a lot for qualitative interviews. I hope you can find a
       | better service, the comment on Alice looks promising.
        
       | colechristensen wrote:
       | There are plenty of places where this is illegal wiretapping;
       | whether it's you or otterai doing it is debatable.
       | 
       | https://recordinglaw.com/party-two-party-consent-states/
        
         | mackmgg wrote:
         | In an all-party consent state you would need everyone in the
         | call to consent for recording to be legal, but even in one-
         | party consent states you would still at least need OP's consent
         | to be recording! If the user says no and they record anyway, I
         | can't think of any state where that's legal (though the EULA
         | probably has something covering that, unsure if that holds up
         | in court).
        
       | jorams wrote:
       | > The bot proceeded to join two confidential meetings on my
       | behalf and record the whole thing, then email every member an
       | absurd, inaccurate "outline" after.
       | 
       | Can you explain how it was able to do this, apparently without
       | people noticing? From what I understand it joins the meeting as a
       | participant, which would be pretty obvious?
        
         | squeaky-clean wrote:
         | I'm curious about this too. Possibly some large internal
         | meeting, like a company quarterly update on performance and
         | plans going forward? If there's 200 attendees someone might not
         | notice a random one. Though Google still requires approval
         | whenever someone outside your domain joins a meeting, so I feel
         | like that would get caught easily too.
        
         | arcticfox wrote:
          | To clarify, it joined as a very obvious user. It was not
          | without people noticing; they didn't really seem to care,
          | but I did. I was not the meeting organizer so I could not
          | kick it, and I didn't want to derail the discussion by
          | explaining how I couldn't control my own computer (great
          | look in front of clients), so we just proceeded with it
          | recording.
         | 
         | The automatically emailed outline afterwards was the cherry on
         | top, though, as it was filled with AI generated questions that
         | I absolutely did not need answered. I got one response from
         | someone that patiently explained some of the answers to the
         | (idiotic) questions - how embarrassing.
        
       | dsiroker wrote:
       | We are building a privacy-first transcription & recording
       | solution. Sign up at https://www.rewind.ai for early access or if
       | you are super eager email me at dan@ at the domain above.
        
         | nerdponx wrote:
         | Unless it's entirely self-hostable and/or runs entirely locally
         | on the user's machine, it's a hard pass from me.
         | 
         | Also, this marketing copy makes your product sound a lot more
         | like some kind of AI hype scam than an actual product:
         | 
          | > What if we could use technology to augment our memory the
          | same way a hearing aid can augment our hearing? This question
          | is why we founded Rewind.
          | 
          | > Our vision is to give humans perfect memory.
          | 
          | > We are building a search engine for your life.
        
           | dsiroker wrote:
           | > Unless it's entirely self-hostable and/or runs entirely
           | locally on the user's machine, it's a hard pass from me.
           | 
           | It runs locally on your machine.
           | 
           | > Also, this marketing copy makes your product sound a lot
           | more like some kind of AI hype scam than an actual product
           | 
           | Yea, I can see your perspective. We're still in stealth mode
           | so will be more forthcoming soon. For more context, here is
           | the full founding story:
           | 
           | I started to go deaf in my 20s. When I turned 30, a hearing
           | aid changed my life. To lose a sense and gain it back again
           | feels like gaining a superpower. Ever since that moment, I've
           | been on a hunt for ways technology can augment human
           | capabilities and give us superpowers.
           | 
           | That hunt ultimately led me to memory. Studies show 90% of
           | memories are forgotten after a week. Just like going deaf,
           | our memory gets worse as we get older. But does it have to?
           | If we have hearing aids for hearing and glasses for vision,
           | what's the equivalent for memory?
           | 
           | What if we could use technology to augment our memory the
           | same way a hearing aid can augment our hearing? This question
           | is why we founded Rewind. Our vision is to give humans
           | perfect memory. We are building a search engine for your
           | life.
        
           | satyrnein wrote:
           | That was a pretty good Black Mirror episode!
        
             | dsiroker wrote:
             | Fantastic episode! "The Entire History of You"
             | 
             | The biggest difference is that I believe we would all be
             | more honest with one another if we had perfect memory.
             | 
             | It would prevent more marital strife than it would create.
        
               | satyrnein wrote:
               | I hope you're right, because that world is probably
               | coming, but to play devil's advocate:
               | 
               |  _The biggest difference is that I believe we would all
               | be more honest with one another if we had perfect
               | memory._
               | 
               | This reminds me of Mark Zuckerberg believing the world
               | would be a better place if we couldn't have multiple
               | identities, because multiple identities showed a lack of
               | integrity.
               | 
               |  _It would prevent more marital strife than it would
               | create._
               | 
               | It depends on which way the causality goes in "forgive
               | and forget"!
        
       | T3RMINATED wrote:
        
       | puma_ambit wrote:
       | I stopped using them when I saw they share with everyone by
       | default. I wanted the notes only for myself so I could be more
       | present in meetings but not at the expense of making everyone
       | feel like they have to watch what they're saying because it's
       | being transcribed.
        
         | formerly_proven wrote:
         | That approach sounds like a potential problem for more-than-one
         | party consent jurisdictions.
        
           | [deleted]
        
           | jeroenhd wrote:
           | Even in one party consent jurisdictions you're not just
           | allowed to send a copy off to whoever you like. Often it's
           | permitted for personal review or for gathering evidence, but
           | it's not necessarily allowed to sell the conversation (or
           | send it to a third party in exchange for something else, like
           | this transcription).
        
         | jeroenhd wrote:
         | That's really only just hiding the fact that they should be
         | watching what they're saying, though. If someone is sending a
         | copy of our private conversation to a random third party, I'd
         | certainly like to know about it.
         | 
         | If anything, this "warning" only makes me trust this service
         | more because I don't have to wonder if this service is
         | listening in.
        
       ___________________________________________________________________
       (page generated 2022-09-07 23:01 UTC)