[HN Gopher] FBI's ability to legally access secure messaging app...
___________________________________________________________________
FBI's ability to legally access secure messaging app content and
metadata [pdf]
Author : sega_sai
Score : 215 points
Date : 2021-11-30 19:53 UTC (3 hours ago)
(HTM) web link (propertyofthepeople.org)
(TXT) w3m dump (propertyofthepeople.org)
| stunt wrote:
| Well, who cares when all they need is to use something like
| Pegasus to obtain full access to your phone simply by sending you
| a WhatsApp message (without having you even open the message).
|
 | Knowing how well guarded iOS is against app developers, I wonder
 | what kind of zero-day would suddenly turn a message received in
 | WhatsApp into full system access. I think NSO found a WhatsApp
 | backdoor, not a zero-day bug.
| JBiserkov wrote:
 | NSO can't send you a WhatsApp message if you don't have
 | WhatsApp on your iPhone.
| catlikesshrimp wrote:
 | WhatsApp is owned by Facebook, not by Apple. I don't think
 | Apple wants to share a backdoor with Facebook.
 |
 | I don't know any details of the WhatsApp vulnerability that NSO
 | exploited.
| xanaxagoras wrote:
| They left off one very popular messenger, SMS:
|
| * Message content: All
|
| * Subpoena: can render all message content for the last 1-7 years
|
 | * 18 U.S.C. § 2703(d): can render all message content for the last
| 1-7 years
|
| * Search warrant: can render all message content for the last 1-7
| years
|
| * Vague suspicion plus a small fee to the carrier: can render all
| message content for the last 1-7 years
| heavyset_go wrote:
| There's also:
|
| * Law enforcement simply asks nicely: can render all message
| content for the last 1-7 years
| Consultant32452 wrote:
| * Law enforcement wants to stalk ex-girlfriend: can render
| all message content for the last 1-7 years
| gnopgnip wrote:
| The Stored Communications Act makes disclosing the contents
| of messages without a search warrant unlawful
| kevin_thibedeau wrote:
 | EO 12333 makes it lawful without a warrant.
| jfrunyon wrote:
| The reality is that many times the only barrier to
| sensitive information is a shared login which many people
| know and a statement that users represent that they have
| legal authority to access that info.
| AnthonyMouse wrote:
| The people responsible for investigating and prosecuting
| such crimes have some not so great incentives to avoid
| doing so and keep the whole thing secret though, don't
| they?
|
| And then when they get caught, they do this:
|
| https://cdt.org/insights/the-truth-about-telecom-immunity/
| 2OEH8eoCRo0 wrote:
| Sounds like an easy way to have your case tossed out in
| court.
|
| It's funny how much this differs from my own personal
| experience with law enforcement. The friends I know are
| timid as hell and don't do anything without a warrant
 | just to stay on the safe side - even if they probably
| don't need one.
| marricks wrote:
| I'm just glad you're here to stick up for your friends
| without any corroboration or linking story. It's just a
| good thing to do.
| op00to wrote:
| The really smart cops get the tips using "less than
| legal" means, then walk back and reconstruct using legal
| evidence.
| a4isms wrote:
| "Parallel Construction:"
|
| https://en.wikipedia.org/wiki/Parallel_construction
| efitz wrote:
| This discussion is not very interesting from a security
| perspective. I tuned out at "cloud".
|
| If it's not in your physical possession, it's not your computer.
| If it's not your computer, then whoever administers the computer,
| or whoever [points a gun at/gives enough money to] the
| administrator of that system can access whatever you put on that
| system.
|
| If a "cloud" or "service" is involved, then you can trivially use
| them to move or store data that you encrypted locally on your
| computer with your key that was generated and stored locally and
| never left your system. But subject to the limits above, the
| administrators of the other computers will still be able to see
| metadata like where the data came from and is going to. And they
| might be able to see your data too if you ever (even once, ask
 | Ross Ulbricht) failed to follow the basic encryption guidelines
| above.
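 |
 | A minimal sketch of that "encrypt locally, then let the cloud move
 | only ciphertext" idea (Python, using the cryptography package; the
 | file names are just illustrative):
 |
 |       import os
 |       from cryptography.hazmat.primitives.ciphers.aead import AESGCM
 |
 |       # Key is generated locally and never leaves this machine.
 |       key = AESGCM.generate_key(bit_length=256)
 |       nonce = os.urandom(12)
 |
 |       plaintext = open("notes.txt", "rb").read()
 |       ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
 |
 |       # Only nonce + ciphertext ever reach the provider; they still
 |       # see the metadata (size, timing, source, destination).
 |       open("notes.txt.enc", "wb").write(nonce + ciphertext)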
|
| You can make metadata access harder via VPNs and Tor, but you
 | CANNOT make it impossible - in the worst case, maybe your
| adversary is controlling all the Tor nodes and has compromised
| the software.
|
| Which leads me to my last point, if you did not write (or at
| least read) the code that you're using to do all of the above,
| then you're at the mercy of whoever wrote it.
|
| And, if you try to follow perfect operational security, you will
| have a stressful and unpleasant life, as it's really really hard.
| judge2020 wrote:
 | If I'm reading this page correctly, AMD is working on something
| that would allow you to run trusted code that not even someone
| with physical access to the hardware could read (without
| breaking this system).
|
| https://www.amd.com/en/processors/epyc-confidential-computin...
|
| And this tech is already implemented by GCP:
|
| https://cloud.google.com/confidential-computing
|
| > With the confidential execution environments provided by
| Confidential VM and AMD SEV, Google Cloud keeps customers'
| sensitive code and other data encrypted in memory during
| processing. Google does not have access to the encryption keys.
| In addition, Confidential VM can help alleviate concerns about
| risk related to either dependency on Google infrastructure or
| Google insiders' access to customer data in the clear.
| efitz wrote:
| Then you only have to trust that AMD did not accidentally or
| intentionally introduce a bug in the system. Remember
| Spectre? Remember all the security bugs in the Intel
| management code?
|
 | You also have to trust that AMD generated and has always
| managed the encryption keys for that system properly and in
| accordance with their documentation.
|
| And are you even sure that you're actually running on an AMD
| system? If the system is in the cloud, then it's hard to be
| sure what is executing your code.
|
| And are you sure that your code didn't accidentally break the
| security guarantees of the underlying system?
|
| I have worked on all these problems in my day job, working on
| HSMs. At the end of the day there are still some leaps of
| faith.
| smoldesu wrote:
| _puts on tinfoil hat_
|
| You'd also need to consider AMD's management engine, the
| Platform Security Processor. If we're really slinging
| conspiracy theories, AMD processors are likely just as
 | backdoored as Intel ones. I don't mean to be grim, but I
| think it's safe to assume that the US government has direct
| memory access to the vast majority of computer processors
| you can buy these days.
|
| [/conspiracy]
| 123pie123 wrote:
 | if you're going to that level, then have a look at Five
 | Eyes (and its derivatives)
| https://en.wikipedia.org/wiki/Five_Eyes / Echelon
| smoldesu wrote:
| I probably shouldn't have removed my tinfoil lining yet
| but yes, you're correct. Any information the US
| government has access to through these channels is also
| probably accessible by our surveillance/intelligence
| allies. It raises a lot of questions about how deep the
| rabbit hole goes, but I won't elucidate them here since
| I've been threatened with bans for doing so. I guess it's
| a do-your-own research situation, but always carry a
| healthy degree of skepticism when you read about anything
| government-adjacent.
| michaelmior wrote:
| > if you did not write (or at least read) the code that you're
| using to do all of the above, then you're at the mercy of
| whoever wrote it.
|
| It's worse than that. Even if you read the code, you have to
| trust that the code you read is the code a service is actually
| using. Even if you deploy the code yourself, you have to trust
| that the infrastructure you're running on does not have some
| type of backdoor. Even if you run your own infrastructure,
| hardware can still have backdoors. Of course, the likelihood of
| any of these things actually becoming a problem decreases
| significantly as you read through the paragraph.
| dointheatl wrote:
| > Even if you read the code, you have to trust that the code
| you read is the code a service is actually using.
|
| Don't forget to verify the code for the compiler to ensure
| that hasn't been compromised in order to inject an exploit
| into the binary at compile time.
| inetknght wrote:
| > _the likelihood of any of these things actually becoming a
| problem decreases significantly as you read through the
| paragraph._
|
| And yet, "likelihood" doesn't necessarily mean "hasn't been
| done".
|
| Just look at:
|
| * [0]: Intel ME
|
| * [1]: Solarwinds attack and CI systems
|
| * [2]: Ubiquiti attack and complete infrastructure compromise
|
| * [3]: And the famous Ken Thompson statement
|
| [0a]: https://news.ycombinator.com/item?id=15298833
|
| [0b]: https://www.blackhat.com/eu-17/briefings/schedule/#how-
| to-ha...
|
| [1]: https://www.cisecurity.org/solarwinds/
|
| [2]: https://krebsonsecurity.com/2021/04/ubiquiti-all-but-
| confirm...
|
| [3]: https://users.ece.cmu.edu/~ganger/712.fall02/papers/p761
| -tho...
| yownie wrote:
| link seems to be broken
|
| https://propertyofthepeople.org/document-detail/?doc-id=2111...
| mmh0000 wrote:
 | Not only is their main link broken, but their silly PDF reader
 | is broken for me.
|
| Here's a direct link to the PDF:
| https://assets.documentcloud.org/documents/21114562/jan-2021...
| [deleted]
| finite_jest wrote:
| Dupe of https://news.ycombinator.com/item?id=29394945 and
| https://news.ycombinator.com/item?id=29394945.
| t0mas88 wrote:
| So if you have something to hide, don't use iCloud backup.
|
 | And WhatsApp will give them the target's full contact book (was
 | to be expected), but _also_ everyone that has the target in
 | their contact list. That last one is quite far reaching.
| fumar wrote:
 | Has Apple made any public statements regarding iCloud's lack of
 | privacy features? It takes the wind out of their privacy
 | marketing, which is effectively hurting ad tech but not truly
 | protecting consumers from state-level actors with data access.
| nicce wrote:
 | E2EE for backups was in the iOS 15 beta but it was removed
 | (it did not land in the release) after they changed the
 | timetable of the CSAM scanning feature. So we will see if we
 | get E2EE backups once that image scanning lands.
| amatecha wrote:
| Kind of. These details are indeed publicly written on their
| website[0]. Do many users ever read this page? Probably not.
|
| [0] https://support.apple.com/en-us/HT202303
| fumar wrote:
 | Here is an excerpt. The language sounds like encryption is
 | enabled, and the chart lists iCloud features as protected on
 | the server and in transit. Seems like smoke and mirrors then.
|
| > On each of your devices, the data that you store in
| iCloud and that's associated with your Apple ID is
| protected with a key derived from information unique to
| that device, combined with your device passcode which only
| you know. No one else, not even Apple, can access end-to-
| end encrypted information.
| lupire wrote:
| If you have something to hide, don't give a copy to _any_
 | third party.
 |
 | Even a second party is a risk.
| sschueller wrote:
 | Can you turn that off if you have iCloud, or do you need to not
 | use iCloud altogether?
| [deleted]
| KennyBlanken wrote:
| Yes, and you can delete old backups on iCloud - and then
| switch to local, automatic, fully encrypted backups to a Mac
| or PC running iTunes.
|
| HN tends to get very frothy-at-the-mouth over Apple and
| privacy but the reality is that iPhones can be easily set up
 | to offer security and privacy that is best in class. They play
 | well with self-hosted sync services like Nextcloud... and
 | unlike the Android-based "privacy" distros you're not running
 | an OS made by a bunch of random nameless people, and you can
 | use banking apps, etc.
|
| The only feature I miss is being able to control background
| data usage like Android does.
| ceejayoz wrote:
| You can turn it off individually just for Messages, but
| you're still left not knowing the state of the setting on the
| other end.
| georgyo wrote:
| You and the person you are communicating with must both not use
 | iCloud backup. And since Apple pushes the backup features
 | pretty heavily, you can be reasonably sure that the person you
 | are communicating with is using backups. I.e., you cannot use
 | iMessage.
| vmception wrote:
 | iCloud backup can back up your whole phone, specifically the
| files section. iOS and OSX users can save anything to that.
| xanaxagoras wrote:
| I got off all Apple products when they showed me their
| privacy stance is little more than marketing during the CSAM
| fiasco, but IIRC the trouble with iCloud backup is it stores
| the private key used to encrypt your iMessages backup. Not
| ideal to be sure, but wouldn't iMessage users be well
| protected against dragnet surveillance, or do we know that
| they're decrypting these messages en masse and sharing them
| with state authorities?
| kf6nux wrote:
| > if you have something to hide
|
 | Most people don't realize that most people have something to
 | hide. The USA has so many laws on its books, many of which are
 | outright bizarre[0] and some of which normal people might
 | break in the normal course of life[1].
 |
 | And that's only counting _current/past_ laws. It wasn't that
 | long ago that a US President was suggesting all Muslims should
 | be forced to carry special IDs[2]. If you have a documented
 | history of being a Muslim, it could be harder to fight a non-
 | compliance charge.
|
| [0] https://www.quora.com/Why-is-there-a-law-where-you-can-t-
| put...
|
| [1] https://unusualkentucky.blogspot.com/2008/05/weird-
| kentucky-...
|
| [2] https://www.snopes.com/fact-check/donald-trump-muslims-id/
| hunterb123 wrote:
| > "Certain things will be done that we never thought would
| happen in this country in terms of information and learning
| about the enemy," he added. "We're going to have to do things
| that were frankly unthinkable a year ago."
|
| > "We're going to have to look at a lot of things very
| closely," Trump continued. "We're going to have to look at
| the mosques. We're going to have to look very, very
| carefully."
|
 | That's all he said to the interviewer. The interviewer was
 | asking the hypothetical and suggested the special
 | identification! He wouldn't take the bait, so since he didn't
 | answer the hypothetical they said "he wouldn't deny it" and
 | wrote the campaign of hit-piece articles anyway. Whatever
 | response they got, they would have written that same piece. If
 | he had answered one way, they would have quoted him out of
 | context. Since he responded generically it's obviously
 | drummed up. The fact check is hilarious. "Mixed", lol.
|
| Never answer a hypothetical, it's always a trap.
| president wrote:
 | Did you even read the Snopes article you referenced before
 | making what seems like a definitive claim about how Trump was
 | suggesting Muslims carry special IDs? Because Snopes's own
 | rating is a "Mixture" of truth and falsehood, and if you read
 | the assessment, it is grasping at straws to even make that
 | conclusion.
| xster wrote:
| This seems like a good place to say that I strongly recommend
| Yasha Levine's Surveillance Valley book
| (https://www.goodreads.com/book/show/34220713-surveillance-va...)
| where he suggests that all of this is working as intended, going
| all the way back to the military counter-insurgency roots of the
| arpanet first in places like Vietnam, and then back home in anti-
 | war and leftist movements. A contemporary theme that is
 | relevant is the fact that current privacy movements like Tor,
 | Signal, OTF, BBG are fundamentally military-funded and survive on
| government contracts. It distracts from the needed political
| discourse into a technology one where "encryption is the great
| equalizer" and everyone can resist big brother in their own way
| on the platforms the government has built. Encryption does exist,
| but it also distracts from other vectors like vulnerabilities
| (that led to Ulbricht getting caught), what services you would
| e2e connect to, how you get the clients to connect to those
| services, what store can push binaries for said clients etc.
| hutzlibu wrote:
| "are the fact that current privacy movements like Tor, Signal,
| OTF, BBG are fundamentally military funded and survive on
| government contracts."
|
 | Are those "facts" available for investigation, without having
 | to buy the book?
 |
 | (That Tor is partly US administration funded is known, but
 | Signal? And what are OTF and BBG?)
| baby wrote:
 | I'm wondering how this was obtained, and how old it is.
|
| For WhatsApp:
|
| > if target is using an iPhone and iCloud backups enabled, iCloud
| returns may contain WhatsApp data, to include message content
|
| Probably not true since WhatsApp launched encrypted backups.
| vorpalhex wrote:
| Reading the document answers this for you: It is a declassified
| government document originally produced by the FBI and was
| prepared on Jan 2nd, 2021.
| the_optimist wrote:
| Isn't this simply imaginary, where in practice all the FBI has to
| do to up the ante is to request military-grade interception from
| a willing foreign counterpart?
| anonporridge wrote:
 | The point of promoting and using privacy-respecting software is
 | not necessarily to make it _impossible_ for law enforcement to
 | get what they want. It's to make it somewhat expensive and
 | require targeted probes.
 |
 | You simply want it to be cost prohibitive to engage in mass
 | surveillance on everyone, because that is an immensely powerful
 | tool of totalitarian oppression that gets really bad if we
 | happen to elect the wrong person once.
| the_optimist wrote:
 | I agree with you on a personal level, and naturally flag
 | that this economic argument is extremely poor policy.
| It's quite unclear that the marginal cost is non-zero, or
| even flat by person. One might reasonably conclude we are
| already each inside a high-resolution springing trap, waiting
| for the moment we find ourselves athwart the powers that be.
| Imagine in physical space where the local police could simply
| call in foreign air strikes upon domestic citizens, with only
| economics to prevent otherwise. We must have transparent and
| firm laws, reformed at a fundamental level.
| wizzwizz4 wrote:
 | I don't care if they spy on _me_; they probably have a good
 | reason to! But I do care if they spy on _everyone_, so I
| make it hard to spy on me.
| upofadown wrote:
| Can't the FBI do a Pegasus style remote access thing on an
| appropriate warrant themselves?
| tata71 wrote:
 | Tier of which requires... expense.
| goatcode wrote:
| It's kind of funny that WeChat seems pretty locked-down to the
| FBI, especially for Chinese citizens. Makes sense, really, but
| still funny.
| headphoneswater wrote:
 | If this is what it takes to keep us safe, I and most Americans
 | are OK with it. We live in dangerous times.
 |
 | The US has a balanced criminal justice system -- as long as due
 | process is preserved, privacy from the state should not be a
 | major issue.
| benlivengood wrote:
| Current U.S. "due process" includes national security letters
| and other secret legal requests and secret courts to approve
| those requests. So there are still some checks and balances but
| it's less clear that they are working well enough or as
| intended.
|
| Just look at the transparency reports of major Internet
| companies; they can report numbers of (certain types of)
| requests and that's about it. Mass surveillance under seal is
| not a great trend.
|
| When political parties start advocating for jailing political
 | opponents and treating the Supreme Court as a political office
| for nominations, I find it harder to trust the current due
| process.
| wizzwizz4 wrote:
 | > _if it's not we have bigger problems anyway_
|
| Really, we shouldn't do anything; we have the bigger problem of
| the eventual heat death of the universe.
|
| Take into account not only the size of the problem, but how
| easy it is to do something about it.
| numlock86 wrote:
| How does them reading my messages keep me safe, though?
| headphoneswater wrote:
| In that example it's keeping me safe from you
| georgyo wrote:
 | It says Telegram has no message content. Isn't Telegram not
 | E2EE by default, instead requiring explicit steps to make a
 | conversation encrypted?
 |
 | Either way, it looks like Signal wins by a lot. The size of its
 | spot is so small, it seems almost squeezed in. But only because
 | they have nothing to share.
| vadfa wrote:
| That is correct. By default all messages sent over Telegram are
| stored permanently in their servers unencrypted.
| skrowl wrote:
| Source for Telegram storing the information unencrypted at
| rest?
| pgalvin wrote:
| It is widely known and confirmed by Telegram themselves
| that your messages are encrypted at rest by keys they
| possess.
|
| This is a similar process to what Dropbox, iCloud, Google
| Drive, and Facebook Messenger do. Your files with cloud
| services aren't stored unencrypted on a hard drive -
| they're encrypted, with the keys kept somewhere else by the
| cloud provider. This way somebody can't walk out with a
| rack and access user data.
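 |
 | A rough sketch of that "keys kept somewhere else" pattern, often
 | called envelope encryption (Python, cryptography package; purely
 | illustrative, not any particular provider's actual scheme):
 |
 |       import os
 |       from cryptography.hazmat.primitives.ciphers.aead import AESGCM
 |
 |       kms_key = AESGCM.generate_key(bit_length=256)   # held in a KMS/HSM
 |       data_key = AESGCM.generate_key(bit_length=256)  # per-object key
 |
 |       # The stored object is encrypted under the data key...
 |       n1 = os.urandom(12)
 |       blob = n1 + AESGCM(data_key).encrypt(n1, b"user message", None)
 |
 |       # ...and the data key is stored wrapped under the KMS key, so a
 |       # stolen disk alone is useless, but the provider can still
 |       # decrypt by asking its own key service to unwrap the data key.
 |       n2 = os.urandom(12)
 |       wrapped_key = n2 + AESGCM(kms_key).encrypt(n2, data_key, None)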
| Borgz wrote:
| Not exactly. Non-secret chats are stored encrypted on
| Telegram's servers, and separately from keys. The goal seems
| to be to require multiple jurisdictions to issue a court
| order before data can be decrypted.
|
| https://telegram.org/privacy#3-3-1-cloud-chats
| https://telegram.org/faq#q-do-you-process-data-requests
| anon11302100 wrote:
| "Not exactly" means "completely incorrect" now?
|
 | Telegram doesn't store your messages forever, and they are
 | encrypted, and seizing the servers won't allow you to
 | decrypt them unless you also seize the correct servers from
 | another country.
| skrowl wrote:
| Telegram is encrypted OVER THE WIRE and AT REST by default with
| strong encryption no matter what you do. It's E2EE if you
| select private chat with someone.
|
 | Lots of FUD out there about Telegram not being encrypted;
 | that's just not true. There's nothing either side can do to
 | send a message in clear text / unencrypted.
| 542458 wrote:
 | For somebody who isn't super cryptography-savvy, what's the
| difference between over the wire and e2ee? Does the former
| mean that telegram itself can read non-private-chat messages
| if it so chooses?
| andreyf wrote:
 | Yes. Worth remembering also that even with E2EE, an ad-tech-
 | driven company could have endpoints determine marketing
 | segments based on the content of conversations and report
 | those to the company to better target ad spend.
| skinkestek wrote:
 | Also, as is the case with WhatsApp, they siphon off your
 | metadata and even have the gall to make an agreement with
 | Google to upload message content _unencrypted_ to
 | Google when one enables backups.
| skinkestek wrote:
 | > For somebody who isn't super cryptography-savvy, what's
 | the difference between over the wire and e2ee?
 |
 | E2EE: As long as it is correctly set up and no significant
 | breakthroughs happen in math, nobody except the sender and
 | the receiver can read the messages.
 |
 | > Does the former mean that Telegram itself can read non-
 | private-chat messages if it so chooses?
 |
 | Correct. They say they store messages encrypted and store
 | keys and messages in different jurisdictions, effectively
 | preventing themselves from abusing it or being coerced into
 | giving it away, but this cannot be proven.
 |
 | If your life depends on it, use Signal; otherwise use the
 | one you prefer and can get your friends to use (preferably
 | not WhatsApp though, as it leaks all your connections to
 | Facebook and uploads your data _unencrypted_ to Google for
 | indexing(!) if you enable backups).
|
| Edited to remove ridiculously wrong statement, thanks kind
| SquishyPanda23 who pointed it out.
| SquishyPanda23 wrote:
| > nobody except the sender, the receiver and the service
| provider can read the messages
|
| E2EE means the service provider cannot read the messages.
|
| Only the sender and receiver can.
| skinkestek wrote:
| Thanks! I edited a whole lot and that came out
| ridiculously wrong! :-)
| Gigachad wrote:
| Pretty much. End to end uses the encryption keys of both
| _users_ to send. Over the wire has both sides use the
 | platform's keys so the platform decrypts, stores in plain
| text, and sends it encrypted again to the other side. Over
| the wire is basically just HTTPS.
| Daegalus wrote:
 | Over the wire is when it's encrypted during transmission
 | between the user and Telegram's servers: HTTPS or SSL/TLS,
 | etc. At rest is when it's encrypted in their DBs or on hard
 | drives, etc. Theoretically, Telegram can still read the
 | contents if they wished to, if they set up the appropriate
 | code or tools in between these steps.
 |
 | E2EE means that the users exchange encryption keys and
 | encrypt the data at the client, so that only the other
 | client can decrypt it. Meaning Telegram can never inspect
 | the data even if they wanted to.
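 |
 | A rough sketch of the E2EE half of that (Python, cryptography
 | package; purely illustrative, not Telegram's actual MTProto
 | secret-chat scheme):
 |
 |       import os
 |       from cryptography.hazmat.primitives import hashes
 |       from cryptography.hazmat.primitives.asymmetric.x25519 import (
 |           X25519PrivateKey,
 |       )
 |       from cryptography.hazmat.primitives.kdf.hkdf import HKDF
 |       from cryptography.hazmat.primitives.ciphers.aead import AESGCM
 |
 |       # Each client keeps its own private key; only public keys ever
 |       # pass through the server.
 |       alice = X25519PrivateKey.generate()
 |       bob = X25519PrivateKey.generate()
 |
 |       # Both sides derive the same secret from their own private key
 |       # and the other side's public key (Diffie-Hellman).
 |       secret = alice.exchange(bob.public_key())
 |       key = HKDF(algorithm=hashes.SHA256(), length=32,
 |                  salt=None, info=b"chat").derive(secret)
 |
 |       nonce = os.urandom(12)
 |       ct = AESGCM(key).encrypt(nonce, b"hi bob", None)
 |       # The server only relays nonce + ct; it never holds `key`.
 |
 | With transport-only encryption, the same message would arrive at
 | the server as plaintext inside the TLS tunnel.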
| loeg wrote:
| Yeah, if you connect to https://facebook.com and use
| messenger, it's encrypted over the wire because you're
| using HTTPS (TLS). But it's not E2EE.
| blueprint wrote:
 | (How) does the Telegram server prevent unencrypted content?
 |
 | Also curious - how does Telegram support encryption for
 | chatrooms without the parties being known in advance? Or are
 | those chats not encrypted?
| to11mtm wrote:
 | I don't know whether Telegram is E2EE by default (probably
 | not). When you do a call on Telegram you are given a series of
| emoji and they are supposed to match what the person on the
| other side has, and that's supposed to indicate E2EE for that
| call.
| tptacek wrote:
| It is not, by default, and none of the group chats are.
| RL_Quine wrote:
| Verification in band seems pretty meaningless, approaching
| security theatre.
| nitrogen wrote:
| For voice? It's hard to fake the voice of someone you know.
| Muromec wrote:
| you don't have to fake the voice, just mitm and record
| cleartext
| ajsnigrutin wrote:
| But they have to fake the voice, if I call the other
| person and say "my emoji sequence is this, this and that"
| for the other person to verify and vice-versa.
| wizzwizz4 wrote:
| Person A calls you. I intercept the call, so person A is
 | calling _me_, and then I call you (spoofing so I look
| like Person A). When you pick up, I pick up, then I
| transmit what you're saying to Person A (and vice versa).
|
| How do you know I'm intercepting the transmission? Does
 | the emoji sequence verify the _call_, perhaps?
| summm wrote:
| Both connections would show different emojis on both
| sides then. So you would need to somehow deep fake the
| voice of the one telling their emojis to the other one.
| tshaddox wrote:
| The emoji sequence is a hash of the secret key values
| generated as part of a modified/extended version of the
| Diffie-Hellman key exchange. The emoji sequence is
| generated and displayed independently on both devices
| _before_ the final necessary key exchange message is
| transmitted over the wire, so a man-in-the-middle has no
| way of modifying messages in flight to ensure that both
| parties end up generating the same emoji sequence.
|
| I'm not a cryptographer, but that's what I glean from
| their explanation: https://core.telegram.org/api/end-to-
| end/video-calls#key-ver...
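 |
 | A toy illustration of why that check works (Python, stdlib only;
 | the real protocol hashes more material than just the raw secret):
 |
 |       import hashlib
 |
 |       EMOJI = ["🐱", "🚀", "🍀", "🎩", "🌊", "🔑", "🎲", "⏰"]
 |
 |       def emoji_fingerprint(shared_secret: bytes, n: int = 4):
 |           digest = hashlib.sha256(shared_secret).digest()
 |           return [EMOJI[b % len(EMOJI)] for b in digest[:n]]
 |
 |       # Each phone runs this on the key it derived locally. A man in
 |       # the middle ends up with two different secrets (one per leg of
 |       # the call), so the two screens show different emoji.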
| nimbius wrote:
 | For Signal users this means the messages of course _do_ exist
 | on your phone, which will be the first thing these agencies
 | seek to abscond with once you're detained, as it's infinitely
 | more crackable in their hands.
 |
 | As a casual reminder: the Fifth Amendment protects your speech,
 | not your biometrics. Do not use face or fingerprint to secure
 | your phone. Use a strong passphrase, and if in doubt, power
 | down the phone (Android), as this offers the greatest
 | protection against offline brute-force and side-channel attacks
 | currently used to exploit running processes on the phone.
| leokennis wrote:
| My advice if you're not on the level where three letter
| agencies are actively interested in your comings and goings:
|
| - Use a strong pass phrase
|
| - Enable biometrics so you don't need to type that pass
| phrase 100 times per day
|
| - Learn the shortcut to have your phone disable biometrics
 | and require the pass phrase, so you can use it when the police
 | are coming for you, you're entering the immigration line at
 | the airport, etc. - on iPhone this is mashing the side button
 | 5 times
| wskinner wrote:
| On recent iPhones, the way to disable biometrics is to hold
| the side button and either volume button until a prompt
| appears, then tap cancel. Mashing the side button 5 times
| does not work.
| minhazm wrote:
| Not sure how recent you're talking but I have an iPhone
| 11 Pro and I just tested pressing the side button 5 times
| and it takes me to the power off screen and prompts me
| for my password the same way that side button + volume
| does.
|
| Apple's docs also say that pressing the side button 5
| times still works.
|
| > If you use the Emergency SOS shortcut, you need to
| enter your passcode to re-enable Touch ID, even if you
| don't complete a call to emergency services.
|
| https://support.apple.com/en-us/HT208076
| cyral wrote:
| Pressing it five times starts the emergency SOS countdown
| (and requires the passcode next time) on my iPhone XS.
| Maybe you have the auto-calling disabled?
| croutonwagon wrote:
 | Works fine on my 11, my wife's 12, her backup SE gen 2 and
| my backup SE gen1.
|
| Just tested all of them
| 14 wrote:
 | I just tried this and it does not work on my iPhone. Is it
 | only on a certain iOS? I am a bit behind on updates. Thanks
| ribosometronome wrote:
| That's actually the old method for iPhone 7 and before.
| Now, you can activate emergency SOS by holding the power
| button and one of the volume buttons. Assuming you don't
| need to contact any emergency contacts or services, just
| cancel out of that and your passcode will be required to
| unlock.
|
| https://support.apple.com/en-us/HT208076
| david_allison wrote:
| Try: Hold "volume up" and "power" for 2 seconds
|
| You'll feel a vibration, and biometric login will be
| disabled until you enter your passcode.
| FalconSensei wrote:
| > do not use face or fingerprint to secure your phone
|
| but can't they force you to put your password in that case,
| instead of your finger?
| adgjlsfhk1 wrote:
| no. The 5th amendment has been read weirdly by the supreme
| court.
| caseysoftware wrote:
| In general, no.
|
| The contents of your mind are protected because you must
 | take an active part in disclosing them. Of course, they can
 | still order you to give them the password and stick you in
 | jail on contempt of court charges if you don't.
|
| Check out Habeas Data. It's a fascinating/horrifying book
| detailing much of this.
| ribosometronome wrote:
| To err on the side of caution, it's best to make all your
| passcodes themselves an admission to a crime.
| Y_Y wrote:
| My passwords are so obscene it's a crime to write them
| down.
| dylan604 wrote:
| great, so they'll just be able to hit you with lewd
| charges on top of everything else they are filing.
| emn13 wrote:
| They don't actually need your passphrase to unlock your
| phone - they just need somebody with the passphrase to
 | unlock it for them. And if there's any doubt about who
| that is, then having that passphrase counts as
| testimonial; but if there's not - it might not count as
| testimonial.
|
| Although there are apparently a whole bunch of legal
| details that matter here; courts have in some cases held
| that defendants can be forced to decrypt a device when
| the mere act of being able to decrypt it is itself a
| foregone conclusion.
|
| (If you want to google a few of these cases, the all
| writs act is a decent keyword to include in the search).
|
| The defendant never needs to divulge the passphrase -
| they simply need to provide a decrypted laptop.
| shadowgovt wrote:
| "Your honor, the state agrees to not prosecute on any
| information inferrable from the text of the password."
|
| "Understood. The defendant's Fifth Amendment right to
| protection from self-incrimination is secured. As per the
| prior ruling, the defendant will remain in custody for
| contempt of court until such time as they divulge the
| necessary password to comply with the warrant."
| [deleted]
| randomluck040 wrote:
| I think a fingerprint is easier to get if you're not
| willing to cooperate. However, I think if they really, I
| mean really want your password, they will probably find a
| way to get it out of you. I think it also depends if it's
| the local sheriff asking for your password or someone from
| the FBI while you're tied up in a bunker somewhere in
| Nevada.
| chiefalchemist wrote:
 | Apple should allow for 2 PWs: one the real PW, the other
 | triggering a "self-destruct" mode.
 |
 | Knowing that is possible, law enforcement would then
 | hesitate to ask.
| detaro wrote:
 | _Using_ such a self-destruct mode would be a certain way of
 | getting yourself charged with destroying evidence
 | /contempt of court/... though.
| dylan604 wrote:
 | I was under such duress that I was shaking so badly that
 | I made typos in my 30-character password 10 times. The
 | loss of evidence is not my fault; it is the fault of the
 | people putting me under that duress. Don't think it'll
 | hold up though.
| cronix wrote:
| How? They can physically overpower you and place the sensor
| against your finger, or in front of your eye and pry it
| open without your consent and gain access with 0 input from
| you. How do they similarly force you to type something that
| requires deliberate, repeated concrete actions on your
| part?
| jfrunyon wrote:
| The fifth amendment doesn't protect either speech or
| biometrics. Nor does it protect passwords.
| timbit42 wrote:
| Signal recently added 'disappearing messages' which lets you
| specify how long a chat you initiate remains before being
| deleted.
| noasaservice wrote:
| And a screenshot, or another camera, or a rooted phone can
| easily defeat that.
|
 | The analog hole ALWAYS exists. Pretending it doesn't is
| ridiculous.
| wizzwizz4 wrote:
| > _And a screenshot, or another camera, or a rooted phone
| can easily defeat that._
|
| Not if the message has already been deleted. Auto-
| deleting messages are so the recipient doesn't have to
| delete them manually, not so the recipient can't possibly
| keep a copy.
| summm wrote:
 | Exactly this. Even more: auto-deleting messages also mean
 | the sender doesn't have to delete them manually.
 | Most people do not understand this. I even had a
 | discussion with an open source chat app implementer who
 | insisted on not implementing disappearing messages
 | because they couldn't really be enforced.
| [deleted]
| hiq wrote:
| That's a different threat model, no messaging app is
| trying to protect the sender from the receiver.
| Disappearing messages are meant to protect two parties
| communicating with each other against a 3rd party who
| would eventually gain access to the device and its data.
| bigiain wrote:
| Wickr has a "screenshot notification to sender" feature
| (which of course, can be worked around by taking a pic of
| the screen without Wickr knowing you've done it).
| timbit42 wrote:
| What made you think I was pretending it doesn't?
| upofadown wrote:
| Keep in mind that any time a message is on flash storage
| there might be a hidden copy kept for flash technical
| reasons. It is hard to get to (particularly if the disk is
| encrypted) but might still be accessible in some cases.
|
| I think encrypted messengers should have a "completely off
| the record" mode that can easily be switched on and off.
| Such a mode would guarantee that your messages are never
| stored anywhere that might become permanent. When you
| switch it off then everything is wiped from memory. That
| might be a good time to ensure any keys associated with a
| forward secrecy scheme are wiped as well.
| bigiain wrote:
| Not "recently". Disappearing messages have been there for
 | at least 5 years.
|
| Almost _all_ my Signal chats are on 1 week or 1 day
| disappearing settings. It helps to remind everyone to grab
| useful info out of the chat (for example, stick dinner plan
| times/dates/locations into a calendar) rather than hoping
| everybody on the chat remembers to delete messages intended
| to be ephemeral.
|
| The "$person set disappearing messages to 5 minutes" has
| become shorthand for "juicy tidbit that's not to be
| repeated" amongst quite a few of my circl3es of friends.
| Even in face to face discussion, someone will occasionally
| say something like "bigiain has set disappearing messages
| to five minutes" as a joke/gag way of saying what used to
| be expressed as "Don't tell anyone, but..."
|
| (I just looked it up, https://signal.org/blog/disappearing-
| messages/ from Oct 2016.)
| timbit42 wrote:
| Maybe it was only added recently on the desktop client.
| croutonwagon wrote:
 | Also iOS has a panic button. Hit the main/screen button (on
 | the right) five times really fast and Face ID/Touch ID is
 | disabled and a passcode is required.
| vorpalhex wrote:
| Your statement on the 5th amendment is no longer accurate
| broadly, but the matter still has some cross-jurisdictional
| disagreement: https://americanlegalnews.com/biometrics-
| covered-by-fifth-am...
| torstenvl wrote:
| District courts don't make law. Magistrates working for
| those district courts even less so. The case this news
| article cites has no precedential value anywhere - not even
| within N.D.Cal. - and should not be relied upon.
|
| IAAL but IANYL
| makeworld wrote:
| > the FBI's ability to _legally_ access secure content
|
| Maybe there are laws preventing legal access to message
| content? Maybe related to wherever Telegram is incorporated.
| officeplant wrote:
 | It helps that Telegram is HQ'd in the UK and the operational
| center is in Dubai.
| rootsudo wrote:
 | Does it? The UK and Dubai are US partners in intelligence
 | gathering and have worked together several times.
|
| Biggest example as of late: https://www.bbc.com/news/world-
| middle-east-58558690
| inetknght wrote:
| > _Maybe there are laws preventing legal access to message
| content?_
|
| Well sure. A lot of laws require a court order. In the U.S.
| that's usually not too difficult.
| jareklupinski wrote:
| is this only page two of an alphabetical list?
|
| or are there no messaging systems with a name before 'i'
| wwww3ww wrote:
| I do DIY encryption with enigma reloaded and it works
| bredren wrote:
 | The way these became bullet points on the slide is roughly:
|
| An active investigation leads an agent to a suspect known to have
| used one of these applications
|
| An administrative subpoena is issued to the company asking for
| what information is available
|
| The company is then ordered by a federal judge to provide
| information related to a particular account or accounts
|
| The company complies.
|
| This is why it is important to understand how your messaging
| service handles data and how you can compromise your own
| safekeeping of all or part of that data.
| timbit42 wrote:
| I'd like to see Tox and Jami.
| wizzwizz4 wrote:
| I read somewhere that Tox's security was compromised.
| timbit42 wrote:
| I'd like to see that. Was it not fixed?
| wizzwizz4 wrote:
| I found https://media.ccc.de/v/rc3-709912-adopting_the_nois
| e_key_exc... which might be it.
| wwww3ww wrote:
 | I use enigma reloaded to manually encrypt my messages.
| fractal618 wrote:
| Now I just have to get my friends and family to use Signal.
| yabones wrote:
| I've had surprisingly good luck with strong-arming people into
 | switching. The important part is having their trust; if they
| don't believe you they won't listen. The next part is to make
| simple, verifiable, and non-technical arguments for switching.
| Believe it or not, almost everybody is willing to take small
| steps if they're free.
|
| Instead of rambling on and on about "end to end encryption" or
| "double-ratchet cryptographic algorithms" or other junk only
| nerds care about, approach it like this:
|
| * There are no ads, and none of the messages you send can be
| used for advertising
|
| * It's not owned by Facebook, Google, Microsoft, or any of the
| other mega-corporations, and you don't need an account on one
| of their sites to use it
|
| * It will still work great if you travel, change providers, etc
|
| * It's much safer to use on public Wi-Fi than other services or
| SMS
|
| Honestly, don't even touch on law enforcement access as in the
| OP. That can strike a nerve for some people. The best appeals
| are the simple ones.
| upofadown wrote:
 | Even if you do, it is pretty much impossible to get them to
| check their safety numbers and keep them checked.
| catlikesshrimp wrote:
| I have been trying to get people to install signal for 2 years.
| No one has budged.
|
| The day facebook went down for some hours I got phone calls.
| basilgohar wrote:
| I have had some success. It helps that many of the people I
| regularly contact were willing to migrate, even after some
| time. Most already used WhatsApp, so the friction to
| installing a new app was less than someone not accustomed to
| using a dedicated app for messaging.
|
| But most of my American friends that don't have international
| contacts still just use SMS because they are not really
| accustomed to an app such as WhatsApp and so on.
| anonporridge wrote:
| It's incredibly disheartening how difficult it is to get most
| people to care about digital privacy.
| djanogo wrote:
 | I switched to Signal and got a few people to switch too, then
 | they started their shitcoin (MOB). IMO Signal Messenger is just
 | a way for that company to reach their shitcoin goals.
 | Uninstalled and never recommending it again.
| MatekCopatek wrote:
| I remember many people being pissed off when these features
| were announced some months ago.
|
| As far as I can tell, nothing really happened afterwards. I
| use Signal on a daily basis and haven't noticed any coin-
| related functionalities. Either they were canceled, haven't
| been released yet or they're just buried somewhere deep and
| not advertised.
|
| Do you have a different experience?
| fossuser wrote:
| It's in beta, you can enable it in settings.
|
| It's still a pain to buy MOB in the US so it's not that
| usable in the states. It would have been interesting to me
| if they just used Zcash instead of rolling their own, but
| I'm not sure what's supposed to be special about MOB vs.
| Zcash.
|
| I also don't think it's that big of a deal.
| arthurcolle wrote:
| Are you sure you're not thinking of Telegram? They had a
| thing called Telegram Open Platform or something (TOP rings a
| bell for some reason)
| _-david-_ wrote:
| Signal has some coin
|
| https://support.signal.org/hc/en-
| us/articles/360057625692-In...
| millzlane wrote:
| I think it's https://mobilecoin.com/
| fractal618 wrote:
| Now I just have to get all my friends and family to use Signal.
| tacLog wrote:
 | My family has really taken to it. Granted, it's mostly just a
 | "message the family" app to them, and they are not very
 | technically fluent, but they seem to have picked it up just
 | fine. I really think this is not discussed when Hacker News
 | brings up secure messaging. The user experience is so much more
 | important than the underlying tech. My family doesn't care
 | about end-to-end encryption. They care about video calling with
 | the press of a button, and easy features that are just there
 | and work, like Zoom or the many other software products that
 | they have to use for work.
|
| Thank you Signal team for focusing so hard on the user
| experience.
| skinkestek wrote:
| Check the difference between Telegram and WhatsApp.
|
| Add to this the fact that WhatsApp
|
 | - uploads messages unencrypted to Google if you or someone you
 | chat with enables backups
 |
 | - and sends all your metadata to Facebook.
|
| Then remember how many people here have tried to tell us that
 | Telegram is unusable and WhatsApp is the bee's knees.
|
| Then think twice before taking security advice from such people
| again.
|
| PS: as usual, if your life depends on it I recommend using
 | Signal, and also being generally careful. For postcard messaging
| use whatever makes you happy (except WhatsApp ;-)
| zer0zzz wrote:
| This isn't the case anymore with WhatsApp. Backups to iCloud
 | and Google Drive are optionally fully encrypted. You have the
 | choice of storing the decryption artifacts on Facebook's
 | servers (which are held in a secure enclave) or backing up the
 | 64-character decryption code yourself.
| KennyBlanken wrote:
| Telegram defaults to no encryption, does not do encrypted group
| chats, has a home-rolled encryption protocol which almost
 | guarantees it's weak, as nearly every home-rolled encryption
 | system always is (if not also backdoored). Coupled with it
 | being headquartered in Russia, that makes it completely
 | untrustable.
|
| The only reason Telegram comes out on top of Whatsapp in the
| document in question is because Telegram is a foreign company
| with little interest in cooperating with a US domestic police
| agency; the FBI has no leverage over Russian companies.
|
| What that list doesn't show is what Telegram does when the FSB
 | knocks. By all means, give your potentially embarrassing message
| content to a hostile nation's intelligence service.
| nicce wrote:
 | That is a lot of speculation. If you read the encryption
 | protocol, the actual methods being used for encryption are
 | well known. The client is open source and supports
 | reproducible builds. If there is a backdoor, it is in front
 | of our eyes.
 |
 | > What that list doesn't show is what Telegram does when the
 | FSB knocks. By all means, give your potentially embarrassing
 | message content to a hostile nation's intelligence service.
 |
 | Telegram has had a lot of trouble operating in Russia. It
 | was blocked for two years. [1]
 |
 | If they are so cooperative, why pass up the opportunity to
 | spy on their own people? Or did they become cooperative
 | after the unblocking? It seems that they help on some level
 | [2], but does this threaten other countries? Hard to say.
|
| [1] https://en.wikipedia.org/wiki/Blocking_Telegram_in_Russia
|
| [2] https://www.independent.co.uk/news/world/europe/telegram-
| rus...
| beervirus wrote:
| What about regular text messages?
| bonestamp2 wrote:
| They've been inside the phone companies for a long time so I
| assume they have full access to SMS.
| markab21 wrote:
| Assume anything sent over a cellular network carrier via normal
| SMS can not only be retrieved, but intercepted.
| brink wrote:
| Thank goodness a lot of companies regularly use it for 2FA.
| stunt wrote:
| Telecommunication is highly regulated. They have to keep
| records for a long time and make them available to law
| enforcement.
| tag2103 wrote:
| If I remember correctly standard SMS has no security on it at
 | all and is in the clear during transit. I may be wrong, and I'm
 | never scared of being corrected.
| no_time wrote:
 | LINE, Telegram, Threema and WeChat are not even American companies.
| Can't they just tell the FBI to suck a fat one when they ask for
| user data?
| er4hn wrote:
| Telegram, as I understand it, can access your messages when
| stored in their Cloud[1]. They just make a choice to not
| provide the content of those to anyone.
|
| [1] - https://telegram.org/privacy#4-1-storing-data
| colechristensen wrote:
| Not if they want to operate in the United States or have access
| to our banking system.
|
| You don't get to pick your jurisdiction and then operate
| globally. You're obligated to follow the laws where you want to
| operate.
| no_time wrote:
| I wonder how this affects nonprofits like Matrix/Element and
| Signal. What can they do with them? Gangstalk their
| developers? Coerce big tech to ban them from their appstores?
| viro wrote:
| Refusing a valid court issued search warrant/order is a
| criminal offense. I think 180 days for each refusal of a
| legal order.
| colechristensen wrote:
 | Well, Signal does not have the data; they comply with such
| orders with the tiny amount of metadata they have (like a
| timestamp of when your account was created and that's
| about it)
| no_time wrote:
 | The issue is a bit more complex. I was thinking more along
 | the lines of "will I get bothered for making crypto
| available for the masses that nobody can crack?"
| millzlane wrote:
 | IIRC didn't they jail a guy for that?
| [deleted]
| smoldesu wrote:
| The design of these decentralized/federated platforms is
 | specifically so they _can't_ easily coerce their owners
 | into disclosing incriminating information. In some sense,
 | it's similar to how BitTorrent implicates its users.
| ev1 wrote:
| Doesn't Signal's dev already get bothered every single time
| they travel?
| xxpor wrote:
| Fwiw if you want to do any sort of FX whatsoever or accept
| credit cards, you need access to the American banking system.
| Decabytes wrote:
| Depends on whether the countries these companies exist in have
 | agreements with the U.S. for surveillance and stuff.
___________________________________________________________________
(page generated 2021-11-30 23:00 UTC)