[HN Gopher] Google's Nest will provide data to police without a ...
___________________________________________________________________
Google's Nest will provide data to police without a warrant
Author : mikece
Score : 450 points
Date : 2022-07-27 13:59 UTC (9 hours ago)
(HTM) web link (petapixel.com)
(TXT) w3m dump (petapixel.com)
| igetspam wrote:
| For this reason, I just cancelled my nest doorbell order. I will
| gladly share access with the police if the situation warrants it
| but I don't accept that Google (or any company) should be able to
| make that decision without my consent.
| ransom1538 wrote:
| I only use these types of cameras outside where neighbors are
| recording anyway. In home, I use internal networks.
| unixbane wrote:
| I rented a house 2 years ago and it had some cloud trash with a
| camera and mic in it. I said I would move out unless they removed
| it. They simply removed it, as it was there only to appease
| prospective tenants.
| FollowingTheDao wrote:
| But will you stop using all of Google's products?
|
| "Google will allow law enforcement to access data from its Nest
| products -- or theoretically any other data you store with
| Google -- without a warrant."
| fooey wrote:
| My understanding is under US law, any data you allow a 3rd
| party to "see" (aka "host") is considered public and they
| don't need a warrant
|
| Police don't need a warrant to get into your gmail or icloud
| or whatever else you host with 3rd parties
| xdennis wrote:
| The data isn't public. It belongs to Google. They can ask
| for a warrant or just give away "your" data willy nilly.
| Grazester wrote:
| and from my understanding this is incorrect.
| sennight wrote:
| and from my understanding of the ECPA, opened email on a
| third party server is legally considered abandoned
| property and enjoys all the protection that a smashed
| washing machine next to the highway receives. After 180
| days all bets are off, regardless of the message being
| previously read or not.
| digitalsushi wrote:
| Wait a second, do I need to panic that the police can thumb
| through my gmail, or not?
| paulcole wrote:
| Need to panic? No. Can choose to panic? Of course.
|
| We choose to ignore risks all the time. We also choose to
| panic about stuff all the time. You're in control of what
| goes into what pile.
| fezfight wrote:
| Yes. Do you remember snowden and prism? Not just the
| police.
| ren_engineer wrote:
| 1984 was overly optimistic about people; the government didn't
| even need to force spying devices into homes. Instead, a huge
| chunk opted in voluntarily with doorbell cameras, Alexa, and
| other smart devices.
| thescriptkiddie wrote:
| Comparisons to 1984 are hack, but I seem to remember that in
| the book the telescreens were described as something that
| people willingly bought.
| akira2501 wrote:
| There's absolutely nothing wrong with the technology, and it
| obviously makes people's lives better to have it. I think the
| issue is that there are only a handful of vendors that
| happily operate like the monopolies they are and provide you
| with zero differentiation or choice within the market.
|
| The government isn't particularly interested in ending this
| problem either, I suspect this is due to a combination of
| industry capture and intelligence agency interest in these
| products.
| marcosdumay wrote:
| > There's absolutely nothing wrong with the technology
|
| Oh, but there is. It's subservient to the manufacturer, not
| to you.
|
| I'm still annoyed that no country decided to classify
| "selling" things where the manufacturer keeps complete
| control and denies you access as fraud. But just because no
| legal system decided it's a crime, it doesn't mean it's
| right.
| phpisthebest wrote:
| You say there is nothing wrong, but then go on to list
| things that are in fact wrong.
|
| You think the problems are a mistake or otherwise something
| to be "fixed", but the products are working exactly as both the
| government and the manufacturers want them to, and it has
| nothing to do with "intel agencies".
|
| Having your entire life "cloud connected" and then complaining
| about privacy is like opening a window and then complaining
| that the house is drafty.
|
| I love home automation; not a single component of my home
| automation is cloud connected. If more people would accept,
| learn, and support non-cloud systems, services, and protocols,
| everyone would be better off.
| Dylan16807 wrote:
| > You say there is nothing wrong, but then go on to list
| things that are in fact wrong.
|
| They say there is nothing wrong with the technology, but
| then go on to list things that are in fact wrong with the
| not-technology.
| phpisthebest wrote:
| No, they ignore the problems with the technology (unencrypted
| cloud connection / data storage, or lack of zero-knowledge
| systems) and move the problem to political or other realms.
|
| The problem with the technology is that it allows the company,
| political power, or police to access the data without user
| permission. That is a technological problem; the belief that
| just passing the correct laws will resolve this technology
| problem defies recorded history and logic.
| PeterStuer wrote:
| 1984 and Brave New World weren't exclusive. We can have both
| at the same time.
| dehrmann wrote:
| I like how much of Brave New World is being created in
| SoMa.
| zappo2938 wrote:
| In the West, Orwell was wrong and Huxley was correct.
| shadowgovt wrote:
| I think Larry Brin's "The Transparent Society" is the best
| read on the topic. Not predictive of all outcomes, but many
| aspects of modern surveillance he did see coming.
| kevin_thibedeau wrote:
| His novel 'Earth' also predicted glassholes.
| mftb wrote:
| This is confused. The Transparent Society is by David
| Brin[0]. Two of the founders of Google are Larry Page and
| Sergey Brin. The confusion is understandable given the name
| collision.
|
| [0]https://en.wikipedia.org/wiki/The_Transparent_Society
| shadowgovt wrote:
| Thank you; good catch on the name error.
| themaninthedark wrote:
| Fahrenheit 451 has a part where they get the entire city to
| go to their doors to try and spot a fugitive; this action is
| coordinated by the radios that everyone wears.
|
| With these cameras and recognition algorithms, you don't even
| need people to go to the doors. Just pull the feeds.
| isaacremuant wrote:
| This is not limited to Alexa or seemingly unnecessary tech
| gadgets.
|
| This includes ALL your data. Gmail, Google, Android.
|
| So unless you're opting for iOS (provided they're not doing
| the same as Google here) and not using Gmail or Google you're
| still falling under "Surveilled by the gov via tech company
| who serves their interests and not yours, even if you pay
| money for their services".
| yaomtc wrote:
| OR you can opt for a version of Android without the
| proprietary bits like GrapheneOS.
| redblacktree wrote:
| How accessible is that to your grandmother? Can you walk
| her through the process? (Are you willing to, given that
| now you'll be called if anything goes wrong?)
| [deleted]
| bushbaba wrote:
| This is why Apple is going to continue to crush it. They
| can build a subpar equivalent to google services and a
| large portion of the tech world will adopt it for its
| privacy benefits. The 'tech world' is a large influencer to
| the general public, leading to further penetration.
|
| It's a great strategy as Google makes 80%+ of its revenue
| from Ads, and their ad model does not work well with a
| privacy first mindset.
|
| Apple designs with privacy in mind. Google designs with
| invasiveness in mind.
| monklu wrote:
| Apple is the only one of the "big tech" to actually
| operate a datacenter in China, whose contents are
| entirely subjected to the whims of the regime.
|
| I'm afraid your conviction in Apple is the product of a
| well-crafted fantasy by their marketing department,
| instead of based on some deep rooted philosophical belief
| regarding the rights and privacy of their users.
| Tijdreiziger wrote:
| > Abstract -- We investigate what data iOS on an iPhone
| shares with Apple and what data Google Android on a Pixel
| phone shares with Google. We find that even when
| minimally configured and the handset is idle both iOS and
| Google Android share data with Apple/Google on average
| every 4.5 mins. The phone IMEI, hardware serial number,
| SIM serial number and IMSI, handset phone number etc are
| shared with Apple and Google. Both iOS and Google Android
| transmit telemetry, despite the user explicitly opting
| out of this. When a SIM is inserted both iOS and Google
| Android send details to Apple/Google. iOS sends the MAC
| addresses of nearby devices, e.g. other handsets and the
| home gateway, to Apple together with their GPS location.
| Users have no opt out from this and currently there are
| few, if any, realistic options for preventing this data
| sharing.
|
| Leith, D. J. (2021). Mobile Handset Privacy: Measuring
| The Data iOS and Android Send to Apple And Google. URL:
| https://www.scss.tcd.ie/doug.leith/apple_google.pdf
| forestwisp wrote:
| > Apple designs with privacy in mind.
|
| I find that hard to believe when so many of their
| devices' functionality depends on you sending data over
| to them. Unless you go out of your way to make sure
| you're blocking all your devices from phoning home or
| sending any data over to Apple, then any supposed privacy
| benefit becomes a lie.
|
| Either you're the only owner of your data or your data is
| not, by definition, private to you.
| wyre wrote:
| I think this dogmatic take is pretty useless. For the
| vast majority of people iCloud is a QoL increase that is
| worth not "owning" our data. All Apple does with our data
| is keep it safe (CSAM not included), but I would love to
| be proven wrong.
| uoaei wrote:
| I don't think using the correct terms to avoid ambiguity
| is "dogma". It would be "pedantic" except the concern
| regards pretty much the essence of the issue, so that
| doesn't apply either.
| wyre wrote:
| I think you're being pedantic. Dogma is fine usage. They
| argued that Apple isn't a privacy-focused company because
| they store our data on their servers, while the bar for
| companies to protect our data is so low it's a tripping
| hazard.
| smoldesu wrote:
| I think the larger problem is that Apple treats privacy
| as a double standard. They assist China's government in
| mass-surveillance of their citizens, while simultaneously
| airing "Privacy is a human right" ads in the United
| States. Once you factor in the horrific irony of Uighur
| slave labor building iPhones, I think it's pretty easy to
| understand how people can call their efforts 'security
| theater'.
| lapetitejort wrote:
| It should be noted that Apple also has an ad wing that
| brought in $4 billion in 2021 [0].
|
| 0: https://www.barrons.com/articles/apple-revenue-
| ads-516478777...
| bushbaba wrote:
| The ad revenue is mostly on App Store searches, where the
| intent is provided during the request.
|
| That Ads division can operate in a privacy-first mindset
| while still seeing huge growth YoY. Whereas if Google
| operated in a don't-be-evil/privacy-first mindset, they'd
| lose substantial revenue and likely no longer be
| profitable.
| Aeolun wrote:
| I mean, you carry a listening device with you almost 100% of
| the time. Why would you even worry about the Alexa in your
| home?
| goodpoint wrote:
| That's a fallacy. The average "listening device" isn't
| constantly recording and uploading audio. We would notice
| if it did.
| DougWebb wrote:
| For me, the difference is that the phone (with voice assist
| turned OFF) is not supposed to be listening all the time,
| while a device like Alexa _is_ supposed to be listening. I
| don't want devices listening so I turn that feature off
| when I can and avoid the device when I can't.
|
| Is the phone listening anyway? Maybe, but that violates a
| privacy expectation, and there may be recourse if someone
| discovers it's doing that.
| bryananderson wrote:
| I work on Alexa and for whatever it's worth, I can
| confirm that Amazon is telling the truth about how Alexa
| listens and about what is done with your data.
|
| This is all publicly available info, and perhaps there's
| no reason why you should trust me any more than you trust
| Amazon as a company, but as one privacy-conscious
| engineer to another, I promise that your ambient
| conversations are not being stored or sent to Amazon and
| that any data you delete in the app (either by specifying
| an auto-delete period or manually deleting it) is
| actually, really, truly deleted.
|
| A process running locally on your Alexa device listens
| for the "wake word".[0] This audio is only processed
| locally within a constantly-overwritten memory buffer, it
| is neither stored nor transmitted. Only once the wake
| word is detected does Alexa begin to transmit an
| utterance to the cloud for processing. I've worked with
| the device stack and it really isn't transmitting your
| ambient conversations.
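|
| (As a rough illustration only -- placeholder names, not Amazon's
| actual code -- the buffer-and-detect pattern described above
| looks something like this in Python:)
|
|     from collections import deque
|
|     ring = deque(maxlen=20)   # ~2s of 100ms chunks; old audio falls off
|
|     def detected_wake_word(chunks):
|         return False          # stand-in for the local keyword spotter
|
|     def transmit_utterance(chunks):
|         pass                  # stand-in for uploading one utterance
|
|     def on_audio_chunk(chunk):
|         ring.append(chunk)    # audio lives only in this rolling buffer
|         if detected_wake_word(ring):
|             transmit_utterance(list(ring))  # leaves the device only now
|             ring.clear()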
|
| Within the Alexa app[1], you can see and hear all of your
| stored data and can delete any of it. You can also
| control the duration after which it is auto-deleted. From
| working with ASR datasets, I can confirm that deleted
| audio (and the associated text transcript) is really
| deleted, not just hidden from your view.
|
| I never owned an Alexa or other smart home device before
| I worked on it, and I'm not sure I'd buy another
| company's device where I lack the same ability to "trust
| but verify", so I'm not sure how much weight my word
| should carry. But I can give my word that Alexa is not
| transmitting your ambient conversations or just setting
| "deleted=true" in a database when you tell it to delete
| your data.
|
| [0] see page 4 of https://d1.awsstatic.com/product-
| marketing/A4B/White%20Paper...
|
| [1] https://www.amazon.com/b/?node=23608614011
| ARandomerDude wrote:
| I definitely understand your point, but I think the greater
| issue is why should this have to serve as a rationalization
| in the first place? Why can't we expect our phones to serve
| us rather than the other way around?
| arberx wrote:
| It's hilarious. Freedom/privacy-loving Americans voluntarily
| give it up for minor gains in comfort.
| lijogdfljk wrote:
| Eh, it's exactly what you expect from America though. Ie
| the embodiment of short term thinking. Economy,
| environment, politics, etc - not that America is entirely
| unique here, just that the population seems to embrace this
| as a foundation in my experience.
|
| Privacy to tech like this is very hypothetical till it
| happens, and it'll rarely happen. If it's not in our faces
| we won't vote against it.
| ziddoap wrote:
| > _Eh, it's exactly what you expect from America though.
| Ie the embodiment of short term thinking._
|
| I think this is the entirely wrong framing. My other
| comment covers some of it, but specifically in regards to
| your comment: it's a lack of education, not the
| embodiment of short-term thinking.
|
| And really, we can't expect every person that uses Google
| (or whatever other large tech company) to thoroughly
| understand all of the bits and pieces of technology that
| could be used to fuck them. Or how things that we've been
| told are anonymous/private become non-anonymous/non-
| private when combined with other sparse data. These are
| complex topics that even many technologists don't
| understand (or are outside of their field of expertise).
|
| These companies hire top lawyers to write complex ToS,
| use as many dark-patterns as legally possible, do illegal
| things until they get caught doing so, evolve their terms
| frequently, etc. Yet somehow they've convinced everyone
| to blame the layperson. It's remarkable, really.
|
| What would be really swell is if we could, you know, not
| have companies spend millions of dollars on how-to-fuck-
| your-user initiatives.
| titzer wrote:
| > it's a lack of education
|
| This is absolutely by design and part of a larger pattern
| of propaganda that keeps Americans scared of the
| government and in love with the idea of becoming
| billionaire CEOs themselves because it's "moral". That
| holy "free market" has rewarded those rich people for
| being so damn smart and efficient--they deserve it, not
| the damn communist free loader leftists who hate America.
| bumby wrote:
| "Capitalism, God's way of determining who is smart and
| who is poor."
|
| -Ron Swanson
|
| /s, obviously
| lijogdfljk wrote:
| But we can't live in a world where the responsibility
| _isn't_ on the individual, can we?
|
| Ie if we expect corporations to not fuck you over, who is
| there to enforce that? Who has the power to keep them in
| check? Okay, maybe Government should hold that role - but
| who then keeps the government in check? Who ensures that
| the spying or privacy from the Government is kept in
| check? etc
|
| Ultimately the buck always stops at the individual. And
| we have to be hyper aware of long term implications,
| because money, greed and power has deep, deep pockets (as
| you also mentioned) and the fight will be never ending.
|
| We, as a community, have de-prioritized education, health
| care, public safety, privacy, etc. Sure, powerful forces
| have been pushing for that exact thing, but we can't
| expect them to "just be nice" or w/e.
|
| I'm very pro "Big Government". However my ideas behind
| big government will not work without individual
| responsibility. Until then citizens are purposefully and
| willfully giving their power away with every tiny step.
| The blame is on us, and our current state is inevitable.
| My 2c.
| ziddoap wrote:
| My last sentence was more wishful thinking than a
| proposed solution. I am obviously aware the world isn't
| as utopic as the sentence would require.
|
| The main point I wanted to get across is that it's
| baffling that companies aren't blamed in these
| conversations. It's always the user who is blamed ("well
| you read the ToS didn't you!"). And that's dumb, because
| the vast majority of users aren't lawyers and don't have
| CS degrees -- both of which are becoming increasingly
| required to provide _informed_ consent to a ToS. (edit:
| in every other contract I sign, a lack of _informed_
| consent is grounds to void the contract, exception being
| tech-company ToS contracts)
|
| If you _still_ want to blame my 85-year old parent for
| not understanding what Google is doing with his data, go
| for it, I guess. Just seems stupid to do so, because he
| barely can open up a web browser but is somehow expected
| to understand the complexities of data aggregation and
| what impact it will have on him. And as time marches on,
| it's equally ridiculous to suggest that he just never
| use a computer to avoid the issue.
|
| > _And we have to be hyper aware of long term
| implications,_
|
| Without post-secondary education in niche fields, this is
| becoming impossible. Especially across multiple services
| with changing terms, in countries with changing laws, in
| a world where technology evolution outpaces curriculum
| changes.
| lijogdfljk wrote:
| > Without post-secondary education in niche fields, this
| is becoming impossible. Especially across multiple
| services with changing terms, in countries with changing
| laws, in a world where technology evolution outpaces
| curriculum changes.
|
| I agree, but again i go back to, "but how else can it
| work"?
|
| Of course i don't expect everyone to be knowledgeable on
| all low level systems. However, to the point of your 85
| year old grandma, she is a tiny demographic in a much
| larger, much more reasonably informed demographic who
| also completely ignore the implications.
|
| Name a demographic that isn't wildly ignorant of things
| that are reasonable to know?
|
| But again, i repeatedly fall back to "But who else can do
| this?". This is why i'm pro Government, but not until
| people start pushing for responsibility on this front. It
| may not be reasonable for your grandma to be responsible
| for Google Data stuff, but she _(and the rest of us)_
| have sat around for dozens of years watching authority
| figures have little to no accountability or oversight.
|
| The issue isn't about Google. The issue is about us, and
| our inability to build a government and authority system
| that is in-line with our views. We hand our power over
| with no thought or oversight and then we're shocked when
| it all comes back against us. This has nothing to do with
| Google or CS, imo.
| Dylan16807 wrote:
| With big companies, it's often the case that we're _all_
| the 85 year old grandma.
| lijogdfljk wrote:
| Agreed, and we do nothing to fight that. We're all
| complacent with it. Hell, not only did we not fight it,
| ie we didn't push for government control and oversight,
| but we signed up. We let them in and laid out welcome
| platters.
|
| This isn't about being informed on obscure topics. As i
| said this has nothing to do with Google. It's about our
| willingness to fight for a government that can handle
| this, and fight to control said government.
| bumby wrote:
| _I agree, but again i go back to, "but how else can it
| work"?... Name a demographic that isn't wildly ignorant
| of things that are reasonable to know?_
|
| Who defines "reasonable"?
|
| When you get delayed on a flight due to a maintenance
| issue, are you equipped to determine if that delay was
| reasonable? Most likely not, although many mechanically
| inclined people may be in a position to make that call.
| Those same people may not be in a position to arbitrate the
| reasonableness of Google's ToS (side-stepping the whole
| obfuscation of details that was previously covered).
|
| When society gets reasonably complex, we out-source those
| decisions. In the example of the aircraft, we have a
| regulatory body who makes the rules about what is
| reasonable. It wasn't always like that, of course, but
| the need grew out of the growing complexity and risk
| profile. So to your question and an earlier point, there
| may be room for regulatory bodies as an alternative for
| "how else can it work?".
| ziddoap wrote:
| > _much more reasonably informed demographic_
|
| My argument is that the "reasonably informed demographic"
| is _incredibly_ small. I can only say the same thing so
| many times, though, so I'm not sure how to explain it in
| a different way.
|
| To restate my example, even very smart CS graduates may
| not realize that _anonymized data_ joined with other
| _anonymized data_ can result in _de-anonymized data_,
| because the linking and de-anonymization of sparse
| datasets is a niche subfield that has only recently begun
| being explored.
|
| Many people may _think_ they are reasonably informed
| (they look into the ToS, see that data is anonymized, and
| decide that they are okay with that) without knowing that
| the data may later be de-anonymized through advanced
| statistical analysis they've never been exposed to in
| all their schooling. So while they _thought_ they were
| informed, they weren't. This repeats across several
| domains.
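|
| (A toy sketch of what that join looks like -- made-up records,
| but the re-identification trick is just matching on the shared
| quasi-identifiers:)
|
|     # each release looks harmless on its own
|     medical = [   # published "anonymized", no names
|         {"zip": "02139", "born": 1954, "sex": "F", "dx": "X"},
|     ]
|     voter_roll = [   # public record, with names
|         {"zip": "02139", "born": 1954, "sex": "F", "name": "J. Doe"},
|     ]
|
|     key = lambda r: (r["zip"], r["born"], r["sex"])
|     names = {key(r): r["name"] for r in voter_roll}
|
|     for r in medical:
|         if key(r) in names:   # unique combination => re-identified
|             print(names[key(r)], "has diagnosis", r["dx"])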
|
| > _But again, i repeatedly fall back to "But who else can
| do this?"_
|
| Why is that when a problem is identified, people demand a
| solution be provided at the same time? I don't have a
| solution, sorry. But that shouldn't preclude me from
| identifying a problem.
|
| I honestly did not expect saying basically "Let's put
| some of the blame on Google, because they're the ones
| with the dark patterns and lawyers and experts, rather
| than solely blaming the layperson" would be met with much
| pushback.
| lijogdfljk wrote:
| > My argument is that the "reasonably informed
| demographic" is incredibly small. I can only say the same
| thing so many times, though, so I'm not sure how to
| explain it in a different way.
|
| I think we're in agreement here. To be clear, i'm mostly
| talking about intent, an attempt to stay informed and a
| willingness to act - to push for centralized leadership
| who _is_ informed.
|
| Ie as i said before, your grandma is not expected to know
| this. She is expected to fight for a government that will
| be, and that will also be able to be held accountable.
|
| We have neither the oversight on the current government, nor
| the willingness to act. Your grandma built the same world
| we are building today. One of inaction and obfuscation.
|
| If society cannot be informed and active on what is
| essential to build that world (whatever that may be),
| then we are doomed. Currently, the population at large is
| not. At least, not from what i can see in action.
| ziddoap wrote:
| Google's ToS is 16 pages with what appears to be about 50+
| hyperlinks, including several hyperlinks to "additional
| service-specific terms" which itself has ~50 links to
| _other_ terms which are all multiple pages.
|
| Perhaps instead of pinning all of the blame on users, we
| could have the companies producing labyrinthian ToS
| contracts written by top-grade lawyers and full of legalese
| (that no layperson should be expected to understand)
| shoulder at least _some_ of the blame?
|
| This doesn't even touch on the fact that many topics (as
| related to data aggregation and privacy) are highly
| technical and require at least a few years of post-
| secondary to even begin wrapping your head around (e.g. de-
| anonymization via large sparse datasets is not something I
| can reasonably teach my 85-year old parent, nor to my
| child, both of which use Google services in some capacity).
|
| But, yes... Let's blame it on Average Joe, who just wants
| to watch their dog for a few minutes while at work and saw
| an ad on TV about a convenient way to do so. Shame on them
| for not being both a lawyer and a CS graduate.
| mstipetic wrote:
| I don't understand why there aren't any standard terms of
| service which are generally applicable, and which companies can
| make minor adjustments to if they can justify it.
| NaturalPhallacy wrote:
| Because their in house lawyers tell them they need a
| custom made one.
|
| "It is difficult to get a man to understand something,
| when his salary depends on his not understanding it." --
| Upton Sinclair
| hermitdev wrote:
| More like "If you're not part of the solution, there's
| money to be made in prolonging the problem." (I don't
| know who said it, but I'm paraphrasing from something
| I've seen on a demotivational poster re: consulting)
| mywittyname wrote:
| A solution to this is for courts to limit what is
| applicable in a ToS to a certain number of words, and
| have overly broad statements always favor the entity who
| has to agree.
|
| This, in effect, nullifies all but the most important
| components of a ToS.
| uoaei wrote:
| Due diligence is expected among a mature population. But
| you're right it's not entirely on individuals. There
| should be ways to disseminate information about the
| threats these products pose to personal liberty,
| especially in a nation that uses the word "liberty" so
| freely in its foundational documents.
| ziddoap wrote:
| > _Due diligence is expected among a mature population._
|
| I wholly agree.
|
| But we're quickly approaching (and in some cases, past)
| the point where proper due diligence requires a 4-year
| post-secondary education in a related CS field, if not
| more.
|
| We're talking about products that take multiple domain
| experts several years of collaboration to create. How is
| it reasonable to expect my mechanic, accountant, etc. to
| do their due diligence on how that product processes
| their data, especially when it's processed in a black-box
| created by several other domain experts, and their only
| source of information is purposefully opaque terms
| written by lawyers?
| uoaei wrote:
| > proper due diligence requires a 4-year post-secondary
| education
|
| I don't think that's the case here or indeed very
| commonly. You don't necessarily have to understand
| implementation details if some core tenet of popular
| ethics is being violated. One key feature of the domain
| -- namely that _you don't own "your data" and so you
| don't get to decide what happens with it_ -- is pretty
| clearly in violation of principles that the vast majority
| of Westerners would at least profess to hold. Beyond the
| motivating principle that third parties should be
| required to receive explicit whitelist access to use
| privately-owned data, "implementation details" refers
| mostly to policy and enforcement, not really
| technologies.
| smhenderson wrote:
| _It's hilarious._
|
| That's an odd take, I honestly don't find anything about
| this article, or the broader topic of privacy and overreach
| by companies and law enforcement, amusing in any way.
| solardev wrote:
| > if the situation warrants it
|
| ...I see what you did there
| shadowgovt wrote:
| The funny thing is I'm of the opposite mind.
|
| "Convenience _and_ if I go missing mysteriously Google will
| hand info over to the people trying to find me? Win-win."
| xdennis wrote:
| Would it be so difficult for Google to add a configuration
| step where people can explicitly consent to this?
| shadowgovt wrote:
| The short answer is "yes," but I would be in favor of it
| being added. Of course, it's not going to secure your data
| if somebody that isn't you is the one under threat of
| imminent harm or death and Google believes the data you are
| holding could save their life.
| slingnow wrote:
| You're right! That hypothetical and highly unlikely scenario
| is completely worth all of the ways in which the tech could
| be and is being abused.
| shadowgovt wrote:
| If it's so hypothetical and highly unlikely, then we've
| nothing to concern ourselves with over the fact that Google
| will comply with attempts to rescue these hypothetical and
| highly unlikely endangered people. After all, if it
| basically never happens, Google will basically never grant
| the extrajudicial request.
| progman32 wrote:
| That's fine, as long as we're clear that that is said from a
| position of privilege.
| [deleted]
| [deleted]
| cronix wrote:
| Now if only you could get your neighbors to quit too. The ones
| that have it are all still recording you. You really only need
| one or two per block to monitor everyone on the street. ML
| should easily be able to work out "Silver Corolla with license
| plate ABC123 left 1200 Sunnyside St at 4:32pm and headed East."
| Then another one 3 blocks away can report the same car turning
| North on 127th, and then another can report...
| stilldavid wrote:
| Or live in a metro area with WAMI already in place...
| hammock wrote:
| Don't live so close to your neighbors? Easier said than done,
| but on the long scale...
| eitally wrote:
| To be fair, though, there's a flip side. My neighbors seem to
| get more utility from my driveway feed than I do. Every
| couple of months, I get a text from a neighbor asking if I
| got footage of some such thing. Everybody knows who has
| cameras, and those people are invaluable whenever something
| nefarious happens (mail theft, break in, kids running amok,
| etc).
| lotsofpulp wrote:
| This is much easier (and already accomplished) by the
| government installing license plate readers.
| TuringNYC wrote:
| It is even easier by the government using EZPass and
| electronic toll readers. No optics or OCR required.
| bushbaba wrote:
| I'm kind of amazed the government doesn't embed a RAIN
| RFID sticker in renewal tags. They can be read ~30ft away
| and cost just a few cents per.
| kloch wrote:
| Oh that's definitely on their mind:
|
| https://stevenmcollins.com/auto-license-plaste-to-have-
| rfidt...
|
| https://www.nbc12.com/story/20500107/virginia-lawmakers-
| want...
| kube-system wrote:
| license plates have a bit more adoption, though.
| jollyllama wrote:
| Which is why I will never have one.
| Karunamon wrote:
| Who needs the hassle and appropriations process when you
| can just send a friendly email to the country's biggest
| data broker and they'll give you whatever you want?
| JoshGlazebrook wrote:
| That would assume that nest and ring cameras are even capable
| of making out license plates. They're all crappy sensors with
| horrible night vision.
| MichaelCollins wrote:
| So you're saying it will only be a few more years then?
| Look at where cellphone cameras were 20 years ago, and
| where they are today. The present inadequacy of hardware
| doesn't give me any comfort.
| sparkbII wrote:
| Voeid wrote:
| Note that this is already in line with their policy; you've
| already agreed to hand over your data and allowed them to share
| it, so I don't understand why this single article suddenly
| changed your mind.
| jjulius wrote:
| >... so I don't understand why this single article suddenly
| changed your mind.
|
| Prior ignorance of said policy (which is standard for most
| users), for one.
| lynndotpy wrote:
| Is this tucked away in the Terms/Privacy policy? If so, it
| might be because they did not read the policy.
| [deleted]
| TameAntelope wrote:
| This just seems extreme to me, considering how you've likely
| never been in this scenario and will in all likelihood not ever
| be in a scenario where you wouldn't give the police this
| information but Google would (e.g. your own home being robbed
| or a masked stranger ringing your doorbell).
|
| Sometimes I think people covet privacy for its own sake, and
| don't think about the practicalities. The whole point of living
| in a collective society is that we give up some freedoms for
| the sake of overall increased prosperity, that's always been
| the tradeoff, and this is just one of those tradeoffs.
| jmkb wrote:
| In the vein of principles, yes, privacy for its own sake is
| valuable to me.
|
| In the vein of practicalities, both Google and the justice
| system (USA for me) are monstrously large bureaucracies known
| to make difficult-to-redress errors. Google's capricious
| account banning, police raiding incorrect addresses, eg. The
| decision to share with them more information than the law
| requires is one I'd prefer to make myself.
| TameAntelope wrote:
| And I think this view is irrational. Privacy for its own
| sake is effectively hoarding, and as you clearly show,
| hoarding can be caused by fear, which you have for Google
| and the justice system.
|
| A numerate person would know how rare these things you're
| afraid of are, and not let those fears drive how they live.
| I (hopefully) follow that path, and I recommend you check
| it out!
|
| I read what you wrote in the same way I suspect you would
| read someone who is afraid of space because meteors have
| killed people (as a rough example).
|
| It just doesn't seem like the things you're worried about are
| happening at a rate that would actually matter to a society.
| trasz wrote:
| >bureaucracies known to make difficult-to-redress errors
|
| Or just plain out refusing to fix errors where they would
| be relatively easy to fix; compare Scalia's "it's fine to
| fry a provably innocent person as long as the procedures
| are followed" argument.
| fleetwoodsnack wrote:
| Louis Brandeis on the right to privacy:
|
| https://louisville.edu/law/library/special-
| collections/the-l...
| TameAntelope wrote:
| Yeah, like I said, you give up some of your rights in order
| for a prosperous society to exist.
|
| I'm not denying the right exists, I'm saying we give up our
| other rights all the time for the benefit of society, why
| is privacy any different?
|
| Further, it's a spectrum. You're not putting a camera up
| for police to peruse at their leisure, it's only in
| specific situations.
| [deleted]
| azlev wrote:
| You are not giving up privacy to a prosperous society,
| you are just giving up privacy.
|
| I couldn't find evidence that mass surveillance is good for a
| society.
| nullfield wrote:
| What was that discussion I remember seeing long ago, about
| two kinds of surveillance-in-society?
|
| Kinda-good (1): so-and-so can just go check the camera
| that points at the central plaza fountain that anyone can
| access, and see that his spouse has arrived and is
| waiting for him as agreed.
|
| Or bad (2): cameras all over that no one knows who
| controls, watches, and/or is recording.
| judge2020 wrote:
| 'prosperous society' ~= convenience, less human hours
| wasted on boring stuff. The convenience of a video
| doorbell and connected home sure seem worth it to me.
| Karunamon wrote:
| And how are either of those things substantially impacted
| if Google's policy was "we do not hand over user data
| without a warrant"?
| judge2020 wrote:
| I'm not saying they are, just that consumers are giving
| up privacy for some sort of return. It's not required, as
| there are E2E HomeKit alternatives, but it's inaccurate
| that 'all you do is give up privacy'.
| fleetwoodsnack wrote:
| Agree to disagree. Systems that frustrate the
| accumulation and concentration of power seem to be
| integral to a functional society, nevermind a prosperous
| one, historically speaking.
|
| "Convenience, less human hours wasted on boring stuff" is
| fine as an individual consumer mindset, but does not form
| sufficient criteria for evaluating complex social
| systems.
| azlev wrote:
| You can have all of this without giving up data.
| [deleted]
| kevin_thibedeau wrote:
| That attitude is the start of a slippery slope. If the
| end always justifies the means then none of your freedoms
| will be protected if someone else decides it's more
| convenient for you to not have them. This is the major
| problem with the big government authoritarianism that has
| infected the republican party.
| TameAntelope wrote:
| Slippery slope arguments are a fallacy. If something bad
| happens, or is proposed, we can address it when the bad
| thing happens or is proposed. Nothing actually bad is
| happening or is proposed here.
| NoGravitas wrote:
| As noted above, the trade-off our society has chosen to
| make is search warrants. Otherwise, it might as well be
| "for the police to peruse at their leisure". Is Google
| going to rigorously vet every "emergency" request for
| data the police make?
| jakelazaroff wrote:
| We already have that tradeoff. It's called a warrant. If the
| police get one, you are forced to give them access to your
| otherwise-private effects.
|
| This is a step beyond that. Warrants are granted at the
| discretion of a judge, the bar is high, the scope is narrow
| and you (theoretically) have recourse if it's abused. Here,
| the discretion is Google's, the bar is nonexistent, the scope
| is unlimited and you have zero recourse if you think you've
| been wronged.
|
| This wouldn't be an issue if people trusted Google or the
| police. But they don't, and it's pretty easy to imagine ways
| in which this could be abused to harm people.
|
| Let's say you live in Texas and get abortion pills in the
| mail. If the police have a warrant to search your house for
| something unrelated, they (theoretically) can't see the pills
| and decide to charge you with an unlawful abortion (unless
| they were "in plain view", etc). But if Google gives police
| access to footage of your house extrajudicially, _police can
| use anything they see as evidence against you_. And make no
| mistake -- things like that _will_ happen as a result of this
| policy.
| TameAntelope wrote:
| I think you're taking this way further than anyone actually
| involved would. _IF_ what you're saying ever did even come
| close to occurring, we both know Google would shut it down
| quickly. Not just because it's horrible, but because it's
| also bad for business, and they've shown a propensity to
| protect data when it would be used as you hypothesize here.
|
| Google is smart enough to know that "snitching" on its
| users is bad for business.
|
| Think "track a burglar as he moves through a neighborhood"
| not "snoop (illegally) on the contents of people's mail".
| jakelazaroff wrote:
| _> Think "track a burglar as he moves through a
| neighborhood" not "snoop (illegally) on the contents of
| people's mail"._
|
| It's not illegal, though. Google is (presumably) fully
| legally in the clear to just hand over footage to police.
| That means that, if Google decides to hand over your
| footage to the police, anything on tape can be used as
| evidence against you. And Nest offers indoor security
| cameras, so your entire house could be fair game.
|
| _> Google is smart enough to know that "snitching" on
| its users is bad for business._
|
| Is it bad for business? That's not clear. Your whole
| argument is that this is fine so long as you think the
| likelihood of abuse is low. My guess is that it actually
| won't hurt Google's business at all, even as we start to
| discover police misusing this.
|
| "It only happens to a handful of people! And anyway, they
| were [doing drugs/stealing/etc], so they deserved it.
| It'll never happen to _me_!"
| TameAntelope wrote:
| It's illegal to go through someone else's mail, and that
| was the hypothetical you proposed.
|
| My argument is not that it's fine as long as the
| likelihood for abuse is low, my argument is that it's
| fine as long as there hasn't been any actual abuse. When
| something _does_ happen, we can respond to it.
|
| Until then, it's not reasonable to go through a bunch of
| worst-case scenarios.
| jjkaczor wrote:
| "I think you're taking this way further than anyone
| actually involved would"
|
| Just because you currently seem to have a "failure of
| imagination" does not mean that law enforcement,
| corrupt/fascist/dystopian government officials or even
| unscrupulous employees within the tech sector itself will
| not abuse their observational powers now or in the
| future.
|
| Or maybe you just haven't been paying attention to the
| news for the last "n" decades.
| TameAntelope wrote:
| I have a failure of actual data of problems occurring.
|
| We all deal with the "sky is falling" person on our
| development teams, and it was pointless doomsaying then, as
| it's pointless doomsaying now.
|
| When the bad thing happens, we can react. Until then,
| let's try to stay reasonable.
| LorenPechtel wrote:
| To me, a Nest or Ring is pointed at a basically public space.
| So what if the cops can look at the feed?
| fknorangesite wrote:
| Because I don't want to live in a surveillance dystopia where
| the state - or Google or Amazon - has a camera on the outside
| of every house.
| Dylan16807 wrote:
| It's bad for public spaces to be pervasively tracked.
| MichaelZuo wrote:
| Is there an internally consistent reason for why that's
| bad?
|
| Because every argument I've seen along those lines falls
| apart after drilling down a few layers.
| oigursh wrote:
| There's a video of a panel with Moxie Marlinspike who
| says something along the lines that there is value in
| being able to break the law. It's how society evolves.
| Complete surveillance means no broken laws, means a
| relatively static society.
| JumpCrisscross wrote:
| I am looking into SimpliSafe [1]. Anyone know if they store these
| data?
|
| [1] https://simplisafe.com/
| flyingfences wrote:
| I see them shilled in youtube sponsor deals often enough that I
| have to assume they're no good.
| kornhole wrote:
| I haven't looked into it, but someone else recommended
| https://github.com/blakeblackshear/frigate which integrates
| with Home Assistant.
| cabirum wrote:
| Any open-source projects to reliably stream and archive a video
| from a raspi?
| dole wrote:
| MotionEye?
| hntcz wrote:
| I did this for a university project. It streams to S3 (or
| Azure or Digital Ocean) encrypted, and the client downloads it
| from there.
|
| It's very rough around the edges but if you are up to
| contribute it's a good base.
|
| https://github.com/tcz/choicecam-server
| https://github.com/tcz/choicecam-client
|
| The client is a React Native project, should run on Android or
| iOS.
| simmons wrote:
| For streaming, I have a few pi's (and cameras) running
| v4l2rtspserver, and it seems to work pretty well. It appears to
| be using the Broadcom hardware support for H.264 encoding, so
| it's pretty light on the CPU. I'm currently using Shinobi on a
| different host to receive and archive the RTSP streams.
| (Although I'm interested in checking out Frigate, which was
| mentioned elsewhere in this discussion.)
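|
| (If you just want dumb local archiving without a full NVR, a
| minimal approach is to let ffmpeg copy the stream into
| timestamped segments -- this assumes ffmpeg is installed, and
| the URL and output path here are placeholders:)
|
|     import subprocess
|
|     RTSP_URL = "rtsp://camera.local:8554/unicast"  # placeholder
|
|     subprocess.run([
|         "ffmpeg", "-rtsp_transport", "tcp", "-i", RTSP_URL,
|         "-c", "copy",                     # no re-encode, light on CPU
|         "-f", "segment", "-segment_time", "3600",
|         "-strftime", "1", "archive/%Y%m%d_%H%M%S.mp4",
|     ], check=True)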
| nerdjon wrote:
| Another article pointed out that most other cloud based offerings
| will still provide it if there is a warrant.
|
| But it seems like Apple is the only cloud based offering that
| cannot actually respond to these requests even if they get one?
|
| I just checked my HomeKit Secure Video camera and it seems like
| it isn't just the default setting, but the only setting. I see no
| way to tell it to not encrypt the data when storing it in iCloud
| (not sure why they would give that option at all TBH)
| lalopalota wrote:
| Hackers gaining power of subpoena via fake "emergency data
| requests"
|
| https://news.ycombinator.com/item?id=30842757
| unixbane wrote:
| It's funny how things like this surprise people, as if the
| legal system is a trustworthy blob on their architecture graph.
| One fundamental thing about it is that it's made up of
| different people and a huge number of people, all with
| differing levels of competence (so not only is it
| untrustworthy, but it can't even consistently do whatever its
| goals are) as opposed to having any coherent goal or ideology.
| [deleted]
| johncessna wrote:
| > "If we reasonably believe that we can prevent someone from
| dying or from suffering serious physical harm, we may provide
| information to a government agency -- for example, in the case of
| bomb threats, school shootings, kidnappings, suicide prevention,
| and missing person cases," reads Google's TOS page on government
| requests for user information. "We still consider these requests
| in light of applicable laws and our policies."
|
| This is in line with the reasons cops can do things that
| otherwise normally require a warrant. This is generally 'a good
| thing.' The issue, for me, is in the abuse and interpretation of
| these exemptions.
| sterlind wrote:
| Right, exactly. Google could have this and make it perfectly
| accountable. Just surface a user notification that X entity
| accessed your camera's data for Y time range, ideally with a
| reference to a case number. People will push back if the police
| abuse their power. They won't become as complacent, because
| they'll know when the cops are spying.
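|
| (Purely hypothetical -- no such Google API exists -- but the
| record backing that notification wouldn't need to be more than
| something like:)
|
|     from dataclasses import dataclass
|     from datetime import datetime
|
|     @dataclass
|     class EmergencyAccessNotice:
|         requesting_agency: str   # e.g. "Anytown PD"
|         case_reference: str      # incident / case number
|         device_id: str           # which camera was accessed
|         footage_start: datetime
|         footage_end: datetime
|         legal_basis: str         # "emergency disclosure", warrant, etc.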
|
| But would they actually do it?
| freedomben wrote:
| > _"A provider like Google may disclose information to law
| enforcement without a subpoena or a warrant 'if the provider, in
| good faith, believes that an emergency involving danger of death
| or serious physical injury to any person requires disclosure
| without delay of communications relating to the emergency,'" a
| Nest spokesperson tells CNET._
|
| I sympathize with this dilemma. It's a real-life version of the
| Trolley Problem[1].
|
| [1]: https://en.wikipedia.org/wiki/Trolley_problem
| dafty4 wrote:
| But "to any person", e.g., a CIA asset in the field might be
| exposed by a gmail user, thus threatening her/his life, thus
| pre-emptively disclose the gmailer's info to law enforcement?
|
| Or imagine an idle threat is made off-hand in a private gmail,
| so pre-emptively disclose that threat to law enforcement?
|
| In the sci-fi Minority Report scenario, Google could use AI
| language models on all gmail to make a judgment call on
| "emergent danger of serious physical injury". If I plan to go
| bungee jumping with an unreliable provider, Google scans my
| gmail and tells law enforcement?!?
| freedomben wrote:
| Yes good examples of how it can be abused. I'm certain the
| interpretation could be very liberal depending on the person
| making the decision.
|
| I know a person who was banned from twitter and was nearly
| fired because he dead-named somebody. The explanation was
| that it was literal violence against the person he dead-
| named, because so many trans people self harm if they don't
| feel that their identity is affirmed. The person who was
| banned says it was a force of habit because the person had
| only very recently gone public with their new identity, but
| it did not stop the ban hammer. He _almost_ lost his job as
| well because the tweet was posted in a public slack channel.
| HR left it up to the person who was dead-named to make the
| decision, and they decided not to fire him.
|
| I could absolutely see a scenario like the above happening in
| gmail or some other Google product, and the decision-maker
| deciding that it was a threat to a person's life and should
| be disclosed. To some people that makes perfect sense. To
| others it does not. The point is simply that when we let
| humans make subjective decisions like that, we get all the
| downsides of human judgment.
| 1137 wrote:
| If anyone wants to get rid of their devices because of this I'll
| take them :D
| pmayrgundter wrote:
| Reminds me of the First Law of Robotics[1]
|
| "A robot may not injure a human being or, through inaction, allow
| a human being to come to harm."
|
| "A provider like Google may disclose information to law
| enforcement without a subpoena or a warrant 'if the provider, in
| good faith, believes that an emergency involving danger of death
| or serious physical injury to any person requires disclosure
| without delay of communications relating to the emergency,'"
|
| s/provider like Google/robot/
|
| s/may disclose.*if the provider/may act to protect any person if
| the robot/
|
| s/requires disclosure without delay of communications/requires
| action/
|
| "A robot may act to protect any person if the robot believes that
| an emergency involving danger of death or serious physical injury
| to any person requires action without delay of communications
| relating to the emergency,'"
|
| [1] https://en.wikipedia.org/wiki/Three_Laws_of_Robotics
| nottorp wrote:
| I'm sure Google's Machine Learning!(tm) will always make the
| right decision!
| endisneigh wrote:
| Use a camera that exposes an RTSP feed like Amcrest and buy Blue
| Iris. Backup whole disk with Backblaze.
|
| You're done, and you're welcome.
|
| If you're really lazy at least use something like Apple HomeKit
| with supported cameras.
| [deleted]
| entropie wrote:
| Any RTSP camera, frigate and homeassistant makes it pretty easy
| for tech related folks to have a state of the art surveillance
| system with every feature you could imagine.
| sam0x17 wrote:
| Glad I'm using Eufy and not Nest
| eschulz wrote:
| I'm not sure they're much better. From the SHARING PERSONAL
| INFORMATION WITH THIRD PARTIES section of their privacy page:
|
| We may share your personal information in the instances
| described below
|
| Third parties as required to (i) satisfy any applicable law,
| regulation, subpoena/court order, legal process or other
| government request, (ii) enforce our Terms of Use Agreement,
| including the investigation of potential violations thereof,
| (iii) investigate and defend ourselves against any third party
| claims or allegations, (iv) protect against harm to the rights,
| property or safety of Eufy, its users or the public as required
| or permitted by law and (v) detect, prevent or otherwise
| address criminal (including fraud or stalking), security or
| technical issues.
|
| If we should ever file for bankruptcy or engage in a business
| transition such as a merger with another company, or if we
| purchase, sell, or reorganize all or part of our business or
| assets, we may disclose your information, including personally
| identifiable information, to prospective or actual purchasers
| in connection with one of these transactions.
|
| We may also share information with others in an aggregated and
| anonymous form that does not reasonably identify you directly
| as an individual.
| trasz wrote:
| At least it's not owned by an American company, so it doesn't
| technically _have_ to give the government institutions
| whatever they ask for. It's Chinese though, and that's not
| much better.
| ArrayBoundCheck wrote:
| I don't 'hate' this but how the heck would google detect
| something like this? Why do I feel like they're analyzing every
| word we say in our homes and building some kind of profile (for
| ad reasons)? On second thought, I do hate this.
| saos wrote:
| Wow. So Ring and Nest at it. What are some safe alternatives
| please?
| nickthegreek wrote:
| Logitech
| Grazester wrote:
| Logitech does not have a competing system. They offer Cameras
| which tie in with Apple Homekit so the alternative here is
| Apple HomeKit.
| kevin_thibedeau wrote:
| Home Assistant
| tantalor wrote:
| > prevent someone from dying or from suffering serious physical
| harm
|
| These are exigent circumstances, which allow police to enter your
| home without a warrant. Extending that access to include video
| cameras seems reasonable.
|
| This is a "In Case of Emergency Break Glass" type of situation.
| anfogoat wrote:
| > _This is a "In Case of Emergency Break Glass" type of
| situation_
|
| The only difference being that when you pull the alarm for lols
| there are consequences -- for you. Members of law enforcement
| on the other hand do that on the regular and face zero
| consequences.
|
| That said, Google doing this might ease the pressure to enable
| this across the board through legislation. So maybe a positive
| in the end; I wouldn't really expect any privacy with Nest
| anyways.
| vorpalhex wrote:
| If police attempt to enter your home under exigent
| circumstances, you know.
|
| You can get a lawyer, go to court, and demand an accounting.
|
| You don't have that control here. Your footage can just walk
| off, no clue. No accounting, no reporting, no post-hoc
| punishment for mis-use.
| formerkrogemp wrote:
| > go to court, and demand an accounting.
|
| > You don't have that control here. Your footage can just
| walk off, no clue. No accounting, no reporting, no post-hoc
| punishment for mis-use.
|
| The supreme court recently ruled that a woman beaten by
| police didn't have standing to sue. Good luck with breaking
| qualified immunity in a court of law in the coming years.
| vorpalhex wrote:
| I literally can't find the case you are referencing. The
| only one coming back is a ruling on Marion's single-
| container 4th amendment statute, which sounds unrelated.
| giraffe_lady wrote:
| It's literally a checkbox a cop clicks when they request data
| with no vetting that it actually is an emergency or
| consequences if it is not.
| vorpalhex wrote:
| Oh, have you seen the forms the cops fill out to do the data
| request? Or are you theorizing?
| giraffe_lady wrote:
| A close family member works in law enforcement, I have seen
| the form. Though for facebook not google. They could have
| changed it or google's could be different but I have no
| particular reason to think it's _materially_ different. It's
| up to the cop to decide what constitutes an emergency
| and there is no verification or consequences beyond that.
| It's literally a checkbox.
| _jal wrote:
| It isn't quite a check-box, but here's Ring's exigent
| circumstances form:
|
| https://support.ring.com/hc/en-
| us/article_attachments/360053...
| vorpalhex wrote:
| Thank you, this is actually what I was hoping for.
| no_time wrote:
| title: fdsafdsafdsa fdsafdsa jklfdsa fuioxc cnmkplv
| auiopcm vmckxl; si9f op fdsafdsafdsa fdsafdsa jklfdsa
| fuioxc cnmkplv auiopcm vmckx
|
| looks like someone just mashed the keyboard after a long
| day of work.
| vorpalhex wrote:
| Your pdf client may be buggy or not like this file. I'm
| not seeing that same artifact.
| patmcc wrote:
| It's literally right here: https://ler.amazon.com/us
|
| You can see it. No theorizing. Click the big red "Submit
| Emergency Request" button.
| AlexandrB wrote:
| It's up to the police to show they have probable cause
| _before_ they collect data. That's what warrants and
| judges are for. Anything collected without a warrant should
| be de-facto considered suspect. Why would you give agents
| of the government the benefit of the doubt here?
| vorpalhex wrote:
| Exigent circumstances is a real thing and goes back a
| long time. If a cop sees someone being murdered through a
| window, we obviously don't want them booking a call with
| the DA and local judge before helping.
|
| That being said, that power is obviously easily misused
| without checks and limits.
| NoGravitas wrote:
| If a cop sees someone being murdered through a window,
| they don't need the Nest camera footage of it.
| MichaelCollins wrote:
| Heh, exigent circumstances? If somebody is hurt or calling for
| help, by all means knock down a door or bust a window. But
| exigent circumstances for _rifling through my documents?_ Get a
| fucking warrant!
| alistairSH wrote:
| My problem with this is defining exigent circumstances as it
| relates to video or similar data. I can't easily think of a
| circumstance where the police are outside my house, encounter
| an exigent circumstance, but also have the time to
| request/obtain data from Google. And if the police are made
| aware of something in advance, they have time to get a
| warrant.
| FollowingTheDao wrote:
| Didn't you know? Future Crimes are now exigent circumstances!
| mattnewton wrote:
| > This is a "In Case of Emergency Break Glass" type of
| situation.
|
| The problem is there isn't any broken glass here. No judge is
| stamping this, and I see no duty to warn the customer about
| what device and data was accessed, and there doesn't seem to be
| a way to opt out ahead of time. This removes a lot of checks in
| the name of expediency.
| causi wrote:
| If it's reasonable, then there shouldn't be a problem letting
| users turn it off.
| bushbaba wrote:
| How do you define this, though? If someone, say, talks about
| politician XYZ, and the current governor feels that if said
| person takes his position it'd cause you personal harm, can
| the current governor access your camera?
|
| If a police officer comes to your house without a warrant are
| they allowed to break in? If not they shouldn't be given access
| to my camera feeds without my permission.
|
| The move by google here just re-affirmed that I made the right
| choice to not buy their products.
| TedDoesntTalk wrote:
| > If a police officer comes to your house without a warrant
| are they allowed to break in?
|
| Absolutely. If they hear gunshots or screams of "help! Help!"
| they will break the door down. I'm sure there are other
| triggers, too.
| tantalor wrote:
| > circumstances that would cause a reasonable person to
| believe that entry (or other relevant prompt action) was
| necessary to prevent physical harm to the officers or other
| persons, the destruction of relevant evidence, the escape of
| the suspect, or some other consequence improperly frustrating
| legitimate law enforcement efforts
|
| https://www.law.cornell.edu/wex/exigent_circumstances
| DonnyV wrote:
| Yeah like police always follow the law.
| mattnewton wrote:
| I am not a lawyer. But, there seems to be a pretty clear
| difference here. If you have time to ask google you have
| time to call a judge. If you don't have time to call a
| judge, google doesn't have time to vet your decision and is
| effectively just streaming your cameras to your law
| enforcement office.
| tantalor wrote:
| The right time/place to argue this question is after the
| fact in front of a judge.
| jeffbee wrote:
| Yes, the only sensible comment is downvoted to the bottom of
| the page. In U.S. jurisprudence "exigent circumstances" gives
| police all kinds of powers including warrantless searches of
| all kinds, up to and including electronic surveillance and
| phone taps.
|
| The "process" to which you are "due" when it comes to these
| things is if the police try to use this information to
| prosecute you, they have to explain it in court, whereas if
| they obtain it through a search warrant, they have to explain
| themselves in advance. It's all "due process".
| mikece wrote:
| "Google will allow law enforcement to access data from its Nest
| products -- or theoretically any other data you store with Google
| -- without a warrant."
|
| At this point, what _doesn't_ Google know, or what is it
| unable to accurately infer, about any given person on the
| internet?
| FollowingTheDao wrote:
| At this point I cannot believe that anyone in Tech would have a
| Google account being that they should know better. Anyone could
| be made to look guilty when the authorities have enough data.
|
| Adding this for the people who keep downvoting this:
|
| https://www.mintpressnews.com/national-security-search-engin...
|
| "Google - one of the largest and most influential organizations
| in the modern world - is filled with ex-CIA agents. Studying
| employment websites and databases, MintPress has ascertained
| that the Silicon Valley giant has recently hired dozens of
| professionals from the Central Intelligence Agency in recent
| years."
|
| We are imprisoned by our addiction to convenience. I am in the
| middle of dealing with this now, switching my services where I
| can: dumb phone, linux laptop, swisscows, openstreetmap and an
| old Garmin GPS, and just being OK with uncertainty.
| TedDoesntTalk wrote:
| > old Garmin GPS
|
| I want this. Which model did you get and does it have up-to-
| date maps?
| FollowingTheDao wrote:
| I have a Garmin Drive 51. Yes, up-to-date maps. The only PITA
| is that you need windows or MacOS to update the system and
| the maps. I have a windows clone I put on my laptop every
| few months to accomplish this.
| orangepurple wrote:
| Just use Osmand on an old smartphone without a SIM. It's
| less bad than old Garmins, and map updates are constant.
| Keep your system clock accurate for fast GPS lock.
| FollowingTheDao wrote:
| That is a great idea. I have a Pixel 4a with GrapheneOS.
| I am an idiot for not thinking about just not putting my
| SIM in the phone.
| orangepurple wrote:
| Phones without a SIM can also always call 911. Funnily
| enough, Google Maps and Waze work without Google Play
| services. Just click past the modal that shows up very
| rarely complaining about it.
| freediver wrote:
| "Any" includes people who do not use Google, which is about
| 10% of online users. For them, a lot.
| hedora wrote:
| They still have facial recognition feeds, car feeds, and they
| purchase data from third-party consumer surveillance firms,
| including credit card purchase histories for non-Google
| users. You don't need to have a Google account to be tracked
| by AdWords, and there's no practical opt-out, other than
| staying logged in all the time, and then being tracked in
| other ways, and agreeing to all sorts of things in their
| (ever-changing) EULA.
|
| https://epic.org/documents/google-purchase-tracking/
|
| I'm sure they have other programs I haven't heard of.
| freediver wrote:
| Sure, but not using Google for search, and having an ad-
| blocking browser circumvents 99% of that threat.
| nebukadnet wrote:
| Is anyone really surprised? I am not. Google has such a bad
| relationship with data privacy that they don't see the problem.
| Getting a warrant is so easy in the US right now that there is
| very little reason to disrespect your customers' privacy. They
| just don't see the problem.
| ilrwbwrkhv wrote:
| Why do people still put a camera in their homes connected to the
| internet?
|
| That itself is the most bizarre behaviour.
| Grazester wrote:
| So I could see what the hell my nanny is up to with my
| children. I also don't want to have to spend 2 days setting up
| some hacked together system that I also have to spend time
| supporting when I simply don't have the time or will. That's
| why!
| boplicity wrote:
| Is there a decent "connected" camera that allows you to specify
| and use your own storage solution?
| LatteLazy wrote:
| In fairness, there is good reason to release footage without a
| warrant for the reasons they list (including "school shootings,
| kidnappings, suicide prevention, and missing person cases"). I'd
| like a robust monitoring of how the information is used. But the
| law already recognises that you don't need a warrant in urgent,
| life threatening cases. And let's get real, getting a warrant is
| so easy these days as to be meaningless...
| judge2020 wrote:
| > "If we reasonably believe that we can prevent someone from
| dying or from suffering serious physical harm, we may provide
| information to a government agency -- for example, in the case of
| bomb threats, school shootings, kidnappings, suicide prevention,
| and missing person cases," reads Google's TOS page on government
| requests for user information. "We still consider these requests
| in light of applicable laws and our policies."
|
| So this is just an article pointing out something in the TOS.
|
| National security requests are common for any big company[0-2]
| since they'd rather play ball today, under their own terms, than
| resist and trigger new legislation that forces them to hand over
| information in any warrantless circumstance.
|
| 0: https://www.apple.com/legal/transparency/us.html
|
| 1: https://transparencyreport.google.com/user-data/us-
| national-...
|
| 2: https://transparency.fb.com/data/government-data-requests/
| freedomben wrote:
| I think you're probably right in your analysis of their thought
| process, but I'd almost rather they force legislation's hand.
| Then there's a chance to hold elected officials accountable if
| they pass it, and there's a chance (IANAL) for a Supreme Court
| to throw it out as a violation of the 4th Amendment to the US
| Constitution.
|
| Even in the worst case (law passes, court upholds, and people
| re-elect the politicians) at least we have a definition
| somewhere on what is and what isn't OK. Today, where it's
| ambiguous, "Company X may or may not" leaves a person to
| wonder.
| praxulus wrote:
| Maybe this would just make me a bad service admin, but if I
| believed that I would be literally saving someone's life by
| sharing data from one of my users with law enforcement, I would
| be hard pressed to hold their privacy above another's life.
|
| Exactly what it takes to be reasonably convinced that I would
| be saving someone's life is of course not a simple question,
| but in at least some situations it seems like it would be worth
| the tradeoff.
| kayodelycaon wrote:
| One that I've run into several times is suicide. Most of the
| time, the person isn't in immediate danger. I've been around
| when people were.
|
| I'm not going to go into any detail for people's privacy, but
| it took over an hour to figure out who a regular member of a
| community was in real life and then get emergency services in
| another country to them. Got independent confirmation they
| got dragged to a hospital in time. They weren't exactly
| grateful afterwards, but they stayed in the community and
| didn't repeat the attempt.
| ev1 wrote:
| Faked requests for situations like this are routinely and
| frequently used for harassment/attempted murder in gaming.
| Unless this is solvable, this is rather problematic and
| dangerous.
| kayodelycaon wrote:
| Maybe I should have clarified, I don't act on the vast
| majority of things in this manner, especially when I
| don't know the person or the circumstances. Unless I have
| some method of confirmation, escalating to emergency
| services is not happening. Even then, it's last resort
| for precisely this reason.
| aiisjustanif wrote:
| Why not end-to-end encryption, so we don't even need to
| discuss this?
| [deleted]
| [deleted]
| bastawhiz wrote:
| The article implies that Apple makes a doorbell. Do they? There
| are obviously Homekit doorbells, but this sentence implies
| otherwise:
|
| > Apple's default setting for their doorbells is end-to-end
| encryption which means the company is unable to share user video
| at all.
| EricE wrote:
| Homekit doorbells use iCloud services and storage to host/serve
| the video; that's how Apple is involved even though they don't
| make the doorbells.
|
| I had a Nest at my previous house - decided to try the Wemo
| Homekit doorbell this time and I'm very pleased with it. I also
| use an Apple TV to drive all the content on my TV and the
| Homekit video notifications are amazing - instantaneous since
| the Apple TV is my local homekit hub. Way faster and far more
| reliable than the Nest doorbell ever was.
| hedora wrote:
| Infuriatingly, as a non-well-connected victim of a boring (not
| going to be good PR) crime, I've found it is completely
| impossible to get the police to pull data for my own yard,
| warrant or no.
|
| In my experience, these capabilities are only used to protect the
| politically connected or for oppression.
|
| I'd be less skeptical if the program allowed any homeowner to
| pull any nest footage that included their property, and also
| allowed individuals to pull any footage that included them.
|
| Of course, that will never happen.
| buzer wrote:
| > I'd be less skeptical if the program allowed any homeowner to
| pull any nest footage that included their property, and also
| allowed individuals to pull any footage that included them.
|
| I don't know exactly how it is in the US, but in the EU security
| cam footage is considered to be personal information, and as a
| result GDPR applies. So whoever controls the security camera
| is the data controller and is required to e.g. provide access
| to people who were recorded. In this case it would likely be
| the home owner and unless Google was acting as data processor
| for the home owner they wouldn't be allowed to process that
| data.
| daniel-cussen wrote:
| reincarnate0x14 wrote:
| I gave the police a 4k camera video of a crime happening and
| they did nothing with it. This is pretty standard police
| laziness in the US, individual cops might care but departments
| overall give zero shits for small people except as revenue
| sources.
|
| Data ownership issues like this in the US will never be pushed
| by existing political interests, hopefully we'll see more EFF-
| ish PACs in the future that take a page from the fascists and
| provide the written laws and bills they would like enacted to
| state and municipal governments so local representatives can
| edit the title block and get back to harassing their interns.
| icelancer wrote:
| Same. I even know who stole stuff from our place of business
| and found the eBay listing tied to the person who did it. And
| yet, nothing.
| tedivm wrote:
| When I was younger our house was robbed and some prescription
| drugs stolen. Not only did we know who did it, we had text
| messages where they admitted it and a witness who came
| forward. The police refused to do anything about it, and told
| us that if we wanted them to do anything at all we should
| consider voting for a different mayor.
| evancox100 wrote:
| Actually I think the police should take direction from a
| democratically elected mayor. The problem is that they are
| most often not politically accountable to anyone at all.
| Not sure about the particulars of your situation, but that
| is a very odd response.
| Buttons840 wrote:
| Because the mayor created policies that tied the police's
| hands? Or because they just wanted a different mayor ("vote
| that librul out or we ain't doing shit" type of thing)?
| epicide wrote:
| I read it as more "we won't do anything unless the mayor
| forces us and we know this one won't".
| jonathankoren wrote:
| Probably because they just want a different mayor. Mayors
| and prosecutors don't have the ability to tie cops'
| hands. Even council and plebiscite acts to direct the
| police to treat certain crimes as "lowest priority" are
| routinely ignored. Mix in decades of copaganda and
| qualified immunity, and you get American cops that aren't
| accountable to anyone.
| teachrdan wrote:
| If the police refuse to subpoena the security company, in the
| US you can get a civil attorney to file a "John Doe warrant"
| and use that lawsuit against that unknown defendant to subpoena
| whoever has the footage you need.
| kop316 wrote:
| Thank you for that info! I am in a similar situation and that
| was something that did not occur to me.
| Thorrez wrote:
| Are you saying police issued a warrant but a large tech company
| ignored the warrant?
| spullara wrote:
| He is saying that the police weren't interested in investigating
| crimes.
| [deleted]
| Thorrez wrote:
| Hmm, I think you're right. I was confused by the phrase
| "warrant or no", thinking it meant hedora had tried to get
| data both with and without a warrant and neither worked.
| chabons wrote:
| OP is also raising the point that because only the police
| have the ability to retrieve Nest footage that pertains to
| them but is not owned by them, access to that data is gated
| by the police, who may not use that access evenly.
| aliqot wrote:
| Coming from a small-ish area, how does this happen? Do they
| say "no" and then offer you the door, or how does it
| normally go? I assume it could be a manpower issue, if it
| is a city with more pressing issues, or a city without a
| detective unit maybe, but outright saying no is hard to
| justify. I can't imagine a situation that would make it
| normal to just say no with a straight face.
| standardUser wrote:
| It's not a coincidence cops spend a large portion of
| their time busting consensual drug users and enforcing
| traffic laws (while armed to the teeth). That's where the
| money is. Helping find my stolen computer is way more
| difficult and provides the department with nothing of
| value.
| uoaei wrote:
| This is especially true in jurisdictions where private
| prisons operate. Very (too) often the people who own the
| prisons and profit off of them have close ties to people
| of authority and power in local government. If you are
| skeptical, consider: otherwise the prison contract would
| have gone to someone else with closer ties or more
| leverage.
|
| Corruption is rampant in the "developed" world but we
| just use different words for it.
| aliqot wrote:
| I had not considered this hypothesis. I think my area may
| be small enough where it's not just a crime against me,
| but a crime against the community. For what it's worth, I
| hope your situation gets better. That has to be
| unsettling.
| fknorangesite wrote:
| > your situation
|
| Unfortunately this situation is by no means unique.
|
| > That has to be unsettling.
|
| And now you know why there has been a growing movement
| predicated on the idea that the vast sums of money we
| spend on police may not be well-spent.
| jonathankoren wrote:
| Police are revenue generators. It's the same in both big
| cities and small towns.
|
| A town of 300 people I grew up next to, always had their
| one cop ticketing every car that didn't slow down from
| the posted 55 mph (but honestly, it was always faster) to
| 30 mph, or accelerated over 30 before they passed the
| city limits sign.
|
| The city government loved that guy, because he brought in
| thousands of dollars every week.
|
| When the NYPD threw a fit over de Blasio, they famously
| refused to arrest anyone unless "it was absolutely
| necessary". Why? The police union wanted to hurt NYC's
| budget.[0] It's the same reason why civil forfeiture is
| so prevalent.
|
| [0]
| https://www.theatlantic.com/national/archive/2014/12/the-
| ben...
| heavyset_go wrote:
| They get more state funding, grants and incentives for
| focusing on things like drug crimes. Enforcement and
| investigation of crimes are completely at their
| discretion, so they choose to go after "sexier" and "fun"
| crimes and criminals.
| colinmhayes wrote:
| I live in Chicago. It's not uncommon for police to just
| not show up when you call in a non violent property
| crime. If that happens you can go to the station and fill
| out a report but they won't actually do any
| investigating. Sometimes they'll be honest and say this
| isn't going to go anywhere, but usually you'll just get
| ghosted.
| jlarocco wrote:
| Here in Boulder most property crimes are ignored by the
| police. They'll file a report to give to your insurance
| company, but they'll flat out tell you that they're not
| going to do anything.
|
| I think it's a few things contributing to it. First is
| not having enough manpower. Also, a lot of stuff
| isn't worth the expense of investigating. A stolen
| laptop, phone, or bike just isn't worth the cost of
| detectives hunting it down. $20k of police work to get a
| $1000 laptop doesn't make much sense.
| heavyset_go wrote:
| And yet they choose to go after people for $20 worth of
| weed, coke, MDMA, etc. It's not about cost, it's about
| police discretion when it comes to the crimes they choose
| to investigate or the laws they choose to enforce.
| [deleted]
| uoaei wrote:
| Putting that body in a jail cell or prison is worth much
| more than $20 to those who have influence over how police
| spend their time. That's the point.
| sgjohnson wrote:
| Because it's easy to prosecute. You got the suspect dead
| to rights, with $20 worth of whatever on their person.
|
| And it's not like those cases even go to trial. I don't
| have the exact data on hand, but I'd be seriously
| surprised if fewer than 90% of small-time drug cases ended
| with a plea deal, one that most of the time doesn't even
| involve pleading guilty to a felony.
| heavyset_go wrote:
| > _Because it's easy to prosecute. You got the suspect
| dead to rights, with $20 worth of whatever on their
| person._
|
| Police regularly spend countless man-hours investigating
| people for selling $20 worth of marijuana. Those are not
| open and shut cases, they have to spend time building
| cases, use confidential informants or undercover officers
| to gather damning evidence that won't be thrown out in
| court, etc.
| Innominate wrote:
| Drug cases mean seizing assets.
| heavyset_go wrote:
| They regularly go after teens and young people with no
| assets who are selling pot to their friends.
| usefulcat wrote:
| Manpower seems the most likely explanation. I have a
| friend who works in a bike shop. Recently, the shop was
| broken into and $20K+ worth of high end bikes were
| stolen, plus a fair amount of damage to the shop.
|
| The next day, they found someone offering to sell the
| same bikes online (in the same geographical area even).
| They gave this info to the police, basically "hey here is
| someone trying to sell known stolen property" and the
| police told them to try and set up a meeting with the
| sellers themselves! Called back a week later, still had
| not even assigned a detective.
|
| $20k is not pocket change, so it does make me wonder
| exactly what kinds of property crimes they do
| investigate, if any.
| hinkley wrote:
| I have heard that in some cases telling the insurance
| company that nothing is being done might get some
| movement, but I'm not really sure what they could do that
| you can't. Call the Chief of Police?
| zo1 wrote:
| Yeah calling him will fix it in no time. Assuming you
| "have his number".
| lovich wrote:
| > I assume it could be a manpower issue, if it is a city
| with more pressing issues, or a city without a detective
| unit maybe, but outright saying no is hard to justify.
|
| Why would they need to justify anything? They have no one
| to answer to. Look at groups like the Uvalde police who
| are internationally known fuckups at this point and
| they're still just throwing other people under the bus
| left and right in response. The police only do their job
| when they feel like it and that's usually never unless
| it's writing down that they went to guard a construction
| site for the overtime pay.
| copperbrick25 wrote:
| Since when do the police issue warrants? A warrant is issued by
| a judge, unless it somehow works differently in the US. I
| assume police just didn't do anything with that warrant.
| tomschlick wrote:
| Police request warrants. Judges approve and sign said
| warrants and they then become legally executable.
| delecti wrote:
| I'd find this much more acceptable if the system told people when
| their data was provided to police without a warrant. There's less
| potential for abuse if they're transparent about it.
| ajross wrote:
| Then you might want to read the linked article and not the
| headline, because CNET asked that question and... got an
| affirmative response, that Google will try to notify users
| whose data has been shared.
|
| Really almost all the top comments on this topic are people
| overreacting to the headline without realizing that the
| circumstances involved are emergencies and not just routine LE
| requests. Clickbait has ruined everything.
| ocdtrekkie wrote:
| They will try but can be legally prohibited from notifying
| you. A server you control and have the keys to can be
| compelled by the police, but not without your knowledge.
|
| Every supposed attempt to protect you is really just an
| attempt to justify an inherently unethical business model
| because it is profitable.
| TheSpiceIsLife wrote:
| _Try_.
|
| They'll _try_ to notify users? _After_ their data has been
| shared with police.
|
| Well that's reassuring.
| shadowgovt wrote:
| Consider the circumstances. The context in which this data
| is shared is imminent danger to life. That includes
| kidnapping.
|
| If Google had a magic system for reliably getting information
| on a kidnapping victim, they wouldn't need to share users'
| private data with law enforcement; they could just share the
| information on where to reach the kidnap victim.
| TheSpiceIsLife wrote:
| You don't fool me.
|
| What this will _predominantly_ be used for is abusive
| partners who also happen to be law enforcement officers
| (police) terrorising their victims, or, more generally,
| abuse of power by LE.
|
| Here in Australia the police already have 30% more
| domestic violence charges than the average, and have been
| caught way too many times abusing their powers as LE.
| trasz wrote:
| >The context in which this data is shared is imminent
| danger to life.
|
| Not at all - it's merely somebody's _claim_ that there's
| some danger.
| shadowgovt wrote:
| A subpoena is based on merely somebody's claim that
| there's reasonable suspicion of a crime. All this policy
| by Google does is shift the set of people who are making
| the decision of what "reasonable suspicion" looks like;
| one's reaction to that depends on one's relative threat
| models of judges (and time delay of interacting with
| judges) vs. Google employees.
|
| FWIW, anyone still doing business with Google probably
| has a relatively high trust of Google, so that comparison
| is probably closer to equivalent than many might imagine
| for Google's users.
| turdit wrote:
| in 1984, it's an emergency if you wrongthink
| yardie wrote:
| > the circumstances involved are emergencies and not just
| routine LE requests.
|
| Fine, every routine LE request is now an emergency. Because
| one thing US LEOs are known for is their reserve in only
| requesting just the absolute minimum to do an investigation.
| egberts1 wrote:
| It is far, far better to set up your own personal cloud and run
| your own encrypted video capture system.
|
| You get both security AND privacy.
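|
| A minimal sketch of what that can look like, assuming an
| RTSP-capable camera plus the opencv-python and cryptography
| packages on the box doing the recording (the stream URL, clip
| length, and key handling are illustrative placeholders, not
| any vendor's API):
|
|   # record one five-minute clip locally, then encrypt it
|   # before it is synced anywhere
|   import time
|   import cv2
|   from cryptography.fernet import Fernet
|
|   key = Fernet.generate_key()  # keep this off-box in practice
|   stream = cv2.VideoCapture("rtsp://user:pass@cam.local/stream")
|   writer = cv2.VideoWriter("clip.mp4",
|                            cv2.VideoWriter_fourcc(*"mp4v"),
|                            15.0, (1280, 720))
|
|   start = time.time()
|   while time.time() - start < 300:
|       ok, frame = stream.read()
|       if not ok:
|           break
|       writer.write(cv2.resize(frame, (1280, 720)))
|   writer.release()
|   stream.release()
|
|   # the clip only ever leaves the box as ciphertext
|   with open("clip.mp4", "rb") as f:
|       blob = Fernet(key).encrypt(f.read())
|   with open("clip.mp4.enc", "wb") as f:
|       f.write(blob)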
| shadowgovt wrote:
| And all it costs is the anxiety of wondering whether you
| succeeded or your bedroom is currently visible via shodan.io.
| GiorgioG wrote:
| So glad I ditched Gmail several months ago. I have a single
| Google Home device that was gifted to me by a friend, but I'm
| tempted to just unplug it at this point.
| Jim_Heckler wrote:
| What are you using now? Been aching to get off gmail for a
| while, just haven't found the right alternative that supports
| IMAP.
| GiorgioG wrote:
| I switched to Fastmail and haven't looked back. Love it. IMAP
| is supported: https://www.fastmail.help/hc/en-
| us/articles/1500000279921-IM...
| TheSpiceIsLife wrote:
| Not the person you replied to, but Fastmail does IMAP just
| fine.
| beej71 wrote:
| If you have a hosted website, check your provider. I have my
| domain at Dreamhost and they have IMAP included at my tier.
|
| I moved off Gmail to them about 9 months ago and it's been
| good.
|
| The worst part is that Dreamhost's spam filtering is abysmal
| compared to Google's. But my desktop is always on and I just
| keep Thunderbird running on it. TB's spam filtering is pretty
| good after training.
|
| I figure some of the dedicated email hosting services do a
| better job with spam.
| derwiki wrote:
| Fastmail, Protonmail
| NaturalPhallacy wrote:
| Wildly fascist. The warning sign was when AT&T got retroactive
| immunity for sharing customer data with the NSA. I wonder when
| Americans will decide to push the reset button.
| silentsea90 wrote:
| Why isn't end to end encryption a standard now?
| pseudalopex wrote:
| Pressure from governments reportedly.[1]
|
| [1] https://www.reuters.com/article/us-apple-fbi-icloud-
| exclusiv...
| daveoc64 wrote:
| Because it's impractical in a lot of cases.
|
| Most consumers don't want to (or can't) deal with managing
| encryption keys, and aren't going to be keen on the idea that
| if you lose the key, you lose the data permanently.
| TedDoesntTalk wrote:
| > Apple's default setting for their doorbells is end-to-end
| encryption which means the company is unable to share user video
| at all
|
| Why won't Google Nest and Amazon Ring do this?
|
| I have 6 Nest cameras throughout my home (not the doorbell
| product)
| EUROCARE wrote:
| > which means the company is unable to share user video at all
|
| You said it yourself... How would you analyze the data if it's
| encrypted?
| nkozyra wrote:
| On the device itself, perhaps. Video data stored encrypted,
| metadata not.
| ajross wrote:
| Because Apple's HomeKit stuff is significantly inferior. HN
| accessible home surveillance. It's what they pay for. And if
| accessible home surveillance. It 's what they pay for. And if
| you can get to it from that website using only your Amazon or
| Google account, so can Amazon or Google.
| TedDoesntTalk wrote:
| How do you access videos from the Apple HomeKit products?
| solardev wrote:
| Siri will narrate it frame by frame.
| lotsofpulp wrote:
| In the Home app, on iOS or macOS, or in control center -->
| HomeKit on Apple TV.
|
| https://support.apple.com/guide/icloud/set-up-homekit-
| secure...
| endemic wrote:
| I was actually curious about this. I didn't realize that
| HomeKit-compatible doorbells existed. The product page for
| the Logitech Circle View Wired Doorbell (the least
| expensive listed on apple.com) notes "Receive rich
| notifications with two-way audio across all your Apple
| devices whenever a person or package is at the door." So
| maybe access via PC/web is what's missing?
| [deleted]
| oakesm9 wrote:
| HomeKit secure video stores an encrypted video in your
| iCloud account so you can access it from anywhere. It does
| require a paid subscription and a hub (HomePod, Apple TV,
| or iPad) where the hub does the video analysis on device.
|
| https://support.apple.com/en-
| gb/guide/icloud/mme054c72692/1....
| kevincox wrote:
| You can still store the data on the cloud and access it
| anywhere. I think the only features that probably need to be
| limited are video-analysis features such as who is at your
| door and what is moving across the lawn.
| nerdjon wrote:
| I have a hub (multiple actually thanks to a few HomePods and
| Apple TV's) for HomeKit. I can control all of my HomeKit
| devices remotely and I can view any video remotely.
|
| Yet it is still end to end encrypted because the processing
| happens locally on device (on the hub) instead of on a cloud
| server.
|
| I have the ability to detect people, animals, vehicles, and
| packages and set recording settings based on that, in
| addition to whether I am home or not, using the presence
| detection built into HomeKit.
|
| This narrative around Apple's products missing major features
| is no longer true and has not been for a while.
| seanmcdirmid wrote:
| How good is the recognition? One of the reasons I went with
| nest rather than ring was because the ML on the latter just
| wasn't very good.
| nerdjon wrote:
| I have never used the package or the vehicle recognition.
|
| But it seems to work decent enough. However the face
| recognition works off of the faces you have categorized
| in photos. So actually recognizing specific people will
| only be as good as that database is. I assume in the
| backend it is largely the same tech so you can get an
| idea of how well it will work based on that.
|
| I can't compare it to other offerings since I have never
| used them, but it does what I need it to do.
|
| I have had issues with false positives, but I also am
| using a not great quality camera with it so it is limited
| with that. I plan on getting a better camera but just
| have not had a need for it yet.
| mdeeks wrote:
| Is it possible to view video on the web? What about sharing
| videos or clips of videos? I assume you have to leave your
| Apple TV running 24/7 then?
|
| Also, can Android users use it?
| JumpCrisscross wrote:
| > _Why won't Google Nest and Amazon Ring do this?_
|
| Ring says it offers end-to-end encryption [1].
|
| [1] https://blog.ring.com/products-innovation/ring-announces-
| end...
| ceejayoz wrote:
| It breaks most of Ring's more compelling features, though.
| prepend wrote:
| I suspect because Google and Amazon want to use the data from
| the video feeds for their own purposes. I think using it for CV
| modeling and advertising can make them more money than not
| using it.
|
| Also, they can charge police for these data.
|
| I won't buy their products because of their decision to
| monetize over privacy. What I find interesting is that their
| products aren't cheaper than other privacy preserving products
| from Wyze and Apple so this extra revenue is pure profit or
| gravy. This isn't some essential need of their business model,
| they just don't respect user privacy enough to avoid this extra
| revenue.
| dymk wrote:
| It's probably because it's somewhat more complex to implement
| encrypted end-to-end video and storage, with a slightly more
| complex user experience. So they cut corners, knowing most
| people don't care about encryption.
| nicce wrote:
| Complexity is not an issue. It's a very basic thing to do
| these days.
|
| UX might be a bigger issue, as it is harder to view this
| data on any device and still keep the keys inaccessible to
| Google. But it's still not too hard. People use WhatsApp Web
| on a daily basis.
|
| The main problem is the ML use of the video feed to trigger
| things, etc.
| dymk wrote:
| Of course complexity is an issue. If users don't care
| about E2E encryption, why would Google spend more time
| implementing an E2E application?
| gh02t wrote:
| Has Wyze ever provided privacy guarantees on their cloud
| service? Their privacy policy says they will gladly share
| data.
|
| > In response to a request for information if we believe
| disclosure is in accordance with, or required by, any
| applicable law or legal process, including lawful requests by
| public authorities to meet national security or law
| enforcement requirements;
|
| IANAL but that sure sounds to me like they will provide data
| in cases where they aren't legally _required_ to (as in a
| warrant) as long as they believe it's "in accordance with"
| the law. Which could mean almost anything other than a
| clearly illegal request.
|
| https://www.wyze.com/policies/privacy-policy#c
|
| I like Wyze and think they are acting in good faith, but I
| wouldn't assume they would say no to a request from
| government/police. They may offer local recording and RTSP,
| but AFAIK they are still cloud connected unless you otherwise
| block them or use alternate firmware and hence not private.
| NoGravitas wrote:
| According to TFA, Wyze and Anker will both hand over data
| in response to a warrant or a court order, not simply on
| request.
| seanmcdirmid wrote:
| Probably because of the ML processing (event recognition),
| which is the only good reason to subscribe to Nest/Google home
| in the first place. Otherwise, you just have streams of video
| without much context. If the analysis could be done on device,
| then that would be a different story.
|
| Unfortunately, either Ring or Nest is required in my
| neighborhood given our property crime issues.
| edm0nd wrote:
| It sucks with Ring: if you enable e2e encryption, you lose
| live notifications and alerts, since they tied that
| functionality to having encryption off.
| bushbaba wrote:
| How do you expect Ring to provide live notifications that are
| processed server side if you enable e2e encryption? If
| Ring.com can decrypt the data, they can always share the feed
| with 3rd parties.
|
| It's a sensible design choice given the current constraints
| of the hardware.
| kornhole wrote:
| Maybe this article hints at why.
| https://www.mintpressnews.com/national-security-search-engin...
| daveoc64 wrote:
| Ring does offer end-to-end encryption, but it breaks various
| features. You can see a list of what it breaks on their
| website:
|
| https://support.ring.com/hc/en-gb/articles/360054941511-Unde...
|
| Many of these features are key selling points, so most
| consumers aren't going to be interested in turning end-to-end
| encryption on.
|
| It also doesn't work on battery-powered devices, presumably
| because they offload as much work as possible to the server to
| save power.
| cronix wrote:
| Knowing the car your neighbor(s) drive and what times they come
| and go are very valuable marketing tools, to someone. Just like
| knowing what they search for and the content of their emails
| tells them other things about you, and your neighbor who didn't
| sign up for this service.
| cinntaile wrote:
| Maybe it makes it harder to analyze and/or sell that data to
| third parties.
| nisegami wrote:
| Do Amazon and Google sell data to 3rd parties? I always
| figured they wouldn't because it's literally the core of
| their value.
| tomComb wrote:
| They don't. As you note, it wouldn't make any sense for
| them to do so.
|
| It's one of those perverse things that they are some of the
| few companies that actually don't sell user data, and yet
| they're the ones that are most frequently accused of it.
|
| I don't think it helps us in holding big tech companies to
| account when we spread false information about them.
| egberts1 wrote:
| Yes. They do and often indirectly through 3rd parties. It
| is that obvious; it's pervasive.
| cmiles74 wrote:
| I'm not convinced that they are simply "giving" the data to
| law enforcement, it strikes me as much more likely that
| they are charging them.
| BurningFrog wrote:
| 11 times in a year across the US sounds quite reasonable though.
| zokier wrote:
| I have difficulty understanding Goog's (or Amazon's, etc.)
| motivation to do this. What do they have to gain by being
| excessively cooperative with police forces, especially on a
| local level? I can
| understand that having good rapport with feds might have
| benefits, but this doesn't sound like that
| elil17 wrote:
| Just giving out the data with no due diligence saves billable
| hours from their lawyers.
| AlexandrB wrote:
| Amazon's motivation is quite clear - they want to reduce
| shrinkage by discouraging "porch pirates" from stealing Amazon
| deliveries. Not sure what Google's motivation is, but it might
| just come down to optics. Consider how Apple gets raked over
| the coals in the media by DAs when they refuse to unlock a
| suspect's phone.
| entropie wrote:
| > Amazon's motivation is quite clear - they want to reduce
| shrinkage by discouraging "porch pirates" from stealing
| Amazon deliveries.
|
| You say they hand all the information to the police to
| prevent package stealing? I am pretty sure that's not the
| reason at all; maybe a very, very small part of it.
| fredgrott wrote:
| Simple: if they cooperate without being compelled by a
| warrant, they might hope that their own use of the underlying
| data will be overlooked and not reviewed by, say, the US
| Congress.
| yellow_lead wrote:
| > "If we reasonably believe that we can prevent someone from
| dying or from suffering serious physical harm, we may provide
| information to a government agency -- for example, in the case of
| bomb threats, school shootings, kidnappings, suicide prevention,
| and missing person cases," reads Google's TOS page on government
| requests for user information. "We still consider these requests
| in light of applicable laws and our policies."
|
| What's the penalty for police lying to Google?
| NoGravitas wrote:
| Do you really think someone would do that? Just go on the
| internet and tell lies?
| jaywalk wrote:
| Why, they would never! And if they did, surely there was a good
| reason! We have to trust the police and Google!
| wly_cdgr wrote:
| Some may be uncomfortable with a company that has access to the
| most private data of so many people being ready and willing to
| cooperate with the police state, but voicing your support for
| this policy is actually a great way to demonstrate your
| Googleyness during the interview process!
| alangibson wrote:
| For those of you old enough to answer: 30 years ago, did you
| foresee people spending significant amounts of money to install
| spy cameras in their homes and tracking devices in their pockets?
| LatteLazy wrote:
| Actually yes. 30 years ago I was 8 and the first home CCTV
| setups were just coming out. People loved that shit. But then
| I'm a Brit and we're all nosey neighbours...
| sterlind wrote:
| weren't the Brits using CCTV to catch kids skipping school a
| while back?
| LatteLazy wrote:
| One of the reasons we have school uniforms is to help
| police identify kids. That's why they're all weird colours
| with insignia etc.
|
| We have facial recognition stations too.
|
| :(
| humanistbot wrote:
| 30 years ago (which was the early 1990s, eek!), analog CCTV was
| all the rage. Costs for cameras and VCR units plummeted, tapes
| became cheap commodities, and multiplexing video became
| trivial. With time-lapse and multiplexing, you only needed one
| VCR for four cameras, which could record for 24-48h on a single
| tape. It became cheap enough for businesses, hotels, apartment
| complexes, and the like to install them everywhere. Private
| CCTV became a staple of police procedural shows like Law and
| Order (1st season was 1990), which made people expect to be
| able to "check the tapes" after an alleged crime.
|
| Edit: This same issue was also a problem in the 1990s with
| private CCTV! If a police officer or detective tells a business
| owner that a crime has been committed and there might be
| evidence on the tape, the owner doesn't have to show the police
| the tape without a warrant. But they usually did, because it
| looks suspicious if they don't.
| hedora wrote:
| Here's a quote from Max Headroom (of Pepsi commercial fame, for
| people old enough to remember Pepsi), from 1987:
|
| Edison Carter: Security Systems has its tendrils into every
| element of our society - the government, our homes, the police,
| the courts - I'm not gonna spike this story just because it
| deals with dollar amounts beyond your comprehension! It's too
| important!
|
| Murray: ...cerebral...
|
| Theora Jones: Murray, we're trying to play this takeover as a
| threat to our average viewer. Nobody knows who's doing it. I
| mean, we all deal with SS every day - what if some really
| dangerous people got control of it?
|
| Murray: Who do you think controls it now?
| 0xbadc0de5 wrote:
| Your daily reminder - if you don't host the data, someone else
| does. And their interests may or may not align with yours. And
| even if their interests align with yours today, that's no
| guarantee they will tomorrow. If you don't want audio, video, etc
| potentially shared with authorities, don't install cloud-enabled
| audio/video devices in your home.
| MBCook wrote:
| Apple stores HomeKit video in their cloud.
|
| It's worthless because it's end-to-end encrypted. They can hand
| over the data but no one can view it.
|
| There are safe ways of using the cloud.
| O__________O wrote:
| >> " During this process, the HomeKit data is encrypted using
| keys derived from the user's HomeKit identity and a random
| nonce and is handled as an opaque binary large object, or
| blob."
|
| If you don't control the keys, to me, that's not end-to-end
| encrypted.
|
| Source: https://support.apple.com/guide/security/data-
| security-sec49...
| bushbaba wrote:
| You do control the keys though, "Because it's encrypted
| using keys that are available only on the user's iOS,
| iPadOS, and macOS devices"
|
| "The authentication is based on Ed25519 public keys that
| are exchanged between the devices when a user is added to a
| home. After a new user is added to a home, all further
| communication is authenticated and encrypted using Station-
| to-Station protocol and per-session keys"
|
| "The user who initially created the home in HomeKit or
| another user with editing permissions can add new users.
| The owner's device configures the accessories with the
| public key of the new user so that the accessory can
| authenticate and accept commands from the new user. When a
| user with editing permissions adds a new user, the process
| is delegated to a home hub to complete the operation. "
|
| https://support.apple.com/guide/security/data-security-
| sec49...
| O__________O wrote:
| By control, I mean, create, replace, destroy, etc -- I
| would never create keys based on "identity and a random
| nonce" selected by a third-party.
|
| Also, since you brought it up, it appears an Ed25519
| vulnerability has been reported:
|
| https://www.google.com/search?q=Ed25519+exploit
|
| Also, are these HomeKit "keys" in iCloud backups
| unencrypted? Meaning that the HomeKit data is encrypted,
| but the keys are not; to be clear, not saying they are,
| asking if they are.
| AndrewUnmuted wrote:
| If the backup password to these encrypted files is known, it
| can be rather trivial to access the data within.
|
| Recently, a certain head of state's son had 100s of GB of
| iCloud backups thrown onto a torrent, and within a day rogue
| manchildren living in their parents' basements cracked most,
| if not all of it open.
|
| With the backup password in hand, all one needs is this
| README.md file [0] to be off to the races.
|
| [0]
| https://github.com/avibrazil/iOSbackup/blob/master/README.md
| O__________O wrote:
| Some parts of iCloud are encrypted, some are not.
|
| Please stop posting this same topic in your comments.
| multjoy wrote:
| >If the backup password to these encrypted files is known,
| it can be rather trivial to access the data within.
|
| That's how encryption works...
| fezfight wrote:
| Can you verify that or are you just taking their word?
| Ajedi32 wrote:
| The client side software can easily verify that. Whether
| you can trust the software running on your device is a
| somewhat different question that has nothing to do with
| whether or not you're storing things in "the cloud" (though
| it is still a valid concern).
| dymk wrote:
| I like my Eufy doorbell and cameras, which store recordings on
| an appliance that I keep in my office.
| MichaelCollins wrote:
| "Cloud" means somebody else's computer. This is pithy, people
| here are sick of hearing it, _but it's still true._
| kornhole wrote:
| It could be your private cloud therefore your computer. <5%
| of people probably have their own server. When that % changes
| significantly, we may need to stop saying this.
| rolph wrote:
| we could do that back in 1992
| shadowgovt wrote:
| It's going to be an "if," not a "when." The percentage of
| users who want to manage their own cloud is vanishingly
| small. If people are sensitive to the risk of handing over
| their data to a trusted (well, trusted enough) corporation
| with a reputation to lose and money on the line, how safe
| should they feel putting their data on a cloud they manage,
| essentially stacking themselves up against every Joe Random
| Hacker on the Internet without the benefit of a Google,
| Microsoft, or Amazon SRE team to keep the shields up and
| the lights on 24/7?
|
| The risk surface for self-hosting is higher, on multiple
| axes, than cloud-hosting. Hosting your own cloud is the "I
| don't trust auto mechanics so I'm going to become an auto
| mechanic" approach and most people have neither the time
| nor the talent.
| 0xbadc0de5 wrote:
| I think this is missing the forest for the trees. There
| are plenty of products that make self-hosting seamless
| and transparent. Take the Ubiquiti Protect series of home
| video cameras as just one example of self-hosted plug-
| and-play.
| dafty4 wrote:
| Self-hosted VPN for at least a narrower risk surface, in
| theory?
| kornhole wrote:
| It is a 'when' not 'if' since it is becoming easier and
| easier to self-host, and more people are realizing the
| benefits. I did it this year and convinced many others to
| do the same or use mine.
|
| Hackers go after value targets which are companies or
| organizations holding a lot of valuable data. My little
| home server with social media, XMPP, Home Assistant and
| Nextcloud for pictures and such is not something they can
| do anything with. Good enough security is built into most
| self-hosting platforms.
| shadowgovt wrote:
| Hackers go after two kinds of targets: centralized high-
| value targets and distributed targets with common failure
| modes, via scripting. There's definitely risk in
| centralization, but there's risk in distribution as well:
| known exploits take forever to patch out of the
| distributed ecosystem. There's a reason Microsoft became
| so aggressive about patching Windows: without the
| aggression, people didn't put the effort in and internet-
| connected desktops became weaponized.
|
| I predict a correlation between small self-hosting
| projects and more entries on shodan.io. Your small home
| server is probably secure enough... Probably. But you're
| the sort of person that posts on a site called "hacker
| news..." How much should we trust the average soul to do
| the bare minimum to not get owned? Do we imagine they're
| checking in regularly on
| https://www.cvedetails.com/vulnerability-
| list/vendor_id-1723... ?
| kornhole wrote:
| Keeping things patched to latest was difficult in the
| past, but it is much easier these days. Unattended
| upgrades run every day on my server, and I get an email
| from my server that tells me any issues and if packages
| or applications have updates available. I open the web UI
| on my phone and make a couple clicks to update them all.
| The community at Yunohost is one of a few maintainers who
| have really simplified things.
| trasz wrote:
| >handing over their data to a trusted (well, trusted
| enough) corporation with a reputation to lose and money
| on the line
|
| Not sure how anyone who understands how Internet works
| could claim Google is in any way trusted, or trustworthy.
| Or has any reputation it could lose by a mere data leak.
| shadowgovt wrote:
| The last major data breach Google suffered was in 2009,
| via a targeted Chinese espionage program involving
| personnel intrusion.
|
| Contrast that with, well, _gestures widely to most of the
| Internet._ While Google's security model is "We secure
| your data," so the trust interface between the end-user
| and Google is significant, Google for their part does an
| excellent job holding tightly to that data. If anything,
| the biggest risk working with Google is losing access to
| your own data because they stop believing you are _you,_
| not some third party getting your data out of Google's
| clutches.
| trasz wrote:
| The last major data breach Google "suffered" is
| continuing as we speak, as mandated by US national
| security laws. "Most of the internet" looks better in
| that regard.
| shadowgovt wrote:
| US national security law is perfectly capable of applying
| the same subpoena pressure and sealed requests for data
| to any provider operating in the United States that it is
| to Google. If your argument is "Google is bad because it
| complies with the laws of the countries it operates in,"
| I have bad news about enforcement authority and
| governments in general.
| ericmcer wrote:
| I would guess less than 5% of people even know what cloud
| means. Even among devs it's considered pretty hardcore to
| host things on your own hardware.
| kkielhofner wrote:
| I'm building an early stage startup where the application
| requirements are constant GPU usage, 30TB of VERY fast
| NVMe storage, 512GB of RAM, LOTS of bandwidth transfer,
| and roughly 130TB of local storage (dataset for
| training). There is no combination of abstracted API-
| driven cloud services that can be pieced together to do
| what we need to do.
|
| I didn't want to bother dealing with the advanced
| calculator but the AWS EC2 instance to handle this
| bespoke application would be $130k/yr with one year
| upfront and 12 month reservation:
|
| https://calculator.aws/#/estimate?id=b3de412ba6a7748e9acd
| 782...
|
| This doesn't include bandwidth (which AWS has an insane
| markup on) or nearly enough storage.
|
| Needless to say you can lease the hardware from $VENDOR
| and host it for years with an all you can eat 10gig port.
| Oh and the cost is absolutely fixed - no worries about
| getting that shocking AWS bill we all know of.
|
| With a lease and hosting it's still an operating expense
| and with Section 179 it's actually preferable from a tax
| standpoint.
|
| With datacenter remote hands and vendor support you're
| still not dealing with hardware - you don't have to even
| step foot in the hosting facility. Ever. If a hardware
| component fails the vendor dispatches someone and it gets
| handled (same day). That said hardware failures are (in
| my experience) exceedingly rare until you get to some
| significant scale where it's a numbers game.
|
| Because EC2 is OS-up anyway, from a devops standpoint
| it's pretty much identical.
|
| Let's say you keep it for three years and upgrade. The
| old hardware is still worth something so you can
| repurpose, sell, trade, whatever.
|
| Rough math but let's call this approach 1/4 the cost (but
| likely much less).
|
| At AWS markup you can hire a dedicated full time person
| to manage just one server and still come out ahead. Want
| five of them around the world? Now you're saving a TON of
| money vs different AWS regions.
|
| Even with this when I tell people we deployed our own
| hardware they look at me like I'm crazy. We have an
| entire generation in tech from jr devs to the C suite to
| investors that are terrified of hardware and will pay
| anything to avoid it.
|
| Because of this they don't even know what hardware,
| bandwidth, etc actually costs. It's assumed the cost is
| whatever $BIGCLOUD offers it at. I know this is an
| extreme case but it probably happens more than most would
| think.
| cruano wrote:
| I guess 0.00001% is still < 5%
|
| Most people _in tech_ still use @gmail addresses; do you
| think regular people bother to have their own private cloud
| for data storage?
| apocalyptic0n3 wrote:
| There's no way it's 5%, but Gmail addresses are a pretty
| poor indicator to use. Hosting your own email is one of
| the most difficult sysadmin tasks out there because
| keeping your server off the blocklists is not trivial;
| it's becoming more and more common for blocklists to just
| block entire CIDR ranges if a single IP in it gets
| blocked. I don't fault anyone for not going down that
| rabbit hole.
|
| Granted, they could be using Proton or some other
| alternative. But you still have the same inherent problem
| that someone else hosts/owns your email data.
| jlund-molfese wrote:
| Yeah, the easiest way to go is to have a bigger name like
| Apple or Migadu handle the actual mail server stuff, but
| use your own custom domain
|
| Of course that doesn't help when you've had a Gmail
| address for 15 years and can't get rid of it
| LinuxBender wrote:
| I think server/app deployments would need to be just a
| little more happy-clicky and lower friction and less risk
| before more non techies would venture into that space.
| All the "best practices" a sysadmin performs would have
| to be baked into it or at least be checkboxes that enable
| them _at a cost_ with simple explanations of each
| function's cost/benefit in video format. Some VPS
| providers are slowly going in this direction but have
| quite a ways to go in my opinion. They have lowered the
| bar for new techies at least.
| HaZeust wrote:
| "Someone else's computer" would be misinformation in any
| other medium SOLELY because it makes it sound like it's being
| stored on the same kind of computer that an average person
| has - that another average person is using. There was a smear
| campaign in 2013 for cloud computing under this very message.
| sandworm101 wrote:
| Incorrect title. It's not Nest; it's Google allowing access to
| their device. Google is the one handing over data without a
| warrant.
| throw7 wrote:
| How easy would it be for google to, you know, program an
| interface that allows the police request to be shown to the owner
| and have them say yes or no to the video wanted?
| qwertox wrote:
| This just left me with my head shaking in disappointment after
| just reading the title.
|
| I'm done with all these devices, yet I know that my phone will
| still have this ability even if I've disabled the hotword
| functionality.
|
| Luckily for us, there are still single-board computers with
| cameras attached to them, which can do the same job without
| the cloud.
| jesuspiece wrote:
| Was never your data to begin with
| smrtinsert wrote:
| The market seems to be wide open for a range of home devices that
| don't violate your privacy.
| SN76477 wrote:
| Police should not be able to ask without a warrant.
|
| down with surveillance capitalism!!
| lettergram wrote:
| I can't seem to find the direct link, but I remember when the
| state of Illinois created a program to give these out for free
|
| Recall at the time I mentioned to friends... I bet police get
| direct access and within 5 years, here we are lol
|
| Many utility companies still do https://themoneyninja.com/free-
| google-nest-thermostat/
| tdub311 wrote:
| Do the thermostats have cameras in them?
| elil17 wrote:
| They have occupancy sensors (not cameras), but those can
| still be used for surveillance.
| entropie wrote:
| I am not sure. That means that it has to log every triggered
| sensor.
|
| I only have nest protect (stove heating) with occupancy
| sensor/pathlight, and I really don't think it will record
| motion events or submit them later (while connected).
| Could be wrong.
| elil17 wrote:
| For nest thermostats, they advertise that their cloud
| based algorithm uses your phone location and occupancy
| sensor data to switch between "home" and "away" set
| points, so they have to be sending the sensor readings to
| the cloud. From there I can't understand why they
| wouldn't log those.
| hedora wrote:
| The occupancy sensors include wifi and bluetooth.
| [deleted]
| core-utility wrote:
| I have no direct experience, but every time I see Illinois
| mentioned on commentary (HN, reddit, etc.) it's always
| surrounding corruption.
| solardev wrote:
| FWIW, Illinois sometimes does crazy things the other way too.
| They banned facial recognition altogether, so Nest (and other
| cameras) can't offer the "familiar faces" feature, like to
| not alert you when a spouse comes home instead of a stranger.
|
| Facebook also paid me several hundred dollars because they
| were auto tagging faces for Illinois users.
| ocdtrekkie wrote:
| Illinois actually has some of the strongest privacy
| legislation in the country. I got a $350 check from Facebook
| for running facial recognition on my photo without
| permission, and Google owes me a similar amount when it's
| permission, and Google owes me a similar amount when its
|
| Check out the Illinois BIPA, it's the reason tech companies
| are trying to get federal privacy legislation that preempts
| state law.
| 999900000999 wrote:
| You haven't seen my comments praising Chicago?
|
| The friendliest city I've ever lived in, amazing public transit,
| and where I met my first real girlfriend.
| sys_64738 wrote:
| It should never be possible for a cloud provider to decrypt data
| from your devices. It should be encrypted by a public key you
| load on the device and only decrypted by your private key. The
| cloud provider should be a dumb provider of the service and
| get in the way. As soon as the cloud provider gives your
| decrypted data away to random people then that is a massive
| security hole.
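|
| A minimal sketch of that flow in Python with the cryptography
| package (RSA plus a per-clip Fernet key; the key sizes and
| names are illustrative, not any vendor's actual scheme):
|
|   from cryptography.fernet import Fernet
|   from cryptography.hazmat.primitives import hashes
|   from cryptography.hazmat.primitives.asymmetric import (
|       padding, rsa)
|
|   oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
|                       algorithm=hashes.SHA256(), label=None)
|
|   # generated once by the owner; only the public half is
|   # ever loaded onto the camera
|   owner_priv = rsa.generate_private_key(public_exponent=65537,
|                                         key_size=2048)
|   owner_pub = owner_priv.public_key()
|
|   def device_encrypt(clip: bytes):
|       # on the camera: wrap a fresh per-clip key with the
|       # owner's public key; the cloud stores opaque blobs
|       data_key = Fernet.generate_key()
|       return (owner_pub.encrypt(data_key, oaep),
|               Fernet(data_key).encrypt(clip))
|
|   def owner_decrypt(wrapped: bytes, ciphertext: bytes) -> bytes:
|       # on the owner's device, the only place the private
|       # key exists
|       data_key = owner_priv.decrypt(wrapped, oaep)
|       return Fernet(data_key).decrypt(ciphertext)
|
|   blob = device_encrypt(b"...video bytes...")
|   assert owner_decrypt(*blob) == b"...video bytes..."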
| hackererror404 wrote:
| I would happily pay more for a company that will refuse to do
| this. This will be a strong market opportunity, at least for this
| buyer.
___________________________________________________________________
(page generated 2022-07-27 23:01 UTC)