[HN Gopher] Amazon-powered AI cameras used to detect emotions of...
___________________________________________________________________
Amazon-powered AI cameras used to detect emotions of unwitting
train passengers
Author : amunozo
Score : 61 points
Date : 2024-06-17 19:14 UTC (3 hours ago)
(HTM) web link (www.wired.com)
(TXT) w3m dump (www.wired.com)
| surfingdino wrote:
| I wonder if these systems are in operation at Canary Wharf too,
| because you are not allowed to wear face masks there already.
| orthecreedence wrote:
| "Megacorp dupes government into buying its spy hardware to expand
| surveillance state because of bikes or something."
| bradjohnson wrote:
| >It was established that, whilst analytics could not
| confidently detect a theft, they could detect a person with
| a bike
|
| Really groundbreaking stuff. I'm so happy public funds are
| being used to pump surveillance data to a mega-corporation so
| they can tell us who does or does not have a bike.
| gnarlouse wrote:
| Ah yes, bike, no bike
| red_admiral wrote:
| It seems like there actually was some value to this part:
|
| > allowed police to speed up investigations into bike thefts
| by being able to pinpoint bikes in the footage
|
| Normal procedure as far as I know is you keep recordings for
| 48h or whatever, and if someone reports their bike as stolen
| a human reviews the footage to see if they can find anything.
| If you can use ML or something to tag the snippets in the
| last 24h that contain bikes being moved, that's a more
| efficient use of taxpayer-funded police officers' time?
| bradjohnson wrote:
| It's also something that somebody with little to no
| previous knowledge of machine learning could whip up with
| OpenCV in a weekend. Maybe they could hire an undergrad to
| do it as a student project instead of giving Amazon
| unfettered access to their surveillance data?
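|
| To give a sense of scale, here is a minimal sketch of that
| weekend project: sample frames from CCTV footage and tag the
| ones containing a bicycle with an off-the-shelf COCO-pretrained
| detector. The model choice, sampling stride, and score
| threshold are all invented for illustration, and this is
| obviously not what the deployed system runs.
|
|     import cv2
|     import torch
|     from torchvision.models.detection import (
|         fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights)
|
|     BICYCLE_CLASS_ID = 2  # 'bicycle' in the COCO label map
|
|     weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
|     model = fasterrcnn_resnet50_fpn(weights=weights).eval()
|     preprocess = weights.transforms()
|
|     def frames_with_bikes(video_path, stride=25, score_thresh=0.6):
|         """Yield timestamps (seconds) of sampled frames with a bike."""
|         cap = cv2.VideoCapture(video_path)
|         fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
|         idx = 0
|         while True:
|             ok, frame = cap.read()
|             if not ok:
|                 break
|             if idx % stride == 0:
|                 rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
|                 tensor = preprocess(torch.from_numpy(rgb).permute(2, 0, 1))
|                 with torch.no_grad():
|                     out = model([tensor])[0]
|                 hit = (out["labels"] == BICYCLE_CLASS_ID) & \
|                       (out["scores"] > score_thresh)
|                 if hit.any():
|                     yield idx / fps
|             idx += 1
|         cap.release()
|
| A reviewer could then jump straight to the tagged timestamps
| instead of scrubbing through 24 hours of footage.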
| Am4TIfIsER0ppos wrote:
| The government doesn't need to be duped into expanding
| surveillance. They are desperate for more of it.
| orthecreedence wrote:
| Right, the duping comes from getting them to install adtech
| instead of just regular old cameras.
| isaacremuant wrote:
| Extremely expensive yet unreliable trains... subsidized.
| Extremely high taxes... And this is what it's used for:
| squeezing more out of people and further abusing their
| identities, while there is very little transparency on the
| government dealings that actually matter.
|
| Bonus points for trying to "meddle" in Macedonia CIA style.
|
| But hey, doing useful things like closing all those loopholes
| that allow elites to use tax havens while operating in London or
| punishing corruption is not really on the list.
| throwaway48476 wrote:
| Why is it even important to know the emotional state of
| passengers? Assuming it's even accurate.
| mattlondon wrote:
| Article covers that: upsells in retail and advertising
| Casteil wrote:
| Sentiment analysis can be pretty valuable in
| marketing/propaganda.
| astromaniak wrote:
| That's only the official part. Once you have the cameras you
| can track a lot:
| 1. recognize people
| 2. track who is interacting with whom
| 3. track who personally pays attention to which ads on the
| screen
| 4. get clues about health and wealth, both per person and on
| average
| 5. if the cameras come with microphone arrays, 'for gunshot
| detection', you can listen selectively (plus recognition,
| remember?)
| 6. read smartphones' screens
|
| Now all of this is in the hands of a private company. Ready
| for sale.
| bsenftner wrote:
| This is really it. Private companies are paying lobbyists to
| feed lawmakers nonsense and fake facts selling imaginary
| public-safety value, while a huge amount of marketing value
| will be reaped by the private company that ends up holding
| this public data.
| ThinkBeat wrote:
| Super duper use case: Figure out what makes travelers the
| angriest. Make improvements. Measure how effective they were.
|
| Less duper use case: Figure out the threshold at which
| incompetence, delays, infrastructure mismanagement, and
| overcrowding are likely to cause a riot, and try to avoid
| riots.
| ryandrake wrote:
| > Less duper use case: Figure out the threshold at which
| incompetence, delays, infrastructure mismanagement, and
| overcrowding are likely to cause a riot, and try to avoid
| riots.
|
| This is what I was thinking. It's a product marketer's
| optimization dream: find out the worst possible product you
| can make that customers will still (barely) buy. If you can
| measure quantitatively how disgusted a customer is, and if
| you can quantify the level of disgust at which they choose
| not to buy, then you can find the exact cost floor for your
| product.
| s1gsegv wrote:
| The incentive systems need to be aligned for us to start
| seeing these super duper use cases.
| w4der wrote:
| Another thing I figure they might be trying is to anticipate
| suicides, so they can stop them before they happen. But it's
| probably just to sell you a grimace shake or whatever.
| jacobgorm wrote:
| It is not important, but "sentiment" was a pretty standard
| feature of early vision AI systems, so I guess that data was
| collected in the hope that it would perhaps be useful some time
| in the future. I used to sell customer journey tracking edge AI
| software, and though we never shipped sentiment in production
| it was a thing our customers would regularly ask for.
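|
| For anyone curious what such a feature typically looks like
| under the hood, here is a generic sketch using the open-source
| deepface package: detect faces in a frame and run an
| expression classifier over each one. This is illustrative
| only, not what Amazon Rekognition or our product actually
| shipped, and the labels are guesses about a face, not a
| person's actual emotional state.
|
|     import cv2
|     from deepface import DeepFace  # open-source face analysis library
|
|     def frame_sentiment(frame_bgr):
|         """Return (dominant_emotion, score) for each detected face."""
|         results = DeepFace.analyze(
|             frame_bgr,
|             actions=["emotion"],
|             enforce_detection=False,  # don't raise if no face is found
|         )
|         if isinstance(results, dict):  # older deepface versions return a dict
|             results = [results]
|         return [(r["dominant_emotion"], r["emotion"][r["dominant_emotion"]])
|                 for r in results]
|
|     frame = cv2.imread("platform_snapshot.jpg")  # hypothetical image
|     print(frame_sentiment(frame))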
| bsenftner wrote:
| Such an ignorant waste of taxpayers' money. A facial
| expression is not an emotion. Can we repeat the elementary
| school fact: one's facial expression is not their emotional
| state. People have memories that cause facial expressions,
| people play out scenarios in their imagination that cause
| facial expressions, and ordinary everyday body pain and plain
| old fart suppression in public cause facial expressions. Are
| these ordinary everyday human behaviors going to require
| explaining to the authorities, or will people just start
| saying "no" to absolute nonsense? Seriously. This
| "technology" is absolute nonsense, absolute taxpayer theft.
| tzs wrote:
| That's probably true for individuals, but what about for
| crowds?
|
| If, say, most people in a crowd are angry, some might have
| non-angry facial expressions because they are thinking about
| a non-angry memory or are busy trying to suppress a fart, but
| wouldn't there still likely be a much higher proportion of
| angry facial expressions than in a crowd with a normal number
| of angry people?
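|
| In other words, it's a base-rate comparison. A toy version of
| the aggregation (the baseline rate and counts are invented; it
| assumes you already have noisy per-face "angry"/"not angry"
| labels from some classifier):
|
|     from scipy.stats import binomtest
|
|     BASELINE_ANGRY_RATE = 0.05  # invented: share of "angry" faces normally
|
|     def crowd_looks_angry(angry_faces, total_faces, alpha=0.01):
|         """Is the share of angry-labelled faces well above baseline?
|         Individual labels can be wrong; the claim is only that the
|         proportion shifts when many people are actually angry."""
|         result = binomtest(angry_faces, total_faces,
|                            BASELINE_ANGRY_RATE, alternative="greater")
|         return result.pvalue < alpha
|
|     print(crowd_looks_angry(angry_faces=40, total_faces=200))  # True: 20% >> 5%
|     print(crowd_looks_angry(angry_faces=12, total_faces=200))  # False: 6% is noise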
| beardyw wrote:
| For those who are uncertain, I think the term "trespasser" is
| a euphemism for suicides. At a terminus like Waterloo the
| trains approach pretty slowly to stop just short of the
| buffers, so there is much less risk.
| red_admiral wrote:
| It sometimes is, but it also includes cable thieves, people
| trying to collect a ball they've lost (as the article
| mentions), anti-social behavior from the kind of people who
| think it's fun to set a signalling cabinet on fire, and
| potential terrorists/saboteurs.
| jerlam wrote:
| Do these stations not have platform screen doors? Those seem
| like the most obvious deterrent to suicides, as well as lost
| items and people generally getting onto the tracks.
| aliher1911 wrote:
| You have different rolling stock departing from the same
| platform. Trains have doors in different locations: some have
| them close to the center, some near the ends. There are also
| trains with slam doors - ones that open manually and swing
| outward rather than sliding sideways automatically. Welcome
| to National Rail.
| flir wrote:
| > There are also trains with slam doors - ones that open
| manually and swing outward rather than sliding sideways
| automatically.
|
| Haven't seen one at Waterloo for decades. Could be wrong
| of course. I miss them.
| lmm wrote:
| No. Waterloo is a huge station used by a huge variety of
| trains with doors in different places; shifting to platform
| screen doors would be the work of decades, if it's even
| possible at all.
| 01HNNWZ0MV43FF wrote:
| Ironically mass surveillance is the only thing I might
| eventually kill myself over
| astromaniak wrote:
| > "use a range of advanced technologies across our stations to
| protect passengers"
|
| Total surveillance. Of course, it's for your own good. Do
| they guarantee that the image is deleted after 'smile
| detection'? Or do they keep the images 'only for testing'?
| baxtr wrote:
| You don't seem to love our Big Brother, do you?
| walterbell wrote:
| With falling costs for NPUs in SBCs and AI PCs, there will be
| ubiquitous human-activity-sensing cameras and WiFi radios in
| buildings and desks.
|
| https://www.pcmag.com/news/how-qualcomms-always-on-camera-be...
|
| _> The "Always-On Camera" option that debuted with last year's
| Snapdragon 8 Gen 1 is now the "Always-Sensing Camera." This
| feature keeps a phone's front camera on, in a low-power and low-
| resolution mode and available only to the chipset's Qualcomm
| Sensing Hub secure enclave, to do basic checks of presence. For
| example, blanking the display if a second face appears in the
| background in a possible case of shoulder surfing._
|
| https://www.pcgamer.com/intel-wifi-sensing-raptor-lake/
|
| _> 13th Gen Raptor Lake processors. The technology is called Wi-
| Fi Proximity Sensing (or just Wi-Fi Sensing), and since I'm over
| at Intel's lab in Haifa, Israel, I've been able to give it a go
| for myself. It really is as simple as it sounds. You just walk up
| to the PC and it will wake from sleep to your desktop or lock
| screen. Then, if you walk away, it'll go back to sleep again 30
| seconds after you abandon it._
|
| Upcoming activity detections include breathing rate, which can be
| a proxy for emotion. Who will own/secure the data?
| sreejithr wrote:
| Amazon? Oh hell no! No way. I'll email all my personal data to
| NSA and anyone else who's interested. But not to Amazon.
| imchillyb wrote:
| Amazon already has your data. The data that matters to them
| anyway.
|
| Companies purchase bulk customer data. Your data and my data
| will be included in these purchases.
|
| Anything else is correlative and immediate, ensuring that
| your data and my data stay current.
|
| To pretend otherwise is naive or agenda-serving.
| janalsncm wrote:
| > some say the technology should be banned due to the difficulty
| of working out how someone may be feeling from audio or video
|
| I see this red herring all the time in reporting. The problem
| isn't that it's unreliable. That will probably change in the
| future. The problem is, I don't want to be spied on. Period. The
| fact that it's inaccurate only adds insult to injury.
|
| Ubiquitous surveillance in Xinjiang didn't become more acceptable
| as their facial recognition tech got more accurate.
| verisimi wrote:
| https://web.archive.org/web/20240617202511/https://www.wired...
|
| Link to avoid the hundreds of essential 'partners processing
| data' via cookies.
| kjkjadksj wrote:
| All of these speculative use cases mentioned in the article
| and thread, while certainly possible using AI, were always
| possible before, and sometimes done quite well too. So why
| weren't they used in excess before? A lack of a strong
| business proposition, it's that simple. If you could get
| really clear, actionable information from these metrics,
| everyone would have been collecting them decades ago.
| Clearly, the business case is too insignificant to be
| worthwhile, but that's not going to stop companies from
| riding the AI wave and shilling junk as long as they can get
| in a room with a nontechnical corporate software buyer.
| There's always good money in selling the latest tool for
| hammering a nail you could always have hammered anyway.
| Retric wrote:
| > were always possible before
|
| It's a question of viability not possibility here.
|
| Dropping costs by 95+% dramatically changes things. So you
| can "make a strong business proposition" for stuff that was
| never going to happen with the old method. Consider the
| economics of scam phone calls when long-distance calls cost
| more than $2/minute: they still happened, but they were
| vastly more targeted and therefore orders of magnitude less
| common.
|
| Now apply that same logic to facial recognition etc and
| suddenly new types of surveillance will become commonplace.
| verisimi wrote:
| > "We take the security of the rail network extremely seriously
| and use a range of advanced technologies across our stations to
| protect passengers, our colleagues, and the railway
| infrastructure from crime and other threats,"
|
| But I think Amazon and co are the threat, and you're giving them
| exclusive access to collect huge amounts of data!
___________________________________________________________________
(page generated 2024-06-17 23:01 UTC)