[HN Gopher] Walmart, Delta, and Starbucks are using AI to monito...
___________________________________________________________________
Walmart, Delta, and Starbucks are using AI to monitor employee
messages
Author : cebert
Score : 122 points
Date : 2024-02-10 14:06 UTC (8 hours ago)
(HTM) web link (www.cnbc.com)
(TXT) w3m dump (www.cnbc.com)
| voidhorse wrote:
| Do we just call all of statistics "AI" now? I fail to see how or
| why applying basic machine learning techniques for data analysis
| is "artificial intelligence".
|
| This idea that intellection is nothing more than pattern matching
| is silly and needs to die. _Conceptual understanding_ entails
| more than just pattern matching.
| csa wrote:
| AI is one of many headline buzzwords these days that get an
| outsized number of clicks relative to the content, right along
| with Tesla, Trump, Taylor Swift and probably a few others.
| freedomben wrote:
| Relevant XKCD: https://xkcd.com/435/
|
| Sentiment analysis may ultimately just be "statistics", but
| IMHO it goes way beyond what most people consider statistics,
| which is collecting, organizing, and graphing data. Sentiment
| analysis draws conclusions from the data, assigning things like
| "level of negativity" and "level of positivity", among other
| things. It definitely feels like an artificial version of
| something that humans do, so artificial intelligence seems like
| a reasonable category to me.
| vrc wrote:
| Historically, ML is a subfield within the study of AI. So are a
| lot of even more mundane techniques. So, if the hit word to
| describe things is AI, and the shoe fits, wear it!
| electroly wrote:
| The AI part is the sentiment analysis of the chat message text
| and pictures. They're using statistics on the resulting
| aggregate data. This seems pretty reasonable to me. ML is
| roughly a gazillion times better than traditional NLP
| techniques for sentiment analysis. If you're doing sentiment
| analysis _without_ ML in 2024 then you need to stop and
| reevaluate what you're doing.
|
| If you simply wanted to complain that they're using "AI" to
| describe something that uses ML, please tell me so I can move
| along. Nothing interesting there to discuss.
| jmknoll wrote:
| Did you mean to write LLMs instead of ML? Aren't traditional
| NLP techniques a subfield of ML?
| jeffnappi wrote:
| There's overlap, but many traditional NLP techniques are
| heuristics based. Here's an example:
| https://github.com/cjhutto/vaderSentiment
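As a rough illustration of what a heuristics-based (non-ML) sentiment scorer does, here is a toy sketch in the spirit of lexicon approaches like VADER. The lexicon, weights, and negation rule are invented for this example and are far simpler than VADER's actual rules:

```python
# Toy lexicon-based sentiment scorer (illustrative only; the
# word scores and negation handling are made up for the example).
LEXICON = {"great": 2.0, "good": 1.0, "happy": 1.5,
           "bad": -1.0, "awful": -2.0, "toxic": -1.8}
NEGATORS = {"not", "never", "no"}

def sentiment(text: str) -> float:
    """Sum lexicon scores, flipping the sign after a negator."""
    score, negate = 0.0, False
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in NEGATORS:
            negate = True
            continue
        if word in LEXICON:
            score += -LEXICON[word] if negate else LEXICON[word]
        negate = False
    return score

print(sentiment("the new policy is not good"))   # -1.0
print(sentiment("great team, happy people"))     # 3.5
```

Real heuristic tools like VADER add many more rules (intensifiers, punctuation, capitalization) on top of the lexicon, which is part of why ML models that learn context tend to outperform them.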
| horacemorace wrote:
| Do you call all of LLMs "statistics"? I fail to see how anyone
| who has studied and worked with these systems could fail to see
| how they exhibit behavior of conceptual understanding.
| delfinom wrote:
| It's a mandatory buzzword for investment in 2024 since
| everything else has been milked to death already. I've seen
| companies go back and slap "AI" on things that have been out
| for 15 years.
| sambull wrote:
| If you want to close your funding round.. yes
| jairuhme wrote:
| The headline is implying that each individual is being tracked as
| if for performance reasons, but it's a bit different. And is it
| really that different than the data that companies already keep
| on individual messaging? The only thing really different from
| what I can see is that they now have a tool to try and understand
| the vast amounts of data vs. combing through message logs more
| manually.
| mistrial9 wrote:
| IIRC, one of the NeurIPS papers (or similar) was a Walmart Labs
| paper on "Purpose driven AI Guide for Employees" or something
| similar. It makes multiple mentions of psychological health in
| the course of describing an AI assistant that is hypothetically
| required for every employee of a certain stripe, and that gives
| instructions and answers questions.
|
| As a skeptic, the one-sided description of mental health, and
| the obvious parallel to an "always-on slave collar", jumped
| out. Is there a legitimate use for this kind of employment
| technology?
|
| As is so common, those most incentivized to produce and deploy
| such a thing are exactly the characters not to be trusted with
| such an invasive machine. It just screams of a need for
| regulation, to my western-culture eye.
| dpflan wrote:
| Can you locate the paper you're alluding to?
| kj4211cash wrote:
| I work in Tech at Walmart and the funny thing is that of all
| the places I've worked they do the least in terms of surveying
| employee satisfaction. If they just asked, I and many of my
| colleagues would be happy to tell them which people and
| policies are toxic and which are wonderful. There's broad
| agreement in the rank and file on much of this. There are some
| truly awful and some truly wonderful things about the way
| Walmart manages tech. When I mentioned this to my VP, the
| response was basically that Walmart top level management didn't
| want to know some things because they might be compelled to
| act. Maybe all this "AI" rigamarole is just a way for Walmart
| top level management to get information without being legally
| or morally obligated to do anything?
| sandworm101 wrote:
| Ok, exactly which communications does this AI monitor? This
| seems to be internal corporate comms (corporate email, internal
| chats), which are not subject to privacy. But the article is very
| unclear, leaving the reader to assume that they are monitoring
| social media or something else external to the organization.
|
| Internal company communication channels are open to inspection
| by the company. Whether that company wants to use AI or pay for
| agents to read every internal email is up to them. Want your
| bosses not to hear you rant about a new corporate policy? Don't
| send such comments via company email.
| brvsft wrote:
| > Depending on where you work, there's a significant chance
| that artificial intelligence is analyzing your messages on
| Slack, Microsoft Teams, Zoom and other popular apps.
|
| I don't think it's so unclear, and it sounds like you
| understand too, internal comms. I know it seems obvious, but a
| lot of people wouldn't even immediately guess. (My first guess
| before RTA was Slack or anything like Slack. I always think,
| when I use Slack, that it cannot be truly private unless I
| start manually encrypting my messages with coworkers.)
|
| I'd guess anything that the business has control over might be
| a candidate. Even easier if the platform allows for apps to be
| added on at the company level.
| sandworm101 wrote:
| I still call that unclear, as many people use Slack and Teams
| for non-work stuff, or at multiple jobs. The tone of the
| article leaves a big open door by not directly stating that
| this is about monitoring employer-controlled channels rather
| than all Slack or Teams traffic.
| ipaddr wrote:
| Tell that to any subject the media deems newsworthy.
| Celebrities are mobbed by photographers, and murderers/victims
| get the same treatment.
|
| If that's acceptable, where does this fit in your worldview?
| arrrg wrote:
| Any kind of communication surveillance that goes beyond
| monitoring for malware and data exfiltration, especially for
| monitoring performance, would be completely unthinkable at my
| (German) employer. The employee council would never ever agree
| to that.
|
| It's not at all obvious that the moral perspective you present
| as obvious actually is.
| analog31 wrote:
| I'll make sure to include at least one German colleague in
| all of my Teams chats. ;-)
| wolverine876 wrote:
| > Internal company communication channels are open to inspection
| by the company.
|
| That is the current rule (at least in some places), but we're
| conflating the issue of 'is it legal?' with 'is it right?'.
|
| > Whether that company wants to use AI or pay for agents to
| read ever internal email is up to them.
|
| IMHO this argument needs to be buried once and for all,
| especially on HN. Automation by computer transforms the power
| of otherwise manual functions. For example, government could
| always surveil people on the street, but they had to do it
| manually. With computers, using cameras and facial recognition,
| they can automatically track the entire population
| (theoretically, and later if not now). It's not the same thing
| as manual surveillance of a suspect.
|
| > Want your bosses not to hear you rant about a new corporate
| policy? Don't sent such comments via company email.
|
| All that said, I think there are legitimate questions. The mass
| surveillance enables management to stay ahead of any employee
| organizing or other political (as in office politics) moves -
| management knows before the employees themselves realize that
| lots of people agree with each other. That includes
| unionization, for example.
|
| Yes, if the company allows it then you can communicate through
| outside channels, but can you reach the whole company there?
| That is certainly a hindrance to any mass activity.
| alwa wrote:
| There's also the Goodhart-flavored line of argument here: at
| the moment, part of the sales pitch is that these internal
| conversation channels provide a meaningful view into staff's
| reactions, dynamics, and states of mind.
|
| Fire the first few folks for their AI-detected "toxicity," and
| one imagines the wiser of their "toxic" friends will find other
| avenues for their bad behavior. Use sentiment analysis to sound
| out and reward the true believers in the corporate party line,
| and suddenly everyone will be Slacking effusive praise for the
| policies of the dear leaders.
|
| The usefulness of this tech would seem to depend a lot on how
| the place is managed. The obvious worry is that the powers that
| be will smoke out dissenters, reward toadyism, and punish
| harmless individual differences. But I can also imagine a
| constructive management environment, where the "mothership"
| worries that its feedback from the line staff is getting
| distorted through layers of managerial sucking-up and
| bureaucracy, and imagines this type of analysis as some kind of
| independent finger on the pulse of the workforce. That seems
| like it might be useful for making genuinely good decisions.
|
| Knowing a lot of true stuff and using it to rule out bad
| guesses we made based on false impressions, that doesn't seem
| so bad to me.
|
| Using what you know about an underling to punish them, police
| them, dehumanize them into nothing more than a pile of scores:
| that's neither a new strategy nor one that leads to success.
|
| History seems to show that, while that kind of excess may
| enforce ideological consistency in the near term, it's
| intrinsically costly to the health of the
| [corporation|country|institution]. Over-condition on these
| signals, like any others, and you just produce a culture of
| performative compliance and doublespeak, making the signal
| useless. And that leaves you in the same anemic information
| environment you started in, the sort that tends to conceal
| problems until they become overwhelming.
| sangnoir wrote:
| > Ok, exactly which communications does this AI monitor? This
| seems to be internal corporate coms (corporate email, internal
| chats) _which are not subject to privacy_
|
| Expectation of privacy doesn't mean automation of surveillance
| is reasonable or acceptable. If you're out in public, there is
| no expectation of privacy, and I (a human) may take a picture
| of you without breaking any social contract. If someone sets up
| thousands of facial-recognition cameras in a public space you
| use, that changes the calculus by a lot, despite no change in
| the underlying lack of expectation of privacy. _Quantity_ has a
| quality of its own.
| freedomben wrote:
| As much as this turns my stomach, this is neither new nor
| unexpected, and honestly it is fairly benign compared to many
| of the things that companies do, such as precision location
| tracking throughout the building, using company-issued phones,
| and using normal security surveillance to monitor "watercooler"
| talk and things like that.
|
| We had our chance to push back against company surveillance
| well over a decade ago, and nobody except privacy advocates
| like me seemed to give a shit. Some very close friends actually
| told me (after the company, for "safety reasons", started using
| RFID tracking to follow our movements throughout the building
| via our ID cards) that I was being "paranoid" by suggesting
| that at some point it would be openly surveilling employee
| activities like bathroom breaks, watercooler conversation, etc.
|
| Please don't misunderstand, I'm not saying that it's too late,
| that we've lost, and that we get what we deserve (although such
| might be true), but I am pessimistic that we could ever reverse
| this at such a late stage. The cancer has long metastasized and
| buried itself deeply into every nook and cranny.
|
| I don't know if it will work, but this is what I think we should
| do:
|
| 1. Refuse to work at companies that do the most invasive
| surveillance. This means a lot more of your options are gonna be
| to work for yourself or for a startup.
|
| 2. Spread the word in conversations, when appropriate. Don't be
| obnoxious and shove it in people's faces, because that won't
| serve any purpose except alienating them. Seek to educate and
| inform, not to evangelize or argue (though such things do have
| their place; use discretion).
|
| 3. Shame the companies who do surveillance. I'm not saying be
| obnoxious or in-your-face about it because that won't do anything
| other than make you an asshole, but it's absolutely worth
| bringing it up in a reasonable, fact-based way when appropriate.
| I've told recruiters that I won't work for <big tech co> because
| of this.
|
| 4. Live your values and try not to do business with these
| companies, though such can be hard when many have near-monopoly
| status or a hard-to-avoid position. A full boycott is admirable,
| but you gotta live your life too. Just try to continually re-
| evaluate alternatives and vote with your $.
|
| Anticipating someone dismissing me as an "r/antiwork" person
| (yes, people on the internet and even Hacker News do routinely
| make ridiculous assumptions about people based on zero knowledge
| other than one comment they just read), I actually think ignoring
| this _helps_ the antiwork people by giving them valid points
| about the abuse. If you don't think they're right, then you
| should be very much on my side here because if nothing happens,
| they will eventually be right.
| earthwalker99 wrote:
| What could anyone do about it? Workers rights have never been
| won without a militant organized labor force. "Advocate" for
| your pet interests all you want, but you aren't accomplishing
| anything for anyone else unless you're organizing, and
| "privacy" is frankly a terrible cause to organize around and
| you seem to already realize that. Focus on wages, time off, and
| improving people's everyday working conditions, and you will
| eventually get your chance to address privacy. That's just the
| reality of the exploitative system we live under.
| freedomben wrote:
| Thanks, your comment is interesting. Just want to clarify on
| something as we might have different
| definitions/understanding.
|
| > _Workers rights have never been won without a militant
| organized labor force._
|
| Just in the past few years, many companies have started
| giving generous paternal/maternal leave policies. Which
| militant organization brought that change about? Or would you
| not consider that to be workers rights? If the latter, can
| you give me some examples of what you consider to be workers
| rights so we can be on the same page?
| earthwalker99 wrote:
| > _Just in the past few years, many companies have started
| giving generous paternal/maternal leave policies._
|
| Try passing these paternity leave policies as a law and see
| which side those companies are on then. You will quickly
| get a taste of reality. Capital loves to perform generosity
| when they retain the power to stop at a moment's notice and
| lay everybody off with no cause.
|
| The 40-hour work week was won with nothing less than years
| of backbreaking struggle and 4 innocent lives lost at the
| hands of the military assigned to break the strike.
|
| US workers have not won any universal workers rights in
| living memory. The decimation of labor unions and the CIO
| in the McCarthy era put an abrupt end to that.
|
| https://wikipedia.org/wiki/Pullman_Strike
| jsg76 wrote:
| 5. Make friends with all the sysadmins. That's the simplest way
| to bypass whatever bullshit the corporate robots dream up. Like
| in Jurassic Park, they love to believe they control everything
| that's going on in the park. In reality their control rooms are
| full of Dennis Nedrys.
| SoftTalker wrote:
| Because sys admins are going to risk their jobs and expose
| themselves to possible civil claims for damages or even
| criminal charges by intentionally circumventing company
| policy for their friends? A few might, I guess. I wouldn't,
| nor would any that I know.
|
| A better 5. is probably lobby your lawmakers for stronger
| workplace privacy laws.
| freedomben wrote:
| This has been my experience. Except maybe at small startups,
| gone are the days when a sysadmin could do something and
| nobody would know. These days things are
| heavily logged and audited, usually by software. With IaC
| and peer review, there's no way one person could do
| something without numerous others knowing.
| bongodongobob wrote:
| Lol. That is a fantasy, just like Jurassic Park.
| neilv wrote:
| I'll add one. (Not just talking about the immediate employee
| surveillance tech, but various questionable tech we develop and
| use, and slippery slopes going forward)
|
| 5. Consider the development or use of evil tech to reflect
| poorly upon the people involved. Like in an interview: "I see
| you were a developer at Dystopitools..." or "I see you were a
| manager at Megajerco..."
|
| "I wasn't in the evildoing department" _might_ be an acceptable
| explanation.
|
| "I was just following orders" seems less likely to be an
| acceptable explanation.
|
| Asking for an explanation might be reasonable, depending on the
| nature of the evildoing.
|
| One benefit I imagine of knowing that we'll have to explain
| ourselves for associating with evil tech is that _individuals
| will even consider that evil tech is a problem_. Otherwise, if
| all people talk about is total comp, tech stacks, and how we
| spend the gobs of money we get paid, it's easy to forget the
| inconvenient aspects we don't talk about, like if we're getting
| paid to do evil.
|
| Of course, sometimes people have no choice. But other times we
| rationalize, "Sure, I'm actively helping my employer make the
| world worse, but I have to put food on my family", when there
| are less-evil alternatives that also pay enough.
|
| (Precedent: I've previously heard multiple executives say they
| don't want to hire people from particular prestigious tech
| companies, because, in so many words, they assume that the
| person learned bad culture at the company, or something is
| wrong with them for going there in the first place. Adapting
| that idea to evil tech development and application, and
| softening it from denylist to being asked to explain, seems
| worth considering.)
|
| Start stigmatizing evils before a particular kind of evil is so
| ubiquitous that there are no alternatives.
| tomcam wrote:
| May I ask why privacy matters at work? Not trying to be a dick.
| But shouldn't people at work be doing only the things they're
| paid for on company time?
| kelnos wrote:
| Privacy matters wherever you are.
|
| Even at work, most people like to believe they are trusted to
| do what they've said they are going to do without needing to
| be constantly monitored and micromanaged. Anything else is
| unsettling and oppressive.
|
| I would refuse to work at a company that expected me to give
| up my privacy during working hours.
| josho wrote:
| > Privacy matters wherever you are.
|
| Sure, but it doesn't really answer the question.
|
| > Anything else is unsettling and oppressive.
|
| I understand the argument to be made for monitoring
| employees as they move throughout the work site. If there
| is a building fire, then security can validate that I've
| made it out safely and they don't need to spend resources
| trying to find me in the building. I don't find this
| oppressive. I'm not even sure what I'm giving up for having
| my physical presence monitored at a work building.
|
| I also agree that privacy matters, and I'll advocate for
| it. But, I frequently struggle to express why it matters.
| tomcam wrote:
| But can't you use your own phone? Also, how would you feel
| if you were paying employees to act contrary to your
| interests on your time?
| jurynulifcation wrote:
| Privacy matters at work because the interests of the employee
| and the employer are in tension. Do you expect an employee to
| leave every personal item out of work? Should employees
| expect to not speak of their personal lives, or to have
| artifacts of their personal life leak into the workplace? To
| say that the boundary should be absolute is hopelessly naive.
| Privacy matters because it is a leaky barrier, and the
| employer should not be capable of retaining personal
| information they can use to emotionally extort or legally
| strong-arm their employee when it's useful for them.
| tomcam wrote:
| I kind of do expect employees to leave their private matters
| at home, or at the very least to handle them during break
| time on their own phones. This does not seem at all
| oppressive to me.
| wolverine876 wrote:
| > I am pessimistic ... etc
|
| Your problem is not management, it's you (if this comment
| represents you) and your kind. The comment is lazy and just
| following an absurd fashion: wallowing in despair. Your advice
| is useless unless people organize and act. You need to do
| better; we need to do better. Though management could not ask
| for a 'better' worker, a better agent even (I don't think you
| are, but you might as well be).
|
| People facing far more difficult odds have organized and won.
| Think of all the things our ancestors have overcome to build
| the world we live in, and what do we contribute? 'It's too
| hard'? 'I'm pessimistic'? No matter what we were working on,
| I'd eject you as soon as possible.
| analog31 wrote:
| >>>> Using the anonymized data in Aware's analytics product,
| clients can see how employees of a certain age group or in a
| particular geography are responding to a new corporate policy or
| marketing campaign
|
| >>>> "It won't have names of people, to protect the privacy,"
| Schumann said. Rather, he said, clients will see that "maybe the
| workforce over the age of 40 in this part of the United States is
| seeing the changes to [a] policy very negatively because of the
| cost, but everybody else outside of that age group and location
| sees it positively because it impacts them in a different way."
|
| QFT. It's always the over-40 who are under suspicion.
|
| Disclosure: Old.
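The aggregate-only reporting Schumann describes, group-level sentiment with no employee names, can be sketched roughly like this. All group labels and scores here are invented for illustration:

```python
# Hypothetical sketch of anonymized, aggregate-only reporting:
# individual sentiment scores are grouped by age band and region,
# and only the group means are exposed. Data is made up.
from collections import defaultdict
from statistics import mean

# (age_band, region, sentiment_score) -- no employee identifiers
scores = [
    ("40+", "southeast", -0.6), ("40+", "southeast", -0.4),
    ("under_40", "southeast", 0.3), ("under_40", "west", 0.5),
]

groups = defaultdict(list)
for age_band, region, score in scores:
    groups[(age_band, region)].append(score)

# Only the per-group means are reported, never the raw rows.
report = {k: round(mean(v), 2) for k, v in groups.items()}
print(report[("40+", "southeast")])  # -0.5
```

Of course, with small groups such "anonymized" means can still deanonymize individuals, which is part of what the thread is worried about.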
| jSully24 wrote:
| >>> It's always the over-40 who are under suspicion.
|
| Because of our (over 40s) years of watching bad policy after
| bad policy destroying our work places, and speaking up.
| babyshake wrote:
| There is a reason companies love hiring young college grads.
| throwup238 wrote:
| If that's the intention, everyone who uses this software opens
| themselves up to age-discrimination lawsuits (40 years of age
| is the cutoff). This is either a really dumb way to market a
| product or it's actually meant to detect those discriminatory
| policies so employers can undo them _before_ someone decides to
| sue.
| IncreasePosts wrote:
| I think you're assuming the software would be used to punish
| people who don't like the policies...but what if it's just
| used to help accommodate groups who don't like the policies?
| kevin_b_er wrote:
| You are hoping for altruistic behavior in a highly
| capitalist environment. This is incorrect.
|
| It will not be used for anything but control and
| punishment.
| deelowe wrote:
| When your aim is constant growth, you're going to build your
| strategy around what resonates with younger age groups.
| Aurornis wrote:
| In my experience, it's always the over-40 group who are trying
| to implement these features to keep an eye on the young
| troublemakers.
|
| It's certainly not Gen Z kids in leadership positions
| implementing these AI snooping policies at these mega
| corporations.
| analog31 wrote:
| Since the edit window is closed, I'll just clarify that over-40
| is a protected class in the US.
| TeMPOraL wrote:
| > _a new corporate policy or marketing campaign_
|
| Reality: 50% of the workers never opened the e-mail. For the
| other half, average reading time was under 5 seconds. 40% of
| those who interacted with the e-mail deleted it. Two people
| "accidentally" reported it as junk mail.
| bagels wrote:
| Will it be used to find union organizers? Walmart and Starbucks
| have been in the news for suppressing labor organization.
| Erratic6576 wrote:
| And Tesla, and Apple, and Amazon
|
| "There's a class warfare and mine is winning" WB
| bruckie wrote:
| WB = Warren Buffett, for those wondering (like I was)
|
| https://www.nytimes.com/2006/11/26/business/yourmoney/26ever.
| ..
| nijave wrote:
| Yes
| costanzaDynasty wrote:
| Between this and the Gem AI revelation from last week, a bunch
| of companies are going to get clobbered by massive fines. The
| question is whether the fines will ever be worse than the
| benefit of the misdeed. Probably not.
| prdonahue wrote:
| If the "AI" support Delta forces you through (even as a Diamond
| Medallion) before you can chat with a representative is any
| indication of their competency, this monitoring is doomed.
| happytiger wrote:
| The real value of removing encryption is access for AI.
|
| If we don't have comprehensive privacy legislation, something
| like a bill of rights for privacy tied to individuals, AI will
| become a tool of totalitarianism and total information thought
| control. Full stop.
|
| You can look at it through _today's lens_ and argue this is
| just another application of machine learning, but it isn't just
| a slippery slope; it becomes a landslide as soon as you apply
| more sophisticated future digital intelligences to it.
| devsda wrote:
| Slowly, privacy is being redefined by many tech companies as
| "your data is completely private and will never be viewed by
| any human".
|
| That's corporate speak for "we will analyze your data by all
| other available means" and AI is the perfect tool for that.
| al_borland wrote:
| I'd almost rather have the human. At least humans can forget.
| none_to_remain wrote:
| Yes, the boss can see the company Slack.
| laurex wrote:
| People often underestimate the effects of losing privacy. A lot
| of comments here seem to reflect the idea that "if I'm employed
| by CompanyY, then I'm owned by them." We often spend the
| majority of our waking hours working, and we are being trained
| to believe that an authority's ownership of technical systems
| means not only that everything we do with other people via
| those systems belongs to an employer, but also that our
| inadvertent inclusion of ourselves as humans, and not simply as
| task performers, is "our own fault." As humans, our
| psychological wellbeing rests on a sense of autonomy, and these
| systems directly suppress that.
| sleight42 wrote:
| In some ways, these businesses are akin to the CCCP: they want
| to own our hearts and minds.
|
| They want our whole selves. They want every drop of what they
| can get out of us.
|
| This is so often what they mean when they say that they want
| people who are "passionate" about the work. They want us
| obsessed and fixated.
|
| So they want to more than own us.
|
| This isn't all businesses but increasingly this is becoming
| normalized.
| infamouscow wrote:
| This only happens because people allow it.
|
| If these same businesses all decided they were going to take
| your first born as collateral, I suspect a lot of CEOs and
| chairmen would be swinging from lamp posts (as they should)
| within a week.
| sleight42 wrote:
| Sure. Now.
|
| It's an Overton Window. Business executives and their big
| investors are the feudal overlords who are decreasingly
| benevolent liege lords.
|
| What is abhorrent now slowly becomes normalized. Look at the
| CEOs who, shortly after some of the more brutal layoffs,
| started opining about how everyone needs to RTO or GTFO. At
| least one of those CEOs said they needed a bit more Elon in
| them. They meant his ruthlessness with his employees.
|
| And the window moves.
| badrabbit wrote:
| I don't know what you and others are on about. Work systems
| have no expectation of privacy. Just how it is.
|
| If there was a law where certain types of communication had
| expectation of privacy, similar to how toilets at work have an
| expectation of privacy, that'd be nice. Some countries like
| Germany do have this, it's not always a good thing but if
| that's what you want then have your state's lawmakers pass that
| law.
|
| I worked in environments where hot mics and cameras were used
| to monitor what we say and do, in addition to screen recording,
| chat monitoring (Slack, Teams, etc. all have a feature),
| browser monitoring, etc., and that's just the tip of the
| iceberg; the layers of shady shit they do, because in their
| messed-up heads employee and slave are not all that different,
| are endless. Is it illegal? That's the wrong question. The
| right question is: most people obviously don't want any of this
| because it works against them, but can the government, which
| derives its power and authority from those people, represent
| them and legislate to begin with? The answer is no, it cannot.
|
| Can my employer record conversations and video using my work
| laptop in my home, at the whim of petty managers and not even
| for a legitimate inquiry? Yes, they can get away with that. Why
| is there no law to govern this? Because legislation does not
| follow the will of the people; it follows the will of
| fundraisers.
|
| Companies are people too but with bigger political
| contributions. "For the people, by the people, of the people",
| people in that context now means mostly corporate persons and
| their interest trumps the individuals'.
| xyst wrote:
| US PATRIOT Act essentially normalized domestic spying with FISA
| courts rubber stamping warrants in the name of "war on terror".
| Privacy advocates lost once those towers fell on 2001-09-11.
|
| Millennials pushed the aging fossils into the digital world
| (e-banking/mobile banking, almost everything is digitalized
| today). Social media changed the way we interact with people. The
| concept of privacy continues to erode.
|
| Gen Z and future generations are continuing the trend, often
| just giving away their information for likes/impressions/views.
| Everything has been recorded in one way or another.
| Relationships are also built on "no privacy" (couples often
| know each other's phone passcodes...). Children are raised in
| dystopian households where privacy does not exist (parents have
| access to everything). Future generations are conditioned to
| obey.
|
| Now it's bleeding from private life into work, with invasive AI
| models deployed to flag seditious activity as deemed by Karen
| in HR, or by Jeff from the C-level exec suite because he gets a
| nice kickback (via speaker fees) after being approached by an
| AI company.
|
| Next level is totalitarian, surveillance state monitoring as
| depicted in 1984. Skip the middleman (private companies vacuuming
| up all the data and selling to govt).
|
| Data. DNA. No stone unturned.
| coldtea wrote:
| > _Skip the middleman (private companies vacuuming up all the
| data and selling to govt)._
|
| Why skip it? Keep it and enlarge it. Then you get to have your
| cake (total surveillance) and eat it too (it's not the
| government doing it! we're just buying that private-sector
| data).
|
| Same way you can censor as a government and still be all about
| "free speech"! You just let the private sector do the censoring
| - with a little carrot and stick encouragement from you, they
| know what they have to do and will even jump at the chance
| without you having to explicitly ask them.
| TeMPOraL wrote:
| You're making it sound like there's some kind of plan behind
| it. There is no conspiracy. There won't be a _direct_ "skip the
| middleman" totalitarian level, because governments are too busy
| chasing their own tails and smelling the farts in a 4-year
| election cycle to consider anything of a broader scope.
| Meanwhile, private companies are chasing easy money at the
| expense of anything that's good, honest or nice, like they
| always do.
|
| We walked into this situation through decades of individual
| selfishness, aggregated.
| snypher wrote:
| This is why I believe that it will be a corporate-ocracy, as
| the corporation has a much longer lifespan than some
| political short-term influence. However corporations can also
| 'fly too close to the flame' in the political sphere. Whoever
| comes out on top will be the product of their own political
| ideology and not the red-vs-blue politics.
| williamcotton wrote:
| The concept of the public space is also eroding and it is
| overshadowed on these forums by privacy advocates.
|
| Privacy is indeed important but only up to the limit that it
| intrudes upon the public accountability necessary for a
| republic to function.
|
| We seem fine with letting governments and large corporations
| know where we live and how much we pay in taxes yet are
| hesitant to make this information public due to fear of our
| fellow citizens. That is a complete erosion of social trust and
| cohesion.
|
| Someone please put a science-fiction sheen on the death of the
| public sphere. There is more than the dominant 1984-inspired
| narratives at play!
| stupidog wrote:
| My company (healthcare) has started doing the same thing, at
| least with phone calls.
| kilolima wrote:
| Corporate workplace spying on employee communication became
| prevalent in the 90s to prevent discrimination lawsuits. This
| predates what we think of as contemporary "surveillance
| capitalism". A series of court decisions enshrined the practice
| in law, and since then the limits of proscribed speech have
| expanded.
|
| "The Unwanted Gaze" is a good read on this subject.
| <https://www.penguinrandomhouse.com/books/157389/the-unwanted...>
___________________________________________________________________
(page generated 2024-02-10 23:01 UTC)