[HN Gopher] Social media is broken - a new report offers ways to...
___________________________________________________________________
Social media is broken - a new report offers ways to fix it
Author : dsr12
Score : 97 points
Date : 2021-07-04 14:26 UTC (8 hours ago)
(HTM) web link (mitsloan.mit.edu)
(TXT) w3m dump (mitsloan.mit.edu)
| jscipione wrote:
| "There's always been this division between your right to speak
| and your right to have a megaphone that reaches hundreds of
| millions of people," she said.
|
| One is freedom of speech, the other is freedom of the press; we
| as humans have the right to both. Yet never before in history
| have we had access to the press power of the social media
| revolution. For a time, the gatekeepers took a hands-off
| approach, but that time is over.
|
| End Internet censorship now. We need platforms where the battle
| of ideas can be fought and won on a level playing field without
| the creeping hand of censorship in the name of "combating
| disinformation" getting in the way.
|
| Fact check all you want, may you win the battle of ideas. But
| talk of censorship, even from MIT, stinks of totalitarianism.
| wussboy wrote:
| I would agree with you if humans were good at fact checking and
| then changing their minds based on those facts. But humans do
| not work that way and never will. Even you yourself don't work
| that way: you won't check to see whether I'm lying, and even if
| you do, and see that I am not and that the research soundly
| supports my position, you won't change your mind.
| MilnerRoute wrote:
| First, let's explore this a little. Let's say it were possible
| to lie to a billion people -- a lie that could really and truly
| end lives. Hypothetically....would you stand back and let this
| happen?
|
| Because that's the problem with the absolutist position against
| "internet censorship." It denies, ever, the possibility of a
| harmful kind of speech which should in fact be acted against.
| People can list lots of examples. (Doxxing. Calls for violence
| against individuals. Child pornography. Dangerous medical
| misinformation. Copyrighted information.) Are we really okay
| with a billion people getting all of these things -- rather
| than one person being "censored"?
|
| And the other problem is we're not talking about a government
| with a constitution... We're talking about Instagram and
| Pinterest (and other social media companies). I've heard it
| said that they very conveniently claimed "We're adopting the
| same absolutist free speech principles of a country" mostly as
| a ruse to keep from having to invest in any kind of monitoring
| of their services.
| thegrimmest wrote:
| > _a lie that could really and truly end lives_
|
| Lies don't end lives, people do. You can't say that a lie
| "caused" someone to do something. It may have influenced
| them, but they are ultimately responsible for their actions.
| More generally, I don't think we should be in the business of
| policing second or third order effects. Responsibility
| ultimately ends with the perpetrator of an action. If someone
| lies to one (or many) people, and some of those people go on
| to murder: punish the murderers, move on with life.
|
| > _Doxxing_
|
| Not sure what's wrong with this TBH - publishing public
| information shouldn't be a crime. Some sites have anti-
| doxxing policies, some don't.
|
| > _Calls for violence against individuals_
|
| Direct threats of violence are already illegal. People that
| make them should be arrested.
|
| > _Child pornography_
|
| Already illegal - find and imprison the pornographers. We
| already do rather a good job at this.
|
| > _Dangerous medical misinformation_
|
| In addition to the obvious idea that people are responsible
| for their own worldviews, any process to distinguish
| "misinformation" from "information" requires an oracle that
| everyone can agree to trust. We have no such oracle.
|
| > _Copyrighted information._
|
| Reproducing copyrighted material is already illegal and
| pretty well enforced.
|
| > _mostly as a ruse to keep from having to invest in any kind
| of monitoring of their services._
|
| Why do you think it's OK to impose these costs onto
| businesses? Why should companies that effectively provide a
| digital bulletin board be responsible for policing their
| contents? Why not have... you know... the police be
| responsible for policing? Then you can clearly and directly
| send any complaints about their effectiveness to your local
| representatives, who will actually be empowered to do
| something about them.
| MilnerRoute wrote:
| We totally agree that many of these things are, indeed,
| illegal.
|
| But if that's the case, then at some point a society is
| also going to need to consider how it's going to address
| the distribution of those things which are illegal.
| (If something can still really be disseminated to a billion
| people -- then what was the point of even making it illegal
| in the first place?)
|
| For decades and decades all publishers have been
| responsible for the content they publish. (See libel laws,
| just for example.) So it just seems irresponsible to now
| concede the existence of vast unpatrolled online empires
| making billions of dollars while simultaneously creating
| dangerous (and illegal) situations which others will then
| need to police for them.
| thegrimmest wrote:
| > We totally agree that many of these things are, indeed,
| illegal.
|
| Good! Then we don't need any new laws or policy changes.
| We just need to empower law enforcement to do their jobs.
| As I've mentioned, they seem pretty empowered already.
|
| Publishers indeed have been responsible for content.
| Sites like Instagram or Pinterest are hardly publishers,
| however; they're more akin to the bulletin boards that
| used to be ubiquitous in public places (with Pinterest
| it's literally in the name). Anyone can post whatever
| they want without any editorial process. If someone
| posted a murder contract on such a bulletin board, would
| the board owner (say your local grocery store) be
| responsible? Hardly.
|
| Vast unpatrolled online empires where people can say
| whatever they want sounds pretty good to me actually.
| It's freedom of speech in action. Anything actually
| illegal or really dangerous is policed pretty well (say
| compared to TOR, and even there criminals are not
| immune). Otherwise just leave people be. No one "creates
| a dangerous situation" just by saying something online.
| It takes more than that.
| [deleted]
| AnimalMuppet wrote:
| > We need platforms where the battle of ideas can be fought and
| won on a level playing field without the creeping hand of
| censorship in the name of "combating disinformation" getting in
| the way.
|
| In a world where the Russians have a professional and very
| active disinformation organization, your "level playing field"
| looks more like a military conducting a massacre against a
| civilian population...
| gpsx wrote:
| There is so much emphasis on fake news but I think any story can
| be shaped by _which_ facts are discussed. Face it, you can't put
| a complete nuanced view in the sound bite world of social media.
|
| At the same time, I think people enjoy being outraged. You get an
| actual physical energy burst, and you can think how superior you
| are to the misguided idiots you are talking about.
|
| I don't think social media is tricking us into this behavior, it
| is just helping us do it better. It would be good if we could
| figure out how to discourage the behavior in the first place.
| daenz wrote:
| All of these efforts are being pushed under the guise of making
| sure people hear "the truth." The Real Truth. As long as that is
| their emphasis, it makes it very difficult to argue against any
| draconian measures people want to put in place. Because, why
| would you be against any measures that help people hear The Real
| Truth? That means you would be Anti-Truth, and nobody wants
| someone who is Anti-Truth.
|
| Are enough people big enough suckers to allow this to happen
| under these false promises? We will find out. If you do support
| measures around controlling what people see and say, at least do
| the rest of us the courtesy and don't pretend that it will bring
| about a utopia of free exchange.
| 0xbadcafebee wrote:
| Social media as it is now ("connect _everyone_ together ") is
| definitely a bad idea. Communities of people only work when
| they're close knit or share an identity, not when you have mash-
| ups of different people in different communities all randomly
| tied together and forced to address each other's opinions. You
| might as well have a barbeque where Hell's Angels, an LGBTQ
| support group, and a Catholic church study group all sit in a
| circle for no reason.
|
| Online groups only work when they have a uniform identity, an
| ingroup, a particular set of shared values. They can have other
| ideas or values, but it needs to be clear to them that they
| should only bring up that group's values. Communities can also be
| toxic and stupid, but they do keep cohesion when they're all
| aligned on a core theme.
| lemoncookiechip wrote:
| This might sound rude and doesn't add much to the conversation,
| but this issue can't be fixed unless you "fix" people.
|
| We can be eloquent, humanitarian, calm and polite, but the
| reality is that we can also be cruel, vindictive little animals
| that set out to ruin everyone's day. There is no fixing social
| media because our nature doesn't allow it, and you can fix the
| business aspect, but not how the users utilize it, unless you
| restrict them to the point we can't call it a social media
| platform anymore.
|
| TL;DR: Fixing human nature isn't going to happen. The more popular
| a platform is, the more positive and negative voices you'll have
| on the platform, and we all know that we linger and give more
| attention to the negative than to the positive.
| megabless123 wrote:
| Wonderful idea in spirit, although nearly every proposal runs
| counter to the social media companies' profit incentive.
| lanevorockz wrote:
| The first note they provide is just about burning Galileo at the
| stake. This strategy does not work and the only correct way is to
| inform the public. Education is the ONLY solution and this
| indoctrination path will only result in war.
| uniqueid wrote:
| I agree we need more education if people think Galileo was
| burnt at the stake.
| perihelions wrote:
| Indeed! That was a different Italian heliocentrist, Giordano
| Bruno. (He has an excellent statue in Rome on the Campo de
| Fiori, un-subtly facing the Vatican).
|
| https://en.wikipedia.org/wiki/Giordano_Bruno
| bopbeepboop wrote:
| Particularly because the censors are shown to lie for their
| benefit:
|
| - covering up lab leak for Fauci
|
| - dismissing Ivermectin for partisan reasons
|
| - hiding the antisemitism and general violence of BLM
|
| - hiding that BLM is a Marxist organization
|
| - covering for Cuomo and Whitmer killing the elderly with
| disastrous policies
|
| - lying about CRT, which people are rightly opposing as modern
| racism
|
| - dismissing any concern about electronic voting as irrational,
| amid the two largest cyberattacks in US history (SolarWinds +
| MS Exchange)
|
| Etc.
|
| Their whole system is already based on lies -- something
| foreign powers have been able to utilize in PSYOPs by, eg,
| highlighting attacks on Jewish neighborhoods during the riots.
|
| Lying to the public makes a weak country.
| ta8645 wrote:
| You're 100% correct and the people down voting should think
| about what happens if they lose control of the censors and
| censorship starts being run by right-wing zealots instead of
| the far left.
| fatsdomino001 wrote:
| Sad seeing the parent comment getting flagged. Censorship
| by flagging is a very real problem on HN, but from what I can
| tell it's mainly the users abusing it, not the mod team, who
| are generally well-balanced.
| ta8645 wrote:
| Yeah, I don't think of a fair moderation system as
| censorship. Groups of people should have the freedom to
| keep a discussion focused on whatever they desire.
|
| Still, it's disappointing just how easy it is for us to
| be completely blind to things that go against our own
| preferences, even when they're blindingly obvious to
| others.
| fatsdomino001 wrote:
| I agree it's so easy not to see our own biases reflected
| in our own censorship actions. We all have blind spots,
| but censoring parts of a discussion simply because you
| don't like them isn't healthy.
| bopbeepboop wrote:
| I'm actually shadowbanned, so all my comments are
| automatically flagged.
|
| My opinions are an "inherent flame war", according to HN
| mods -- even when I state them calmly and carefully cite
| sources.
|
| It's not that individuals on HN believe this individual
| comment is bad -- the leadership of HN is trying to
| silence my views to shape the community's thoughts.
|
| I continue to post anyway, to slowly spread awareness of
| how HN uses shadowbans to suppress "wrongthink".
| jessaustin wrote:
| You've identified a number of real things here, but you've
| also swallowed a few whoppers. "CRT" was originally something
| that a few scholars discussed in university settings. Now it
| is mostly a non-existent right-wing bogeyman that YouTube
| weirdos hype for clicks. No one is going to teach your little
| daughters and sons that they are inherently evil. (Although,
| isn't that the Christian doctrine of Original Sin?) However,
| some of their better teachers might teach some history that
| you didn't learn in school.
| bopbeepboop wrote:
| You're lying about CRT.
|
| - - -
|
| CRT is advocated for by major institutions, such as Amazon
| HR, Disney HR, Coca-Cola HR, and the US DoD.
|
| Eg, Coca-Cola: https://www.youtube.com/watch?v=55B3eLvH-LY
|
| Can you imagine if a major corporation told black employees
| to "be less black" or Asian employees to "be less Asian"?
|
| That's racism.
|
| - - -
|
| CRT is the basis for the recently suspended Biden-
| administration racist farm subsidy -- which was blocked in
| court because its racism violates civil rights laws.
|
| CRT-endorsing Democrats have been trying to repeal civil
| rights laws so they can engage in government racism.
|
| See WA: https://ballotpedia.org/Washington_Referendum_88,_V
| ote_on_I-...
|
| See CA: https://ballotpedia.org/California_Proposition_16,_
| Repeal_Pr...
|
| - - -
|
| CRT is racist myths in schools that millions upon millions
| of parents have witnessed for themselves.
|
| Here's a video of parents denouncing it after witnessing
| the racism first hand:
|
| https://www.youtube.com/watch?v=ZxRm8ZXaBd0
|
| - - -
|
| To address a few things you said:
|
| > Now it is mostly a non-existent right-wing bogeyman that
| YouTube weirdos hype for clicks.
|
| I think it's telling people have to lie about CRT, because
| even the proponents know it's indefensible racism.
|
| > However, some of their better teachers might teach some
| history that you didn't learn in school.
|
| Making ad hominems that I'm uneducated because you can't
| make a positive case for a racist theory like CRT speaks to
| the moral and intellectual bankruptcy of CRT proponents.
|
| CRT is just apologetics for racism.
| ta8645 wrote:
| You're being naive.
| https://www.youtube.com/watch?v=cRXNaUz5LGY
| jessaustin wrote:
| "Indicrat" is certainly one of those "YouTube weirdos"
| referenced above. The lecture audience seen in that clip
| were old white women, not young white children. If those
| women hadn't heard about racism before then they should
| have.
|
| Also, obviously the woman in the form-fitting attire was
| using intentionally provocative language. She doesn't
| really believe in demons. She was even giggling during
| part of the clip. There is nothing wrong with provocative
| language, unless one is a cancel-culture moron with
| easily hurt feelings.
| ta8645 wrote:
| You're refusing to just accept that CRT is in fact
| embraced, advocated, and shaping discourse and attitudes
| among people. This example did not spring up out of the
| blue. Your dismissal of it as a right-wing fiction is
| either naive, or dishonest.
| jessaustin wrote:
| If that were the case, CRT Chickens Little would be able
| to come up with _some_ evidence of it. Actual evidence,
| not the amusing video linked above. Which precious white
| children have run crying from elementary school?
|
| To be clear: I'm sure that (adult) white Americans have
| been informed of their racism. I'm sure that 1950s
| lynchings and other uncomfortably recent events have been
| taught in some history courses. I'm sure that racist
| aspects of various contemporary institutions have been
| discussed. I'm sure that you have been very uncomfortable
| with all of that.
|
| All of those are _good_ things. Stop hiding behind
| hypothetical traumatized children.
| thegrimmest wrote:
| So you agree that all white people are racist, and carry
| responsibility for lynchings in the 1950s? Even recent
| immigrants who have no relation to the perpetrators of
| said lynchings? We're supposed to be uncomfortable too,
| just because we bear a passing resemblance to some
| assholes? Isn't that a bit... racist?
|
| Also, so far as CRT being taught to children:
| https://nypost.com/2021/04/13/nyc-teacher-were-damaging-
| kids...
|
| Please explain to me how a young child can be an
| oppressor.
| jessaustin wrote:
| That appears to be about a parochial high school in NYC.
| It's my impression that we're supposed to be very
| concerned about "young children" in public elementary
| schools in the heartland. There's really no accounting
| for the shenanigans these weird churches will get up to.
|
| I don't consider myself responsible for lynchings that
| occurred before my birth. I feel like I have to take a
| bit of blame for e.g. Michael Brown's murder, since I've
| voted in Missouri for years. But if someone wants to
| criticize me for both of them, it won't break my heart.
| Occasionally humans face discomfort.
| thegrimmest wrote:
| > _That appears to be about a parochial high school in
| NYC_
|
| How about all of these then:
| https://www.wsj.com/articles/federal-lawsuits-say-
| antiracism...
|
| Acting like these are small isolated events and don't
| represent a wider shift in educational policy just
| doesn't seem sincere. This article identifies some of the
| very same practices of using race to determine whether
| students fall into oppressor or oppressed classes.
| Excerpts:
|
| > _She alleges that teachers and students are required to
| participate in racially segregated antiracist exercises
| and that teachers are required to teach material
| depicting white people as inherently racist oppressors_
|
| > _...a biracial high-school student who claims he
| received a failing grade in a required "Sociology of
| Change" class because he declined to complete an
| assignment that required students to identify their
| gender, racial and religious identities to determine
| whether they qualified as oppressors_
|
| This is Racism.
|
| > _I feel like I have to take a bit of blame for e.g.
| Michael Brown's murder, since I've voted in Missouri for
| years._
|
| If you feel like taking some of the blame off of Darren
| Wilson's shoulders, I can't stop you. I would however
| object to any _placing_ of blame onto other Missouri
| residents who didn't kill anybody. It seems like this
| would allude to a disagreement about whether
| responsibility for action is fundamentally collective or
| individual.
| smoldesu wrote:
| In any case, it's a total strawman in the context of this
| argument. We could waste our time trying to reconcile two
| extremist identities that we don't align with, or we
| could get back on topic and try to find a mutually
| amicable solution to this problem.
|
| And people wonder why nothing bipartisan ever gets
| done...
| ta8645 wrote:
| No, you're missing the point completely. You can't
| address something by pretending it is inconsequential or
| worse doesn't really exist (ie. gaslighting). This
| narrative that far-left extremist views are only right-
| wing paranoia is dishonest and short-circuits any
| bipartisan debate about how to proceed.
|
| The person above was even defending this particular example,
| saying it was colorful rhetoric and not serious. Would he
| have done that if it had been said about Jews or Blacks?
| Until we can have honest discussion with integrity
| instead of partisanship, nothing will ever get done.
|
| My belief is that we should take the extremists on our
| side of the debate to task. The left should do everything
| they can to defuse the far-left... call out unreasonable
| positions of the left. And the right should do the same
| against the far-right. It would go a long way to making
| the center more inhabitable and productive.
| jessaustin wrote:
| The lecturer only said that white Americans are racist.
| That is certainly true. (I.e., we certainly are racist.)
| What analogous description would you like to suggest
| "about Jews or Blacks"? You will successfully continue
| the racism that blacks face in USA as long as you can
| convince idiots that it's "extreme" to acknowledge
| racism. Unfortunately for you, that type of idiot is
| growing more rare over time.
| ta8645 wrote:
| No, she also said they were demons. A notion which you
| excused. You read her mind and "knew" that she wasn't
| being serious. You gave her the most charitable
| interpretation. Would you do the same for some crazy
| right winger, or would you call him a Nazi?
|
| But anyway, it's clear you're a full on proponent of CRT
| and fully engulfed in its deluded precepts. Yet above you
| were lying and trying to pretend it wasn't a mainstream
| or prevalent belief system. You are a dishonest person.
| And maybe that is because deep down you know you are
| spewing hateful and destructive ideas.
| jessaustin wrote:
| Wow, dude. Lots of projection here. I've never called
| anyone a "Nazi". CRT critiques the racism of the
| mainstream, so it's definitely not mainstream itself.
| That may be why some people are so afraid of it, because
| they desperately love the mainstream status quo in which
| their fear and hatred for BIPoCs is reflected in powerful
| institutions. I'm not a "proponent" of CRT; it's just
| obviously correct to anyone whose view isn't obstructed
| by his colon. Seeing a CRT person behind every tree is
| like last year when thousands of mom-and-pop convenience
| stores were supposedly burnt down by white antifas, of
| whom there are probably about 200 nationwide.
| tzs wrote:
| Generally, producing true and accurate information takes more
| effort than producing misinformation, and on the consumption
| side understanding a refutation of something generally takes
| more time than understanding the original thing.
|
| This leads to an inherent advantage for the producers of
| misinformation.
|
| Your approach worked, say, 50 years ago, when most of us had
| limited sources of information and so had time to actually see
| and understand the refutations of most attempts at
| misinformation.
|
| Nowadays, we've almost all got more sources of information
| competing for our attention than we have time to deal with. It
| is much less likely we'll see the answers to the
| misinformation, or have time to deal with them. Furthermore,
| with what we see algorithmically determined to maximize
| engagement, we are probably going to be shown more
| misinformation from the same or related sources instead of the
| refutation.
| dionian wrote:
| Ah yes we need to "rescue truth" by "policing content"
| [deleted]
| darthrupert wrote:
| Yes, we really do. Western ideals of information freedom are
| all good but they are incompatible with modern cyber warfare
| methods.
| ergot_vacation wrote:
| The solution to misinformation from without cannot be the
| tyrannical coercion toward state-sponsored misinformation
| from within. Free speech and a free society have always been
| vulnerable and difficult because bad people can come in and
| say bad things to stupid people and make bad things happen.
| This is not a new problem. But throwing out that freedom is
| not a solution. You will not like the results.
| darthrupert wrote:
| You really need to propose solutions, not just say that the
| obvious option is not going to work.
|
| Free speech has never been as vulnerable as it is now. This
| is a new situation that absolutely requires new solutions.
| dane-pgp wrote:
| > You really need to propose solutions
|
| No, you're the one suggesting we abandon "Western ideals
| of information freedom", so _you_ need to propose what
| ideals we should have instead, and what practical changes
| those new ideals will lead to.
| zarkov99 wrote:
| This seems completely wrong to me. It's just more of the same
| nonsense. Social media platforms are not to be trusted as
| arbiters of truth. That isn't their job and if they are "held
| accountable" for disinformation they will simply double down on
| censoring any speech that deviates from mainstream orthodoxy.
| What we need is not a crackdown on lies but a return of
| institutions we can trust.
| glafa wrote:
| My criticism of some of the potential solutions addressed in the
| article (number = number in the article):
|
| 1. Stop the spread of fake news: This has already been happening;
| most of the big offenders are gone (e.g. Alex Jones), but what
| should be noted is that this can (and has) led to backlash. The
| fake news I've seen always comes from tiny accounts whose
| unexpectedly viral content is usually taken and reworded from
| banned sources (Alex Jones et al.). I don't think cracking down
| on the most prolific offenders will necessarily be the fix. The
| actual report is far clearer and more thorough on this than the
| article.
|
| 3. Lack of regulation for social media companies: I feel like if
| there is global regulation (which there should be), many countries
| will just ban it (e.g. China, Turkey and Russia), which still
| leads to the same balkanization.
|
| 5. Polarizing algorithms: I don't think slowing online
| interactions will solve this. If someone wants to be racist they
| will be racist. I think this will just bring annoyance.
|
| 6. Better social media business models: They say that they worry
| that "the best, fact-checked information is available only behind
| a paywall", but that is already the case!
|
| I recommend people go to the actual report; the 25 solutions are
| on page 16.
| yawnxyz wrote:
| None of these solutions offer any advice on the business model,
| which means they're dead on arrival.
|
| For a new social media to be "fixed", the company behind it needs
| to have the incentives and business model to steer towards more
| healthy behavior. None of these cover that.
|
| We've had plenty of social media startups try to fix social, like
| Path, but ultimately they've failed bc a "fixed social media
| model" almost seems antithetical to "a good business."
|
| I'm all for Pinterest and I think it's relatively healthier than
| the rest, but they barely have a functional business...
| raffraffraff wrote:
| I feel like I'm missing something with Pinterest. My experience
| of it is that it turns up annoyingly often in search engines,
| and has hits on pinterest.co.uk, pinterest.ie, pinterest.fr,
| pinterest.com etc. It's one of the sites that prompted me to
| install a browser addon that filters out Google hits; I've
| blacklisted every one of their domains.
| olah_1 wrote:
| > ultimately they've failed bc a "fixed social media model"
| almost seems antithetical to "a good business."
|
| Local governments like cities / towns should provide federated
| apps like Matrix / Activity Pub as a service.
|
| Your tax money should pay for the maintenance of the servers.
| MilnerRoute wrote:
| I've always wanted to see that happen. When television came
| along, governments funded public broadcasting channels. Now
| that social media has come along...why couldn't there also be
| a publicly-funded social media service?
| noaheverett wrote:
| I think we're at a shifting point where people would be willing
| to pay for social media vs seeing ads and/or having their data
| sold. I recently launched Glue[1] in the hope of having a
| social network that respects user privacy, offers an ad-free
| option and selfishly scratches the itch for features I wanted.
|
| I ran my previous startup (Twitpic) completely on ads and it
| wasn't an enjoyable experience from the business side. It also
| does not (usually) align with the user's best interest. Ad
| business models require attention to feed them, which in turn
| requires social media companies to build features to get as
| many eyeballs on their app for as long as they can.
|
| I also don't think paid-only is the answer for most
| cases. Some are OK with seeing ads in exchange for not having
| to pay. I'm curious to see if there is a balance that can
| be struck with services offering both options.
|
| Brave's new search engine[2] is another example saying they
| will offer a paid option in the future. I used their beta and
| it's solid. I'd be willing to pay for it to be free of Google
| and help sustain it.
|
| [1] https://glue.im/glue/glue-a-social-network-that-respects-
| you...
|
| [2] https://brave.com/brave-search/
| cvwright wrote:
| I hope you're right about people being ready to pay. And I
| think you are. There's a huge untapped market just in the
| US, of people who have mostly or totally checked out of the
| existing platforms.
|
| Glue looks cool. How does this compare to something like
| Mastodon?
|
| I'm working on something related, called Circles [1]. It
| builds on Matrix for decentralization and E2E encryption.
| We're also in beta, hoping to launch later this month.
|
| [1] https://github.com/KombuchaPrivacy/circles-ios
| Sanzig wrote:
| I really like the idea of Circles. Are you planning an
| Android app in the near future?
| noaheverett wrote:
| Hear hear! I deleted my Facebook around 2007 and never joined
| Instagram; Twitter is the only traditional social media site I
| use, so I can attest to those checking out (or never
| joining). Even with Twitter, I have to make an effort to
| not "doom" scroll before bed.
|
| Circles looks cool, as decentralization and E2E are fascinating
| to me. Glue doesn't have a whole lot in common with
| Mastodon, but it has similar microblogging features, I believe.
|
| Hit me up sometime, would like to hear more about your
| project. - noah@ark.fm
| cvwright wrote:
| > Even with Twitter, I have to make an effort to not
| "doom" scroll before bed.
|
| Yeah, like the man said, the greatest minds of our
| generation have been focused on getting people to look at
| ads. They're scary good at it.
| jillesvangurp wrote:
| The business models are the problem. It's far too lucrative to
| do what the big social media companies are doing, which is to
| exploit the social behavior of their users in return for hard
| advertising cash. Users are of course willing victims here but
| if you step back a little, social media is mostly a very low
| tech business of connecting people with each other via "feeds"
| of information. As long as they "engage" with it, basically you
| are printing money.
|
| There is no incentive to fix that because it isn't broken for
| the likes of Facebook. Of course for the rest of us there is a
| huge incentive. But it raises the question of how. How is it
| going to work, and how are you going to convince people to use
| it. The first part has lots of answers in the form of social
| networks without a lot of users. So, the latter part is the
| problem.
|
| As for authenticity and integrity: some notion of using
| cryptographic signatures could work. It's not particularly
| hard. News especially should only be coming from authentic
| sources. It's such a low tech solution to just sign your work
| and stake your reputation. But somehow that's not a thing.
| Instead, our social media feeds are full of crap from dubious
| sources, because clickbait works and sells clicks.
|
| That points to a solution. The likes of Facebook should start
| authenticating sources of information and start accounting for
| reputability. They are obviously incentivized to instead serve
| you clickbait. So the solution is to incentivize them
| otherwise. Hold them accountable. They help spread
| misinformation and profit from it. That could have
| consequences. When it does, Facebook will adapt.
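|
| A minimal sketch of the signing idea (my own illustration, not
| anything from the article; assuming ed25519 via Python's
| "cryptography" package, with made-up names):
|
|   from cryptography.exceptions import InvalidSignature
|   from cryptography.hazmat.primitives.asymmetric.ed25519 import (
|       Ed25519PrivateKey,
|   )
|
|   # Publisher: generate a keypair once and publish the public key.
|   publisher_key = Ed25519PrivateKey.generate()
|   public_key = publisher_key.public_key()
|
|   # Publisher: sign each story before it is distributed.
|   story = b"Headline and body of the story"
|   signature = publisher_key.sign(story)
|
|   # Platform or reader: check the story against the claimed source.
|   try:
|       public_key.verify(signature, story)
|       print("story really comes from this publisher")
|   except InvalidSignature:
|       print("story is not from the claimed source")
|
| A feed could then down-rank anything whose claimed source can't
| be verified this way.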
| blooalien wrote:
| https://keybase.io/ had a pretty neat thing goin' where you
| basically claim your online identity to the world at large
| through publicly available cryptographic signing proofs that
| anyone can easily verify through their website or desktop
| app. If sources of news or science or other important public
| information would cryptographically prove their information
| came from them in a way that was easily understood by the
| general public, then it'd be hella harder to spread
| disinformation without it bein' tied right back to its
| source or instantly dismissible due to lack of provable
| origin details.
| cvwright wrote:
| IMO there are two critical components to fixing the social
| media business model:
|
| 1. The platform/provider/server needs to work for the users,
| not for some shady third party like the advertisers. The most
| obvious way to do this is to have the users pay for the
| resources that they consume. There are alternatives, too --
| users could band together to form cooperatives, or some
| benevolent foundation could kick in a lot of funding. (This
| last one is the current model for Signal; I guess we'll see how
| well it lasts...)
|
| 2. Key to making #1 work is that all the content needs to be
| end-to-end encrypted. Otherwise the platform will be too
| tempted to _also_ start doing targeted advertising, in addition
| to charging for access.
|
| Neither of these is all that difficult. We have all the
| technology right now. Mostly we just need to get people
| together and do it.
| ipaddr wrote:
| Who do you think is going to pay for social media?
|
| Are they still going to pay when most people leave the
| service because they won't pay?
|
| What regular person would rather pay than see ads? If you
| gave people TVs, they would put up with the ads. If you gave
| people free dialup internet, they would put up with a banner
| across their browser.
|
| Paying is a non-starter.
|
| End-to-end encrypted what? Posts/videos? Who cares...
| nradov wrote:
| End to end encryption is great for messaging. I fail to see
| the point of encryption for sharing social media posts with
| friends.
| dredmorbius wrote:
| Item 6 in the article specifically addresses business models.
| Scott Galloway advocates for subscription-based models (I feel
| he's misguided). And the question was the subject of an earlier
| article on the summit, mentioned and linked within the article:
|
| https://mitsloan.mit.edu/ideas-made-to-matter/case-new-socia...
| GuB-42 wrote:
| > I'm all for Pinterest and I think it's relatively healthier
| than the rest
|
| I can't think of Pinterest as more than pollution of Google
| image search. They may be good, but the first impression is so
| bad that I don't want to go further.
| chr1 wrote:
| One possible business model is to replace most politicians
| the way Uber has replaced most taxi company owners.
| Politicians are supposed to translate the needs of the people
| who elect them to the bureaucrats who are supposed to bring that
| to reality, but they do their job very inefficiently, wasting a
| huge amount of money.
|
| If there were a combination of facebook, change.org and
| https://voteflux.org, allowing people to directly vote on the
| issues they care about, trade votes, propose laws, create local
| communities with custom laws, and in general control the way
| government spends taxes, there would be enough money saved to
| allow the companies providing this service to be richer than
| facebook without using any shady tricks.
| nradov wrote:
| Your proposal would require another violent revolution to
| implement.
| wussboy wrote:
| Allowing people to vote on issues they care about would be a
| disaster. We'd have the death penalty back in a week, and
| abortion/immigration/drugs outlawed in a month.
|
| Democracy doesn't work because the masses are smart and get
| to have their say. It works because democracies are an
| agreement that we can do anything in our power to change the
| government EXCEPT violence. Making the government more
| responsive to the population will result in worse government,
| not better.
|
| I'm not advocating for totalitarianism. I'm just cautioning
| against thinking that it's the wisdom of the people that
| makes good democracy.
| blooalien wrote:
| When the people making the decisions actually know about
| the things they're deciding on, democracy can work great.
| When the decisions are made by people who know _nothing_
| about the topic but are totally convinced they know
| _everything_ about it no matter how completely wrong they
| may be and no matter how many provable facts you face them
| with, democracy fails utterly.
| chr1 wrote:
| What voteflux proposes is not simply direct democracy where
| opinion of the people who don't care enough to not go to
| ballot box is simply discarded. Votes still can be
| delegated, by setting your vote to follow vote of the
| person you trust, changes require significant majority, not
| simply 50%+1, people can trade their votes on different
| issues to reach consensus, major changes can't be accepted
| in a week if there is significant opposition, and most
| importantly different regions will be able to choose
| different laws.
|
| Now issues like abortion/immigration/drugs are used like
| carrot by politicians to herd voters one way or the other,
| and politician elected to as a result of support of one
| issue ends up voting on lots of other issues in a way that
| absolute majority of voters do not like.
|
| Voteflux being a marketplace of votes, can allow people
| with different beliefs to find compromises with each other,
| will allow politicians to understand what people actually
| want instead of guessing based on number of angry
| letters/tweets, and will allow people to organize in a way
| to have different laws in different places allowing society
| to experiment with different things instead of fighting
| life and death battle each election.
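|
| To make the delegation part concrete, here is a hypothetical
| sketch (my own, not how voteflux actually implements it) of
| resolving delegated votes:
|
|   def resolve_vote(voter, delegations, direct_votes):
|       """Follow the delegation chain until a direct vote is found."""
|       seen = set()
|       current = voter
|       while current not in direct_votes:
|           if current in seen or current not in delegations:
|               return None  # cycle or dangling delegation: abstain
|           seen.add(current)
|           current = delegations[current]
|       return direct_votes[current]
|
|   delegations = {"alice": "bob", "carol": "bob"}
|   direct_votes = {"bob": "yes", "dave": "no"}
|   tally = {}
|   for person in ("alice", "bob", "carol", "dave"):
|       choice = resolve_vote(person, delegations, direct_votes)
|       if choice is not None:
|           tally[choice] = tally.get(choice, 0) + 1
|   print(tally)  # {'yes': 3, 'no': 1}
|
| Supermajority thresholds and vote trading would sit on top of a
| tally like this.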
| wussboy wrote:
| Just to add to my other post, I like what you're getting at.
| I think getting money out of politics would make a huge
| difference. Let's make a voluntary and auditable pledge that
| politicians could make before they are elected where they
| promise never to entertain lobbyists. Politicians could take
| the pledge to win votes in tight election races.
| blooalien wrote:
| Hahahahah! Pure freakin' _GENIUS_ this is! _Needs_ to happen.
| We'll just replace politicians outright.
|
| -------------------------------------------------------------
| -------------------
|
| So, I'm totally okay with this comment gettin' downvoted, but
| can ya at least like ... I dunno ... add to the conversation
| maybe?
|
| Perhaps a small comment on _why_ his idea _isn't_ pure
| genius? Or why I shouldn't find it amusing? Or how about why
| replacing politicians with a better system could somehow be a
| bad thing? Downvote me if you like, but downvotes without
| context don't help _me_ (or anyone else) understand anything
| you might be trying to get across.
|
| -------------------------------------------------------------
| -------------------
|
| Just for fun, I'll propose another idea; Let's start
| _drafting_ people into political office.
|
| "Congratulations {random citizen}! You've just been elected
| President!"
|
| And just like a regular wartime military draft, refusal to do
| the job to the best of your ability = jail time.
| acidburnNSA wrote:
| Maybe a foundation could fund a nonprofit to do it better? I'd
| trust that more.
|
| Signal would be a good example in the messaging space.
| agustif wrote:
| I was thinking about this the other day: a decentralized social
| network where its own users govern and moderate it sounds like
| a plausible way to make this...
|
| dApps are still in their infancy, but it might come... Not sure if
| anyone will use it though, like the rest of the blockchains,
| lol
| isbvhodnvemrwvn wrote:
| What makes you think that user-moderated content won't be as
| toxic as the algo-driven kind? Look at reddit for example - the
| vast majority of content out there is posted to cause
| outrage, and people parrot it across the rest of the site.
| trompetenaccoun wrote:
| Reddit isn't decentralized. The "moderators" play enforcers
| for the admins, to push their agenda and cleanse the site
| of any content the management doesn't want to see. Outrage
| is their business model, it leads to increased interactions
| between users (i.e. flame wars) and so they spend more time
| on the site. Any community or mod that doesn't play along
| is removed, thus in the major subs you can think of them as
| directly working for the company, although unpaid. The
| users are NOT in charge, there is a crazy amount of
| political censorship on reddit. In the end that leaves the
| echo chamber it is now. Your example demonstrates how the
| authoritarian style of managing social media suppresses
| free speech but does nothing to rein in toxicity. Facebook
| and Twitter are similar toxic echo chambers.
|
| On an actually decentralized platform the radicals would
| still be there, but so would many other voices. Where users
| can't be banned for "wrongthink", communities must convince
| their audience with arguments or fear losing supporters.
| The chilling effect would be gone. A truly decentralized
| site would have plenty of communities the individual user
| doesn't agree with - just like in real life. And just like
| in real life users can deal with it by staying away.
| belltaco wrote:
| >Any community or mod that doesn't play along is removed
|
| This makes no sense and is completely wrong.
|
| Can you give some examples?
| willhslade wrote:
| Fat people hate and the Donald both fit his narrative.
| morpheos137 wrote:
| Removing the_donald sub was ridiculous especially when
| compared to equally if not more objectionable rhetoric on
| main subs like /r/politics. Additionally if you want to
| get rival subs banned on reddit you just post and report
| objectionable material incognito. Reddit is a dumpster
| fire of juvenile, woke idiocy now and I rarely visit
| anymore. That said, some of the content on the_donald was
| just as idiotic but if you are going to allow one you
| have to allow the other.
| [deleted]
| phone8675309 wrote:
| > a decentralized social network where its own users govern
| and moderate it
|
| Were you ever on Usenet?
| agustif wrote:
| I think I recall downloading some stuff from there in the
| early 2000, yeah!
|
| Usenet was rad!
| dredmorbius wrote:
| Are you asking because you see Usenet as a solution or as
| another nonviable approach?
|
| Usenet failed to scale past the first million or so users.
| Present social media platforms number in the 100s of
| millions to billions.
| phone8675309 wrote:
| I ask because Usenet, before it became a distributed
| lossy datastore for media pirates, was "a decentralized
| social network where its own users govern and moderate
| it", and it was a complete, well-documented, shitshow.
| dredmorbius wrote:
| Fair enough. I agree on both points.
|
| I've argued for several other reasons _in addition_ for
| which Usenet died:
|
| https://old.reddit.com/r/dredmorbius/comments/3c3xyu/why_
| use...
| partomniscient wrote:
| _> a decentralized social network where its own users govern
| and moderate it sounds like a plausible way to make this..._
|
| That's what the entire internet was like in its early days.
| abecedarius wrote:
| The big potential difference is that the early
| decentralized systems like Usenet and Relay treated costs
| and spam and paying their developers as out-of-band
| concerns. Whatever else you think of the newer stuff around
| blockchains, etc., it changes the game in just this way. If
| you want an online world not under the thumbs of
| Google/Facebook/Twitter/etc., you need a thesis about how
| the same takeover won't happen.
| agustif wrote:
| I actually think the future of the internet might look like
| this but with micro-payment-transactions enabled by
| blockchain everywhere...
|
| Ads killed the web
| partomniscient wrote:
| When you consider business is ultimately about making money in
| which something has to give/lose out, and social connectivity
| is about hanging out with people you like - you realise these
| things don't go together and monetising socialisation is what
| dooms (what used to be) a good experience in the end.
| Grimm1 wrote:
| You can monetize socialization; we've done it forever, and clubs
| come to mind, for instance. No one wants to pay directly for
| social media, unlike for in-person experiences, though, so we
| have what we have.
| Const-me wrote:
| I paid for livejournal.com social media for a few years,
| then Russians bought the company and I cancelled.
| skinkestek wrote:
| > No one wants to pay directly for social media unlike in
| person experiences though so we have what we have.
|
| For the longest time I paid for MeWe, hoping they'd
| introduce pseudonyms and become a non Google Google+.
|
| I would actually have paid for Google+ too if they had kept
| it around.
| Grimm1 wrote:
| Why did you stop paying for MeWe? Just the lack of
| pseudonym introduction?
| pharke wrote:
| No one will pay to hang out in an empty room just because
| other people are there. They pay to go to clubs and bars
| because they provide entertainment, food, drinks, and
| atmosphere to go along with the social experience. If you
| created a rich enough social _media_ experience then people
| would be fine paying for it. It's not surprising people
| don't want to pay for social media as it currently exists
| since it parasitizes all of its content from the web which
| is for the most part freely available. It's like fencing
| off a corner of a public park and trying to charge
| admission to enjoy the fresh air and sunshine.
| merpnderp wrote:
| People definitely used to pay for online socialization;
| they'd just prefer free with advertisements.
| [deleted]
| partomniscient wrote:
| That's because monetisation via serving food/alcohol sales
| (in moderation) can enhance the socialisation experience,
| so it's a win-win.
|
| Because big media got hooked on advertising revenue and
| news became subsidised and we got news/information cheaply,
| we'd been collectively taught that fact generation and
| dispersion costs were negligible (although they're not; work
| and energy are required, and this was offset by the ad
| revenue).
|
| Move this to the digital space where information flows at
| close to light speed and you can accurately measure all
| sorts of marketing stuff - it just goes faster to the point
| of absurdity. Whatever value we can create via connectivity
| gets trashed by being subverted or obverted by corporate
| interests.
|
| I'm now pissed off because I pay youtube $x per month not
| to be interrupted by ads, and now the content creators have
| their own ads in their content.
|
| When you scale out, it becomes a sad picture of humanity
| that is excitedly self-reinforcing monetisation of things
| via interruption of engagement/continuity.
|
| Actual value is evaporating.
|
| No wonder we're all getting ADHD.
|
| The spaces that are mostly simple and left alone are where
| interesting things can still continually occur. That's
| precisely why HN is actually still going strong.
| Grimm1 wrote:
| I agree with you; I find size is a predictor of decline
| in quality. HN hasn't hit that threshold, but even still,
| I know that with what growth it has had, the moderation
| effort needed to keep it this way has also increased
| significantly.
| ItsMonkk wrote:
| It's Dunbar's Number.
|
| When a tribe gets too large, because of Metcalfe's law the
| tribe loses the ability to discuss things to a level of
| nuance that is required, and that causes fractures that
| lead to a tribal split.
|
| Because all of the internet communities have no natural
| way to split, the only thing left is for them to become
| toxic and eventually die.
|
| Systems attempt to keep the tribe going longer by
| keeping track of metrics, and this works for a
| while because the tribe had built up trust; but the
| bigger the tribe gets (with new members never trusting,
| forming an Eternal September), the more the metrics reign
| and, as Goodhart's law states, the more inevitable it is
| that it collapses.
|
| > ... there seems to be only one cause behind all forms
| of social misery: bigness. Oversimplified as this may
| seem, we shall find the idea more easily acceptable if we
| consider that bigness, or oversize, is really much more
| than just a social problem. It appears to be the one and
| only problem permeating all creation. Whenever something
| is wrong, something is too big. ... And if the body of a
| people becomes diseased with the fever of aggression,
| brutality, collectivism, or massive idiocy, it is not
| because it has fallen victim to bad leadership or mental
| derangement. It is because human beings, so charming as
| individuals or in small aggregations, have been welded
| into overconcentrated social units. - Leopold Kohr, 1957
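|
| As a back-of-the-envelope illustration of the Metcalfe's-law
| point above (my numbers, not the quote's): the number of
| pairwise relationships in a group of n people is n*(n-1)/2,
| which quickly outgrows what anyone can actually track.
|
|   def pairs(n):
|       # potential one-to-one relationships in a group of n people
|       return n * (n - 1) // 2
|
|   print(pairs(150))      # 11175 -- around Dunbar's number
|   print(pairs(1_500))    # 1124250
|   print(pairs(150_000))  # 11249925000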
| amelius wrote:
| That's because clubs do not interfere with the things we
| say to one another. Also they are not monopolies, so they
| can't experiment freely without risking us going to another
| club.
| polynomial wrote:
| Well, yes and no. The music in clubs is often designed to
| be so loud as to interfere in conversation, on the
| premise that the less you are able to converse the more
| you will drink.
| dejj wrote:
| > monetising socialisation
|
| Cafes and restaurants handle this well. I think it is because
| they "commoditize their complement"[1], i.e. you're allowed
| to talk while you consume. (Not too loud, though.)
|
| [1] https://www.gwern.net/Complement
| morpheos137 wrote:
| The current situation is like if 90% of bar or restaurant
| patrons went to one of three multinational chains all of
| which secretly recorded patron conversations in order to
| sell ads while giving away cheap food.
| ipaddr wrote:
| We would all go there for the free / cheap food if the
| place wasn't so crowded.
|
| How do you beat free / cheap food?
| chmod775 wrote:
| A restaurant exists explicitly to serve food first. If
| you're still there an hour after paying your bill, they'll
| start asking polite questions.
|
| A cafe, in most places, is a place for hanging out that
| makes money on the assumption that people will like
| refreshments, snacks or even small meals while doing so.
| They're typically not going to ask you to leave just
| because you haven't gotten anything in an hour or two
| (unless they're full). The exception appears to be some
| places where coffeehouses think they're a theme park ride.
| hsdjofk wrote:
| >When you consider business is ultimately about making money
| in which something has to give/lose out...
|
| This is the opposite of what I learned in Econ 101. People
| voluntarily transact because both parties stand to benefit -
| otherwise one party would choose to sit out.
| partomniscient wrote:
| That's because you're talking about value, whereas I was
| talking about money. You can also benefit timewise but lose
| monetarily and vice versa...hence 'time is money'.
|
| Money is meant to represent equivalence of value but it
| doesn't do this very precisely or well - which is why using
| it to 'measure' things causes so many problems.
|
| I get where you're coming from, but that's Econ 101
| idealism where you assume both parties are fully
| knowledgeable about everything regarding the transaction -
| the real world is not like this. People also naively
| transact simply because they've been taught to do so.
| echelon wrote:
| Regulation.
|
| I think regulation is the answer that aligns with the business
| model. All businesses are forced to align in the same way, and
| no company can benefit from breaking the rules.
|
| If you fine companies for every day foreign bot accounts use
| the platform to spread misinformation, they'll staff up on
| countermeasures.
|
| If you pass laws against amplifying fake news and spreading
| negative sentiment, there will be machine learning investments
| like we've never seen before. Comments that have low confidence
| scores won't be shared broadly.
|
| If you regulate these companies like common carriers, they'll
| respect free speech on all sides of the aisle as long as it
| isn't law breaking.
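|
| A toy sketch of what score-based gating could look like (purely
| illustrative -- the classifier itself is hand-waved here, and any
| real system would be far more involved):
|
|   def reach_multiplier(misinfo_score, threshold=0.8):
|       """Scale how widely a post gets amplified by a model's
|       misinformation score (0 = confident it's fine,
|       1 = confident it's misinformation)."""
|       if misinfo_score >= threshold:
|           return 0.0  # don't amplify; the post is not shared broadly
|       return 1.0 - misinfo_score
|
|   for score in (0.1, 0.5, 0.9):
|       print(score, reach_multiplier(score))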
| isbvhodnvemrwvn wrote:
| You assume that regulators and enforcers are free of bias, I
| have bad news for you...
| j16sdiz wrote:
| International regulations on social media would be
| interesting. I guess US and China won't agree on what is fake
| news.
| Const-me wrote:
| They don't have to agree. US-based Google seems OK serving
| customers from China under the terms of Chinese laws.
| nradov wrote:
| Most Google services are blocked in China.
| cvwright wrote:
| Federation is going to become necessary. Like with email,
| or with Matrix chat.
|
| Different jurisdictions are going to have laws so divergent
| that a single platform won't be allowed to operate globally
| anymore.
|
| So we'll have to have many different providers that can all
| talk to each other.
|
| Maybe this will be mandated by fiat. Or maybe it will just
| evolve naturally in response to incentives.
| bingidingi wrote:
| We should probably start to consider that business might be the
| problem.
| cvwright wrote:
| Yeah, what did businesses ever do for us?
|
| Besides CPUs that get faster and more efficient every year,
| phones that get smaller and last longer, global
| interconnected networks, cars that don't burn fossil fuels,
| ...
| xg15 wrote:
| > _" bc a "fixed social media model" almost seems antithetical
| to "a good business."_
|
| Or putting it the other way around: The only way to make money
| with social media is by compromising your users' mental health.
| morpheos137 wrote:
| One of the core problems with "social media" in 2021 is the
| default assumption that it should be monetised and a business.
|
| There is a difference between providing infrastructure for
| social connection and farming/harvesting/leveraging the
| information gathered from what is constructed on that
| infrastructure.
|
| The current situation is akin to if AT&T had listened in on all
| phone conversations (assuming no technical barriers) and sold
| ads based on what you said to friends and family. Before you
| made a call you would need to listen to a personalised ad on
| your receiver.
|
| Before facebook or myspace or twitter there were blogs/personal
| websites, forums, email and instant messaging. All these
| options worked for expressing one's self online and none of
| them were under a centralised, for-profit entity.
|
| Wikipedia is a good example of a decentralised social network
| although it appears to be increasingly infected with the
| reddit/twitter virus.
| jefftk wrote:
| What do you mean that Wikipedia is infected?
| morpheos137 wrote:
| Originally wikipedia did not condone advocacy or propaganda;
| now it does, as long as it can be attributed as a "fact"
| because it has been published by a "reliable source." Who
| decides what is a reliable source? The same sort of clique
| that moderates /r/worldnews or /r/politics. Reddit is an
| Orwellian cancer.
| jefftk wrote:
| Wikipedia's policies on this are
| https://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_vie... and
| https://en.wikipedia.org/wiki/Wikipedia:Verifiability
|
| These are the same policies Wikipedia has had for over a
| decade, with only relatively minor changes. Are you
| saying they're being interpreted differently? Could you
| give an example?
| LegitShady wrote:
| practice is how policy is applied, and wikipedia in
| practice is not the same as wikipedia policy by rote.
| jefftk wrote:
| Can you give an example of how you see this changing?
| morpheos137 wrote:
| For one thing all media sources should be on equal
| footing. Let the community decide what is bullshit. It is
| ridiculous that BBC is considered a reliable source but
| RT is not. Both have a specific point of view that they
| promote, reliably. Wikipedia is not supposed to have a
| point of view. If BBC is allowed but RT is banned then
| Wikipedia POV will tend toward BBC simply because a
| contrasting POV is censored. Same would hold for allowing
| an Israeli newspaper but banning an Iranian one as an
| unacceptable source. All media outlets should be on equal
| footing.
| zenzen wrote:
| RT is state propaganda. Not an accurate equivalence at
| all.
| FreeSpeech wrote:
| Just as Al Jazeera is Qatari state propaganda, BBC is
| British state propaganda. You can pull the wool over your
| eyes and disagree, but they're all state backed narrative
| machines.
|
| The BBC works directly with the UK Foreign Office to push
| propaganda to destabilise other countries:
| https://thegrayzone.com/2021/02/20/reuters-bbc-uk-
| foreign-of...
| LegitShady wrote:
| What do you think the BBC is? Or the CBC? Also state-funded
| 'journalists' who push agendas. You just like their
| agenda better.
|
| at least US intelligence agencies used to hide behind
| private news sources until they started hiring the
| intelligence people outright.
| rwj wrote:
| The CBC has a viewpoint, but it is definitely not state
| propaganda.
| xg15 wrote:
| Bellingcat and Radio Free Europe are considered reliable
| sources, while Wikileaks is not...
| FreeSpeech wrote:
| Bellingcat participated in a covert UK foreign-office
| funded program with the BBC to "weaken Russia":
| https://thegrayzone.com/2021/02/20/reuters-bbc-uk-
| foreign-of...
|
| Anyone suggesting Bellingcat is more reliable than
| Wikileaks is regurgitating establishment talking points.
| xg15 wrote:
| See here: https://en.m.wikipedia.org/wiki/Wikipedia:Relia
| ble_sources/P...
| FreeSpeech wrote:
| Exactly. Perfectly demonstrates Wikipedia's western-
| centric, left-leaning tilt.
| morpheos137 wrote:
| Bellingcat is obviously an MI6 PR outlet. To believe
| otherwise is ludicrous. Some stay-at-home dad who used to
| sell women's underwear has independent insight into
| world-class covert activities? Give me a break!
| temp8964 wrote:
| Exactly. Policy is implemented by people. Eventually
| those who spend an extraordinary amount of time online
| will take over editor/moderator positions and win edit
| wars. Most of those people are definitely not real experts
| in the field, because real-world experts wouldn't have the
| time or motivation to engage in online fights.
| Pilfer wrote:
| Larry Sanger, who co-founded Wikipedia, provides many
| examples of how Wikipedia articles are heavily biased
| towards mainstream viewpoints.
| https://larrysanger.org/2021/06/wikipedia-is-more-one-
| sided-...
|
| > _with only relatively minor changes_
|
| Relatively minor changes have a huge effect on the content
| of articles. The linked article also addresses how
| Wikipedia has _banned_ conservative sources from
| Wikipedia including Fox News, the Daily Mail, and the New
| York Post. "In short, and with few exceptions, only
| globalist, progressive mainstream sources--and sources
| friendly to globalist progressivism--are permitted."
|
| Every claim in a Wikipedia article must be accompanied
| by a source. Claims that are _only_ covered by
| conservative media and not covered at all by mainstream
| (liberal) media _cannot_ be sourced in a Wikipedia
| article. This leads to conservative viewpoints being
| removed from articles, which directly causes articles to
| become biased towards mainstream viewpoints.
| shadowgovt wrote:
| In other words, facts that can only be cited to
| unreliable sources are not allowed in Wikipedia.
|
| That checks out.
| Pilfer wrote:
| That is not at all what I or the author is claiming,
| and instead of retorting with a shallow dismissal I
| recommend you review the HN guidelines.
| CityOfThrowaway wrote:
| Blogs, personal websites, forums, email, and IM all still
| exist and are still large platforms.
|
| It's clear based on user behavior that users _like_ the
| experience that centralized, for-profit social media networks
| provide. That experience invariably is very expensive to
| deliver... Dramatically more expensive than serving
| Wikipedia, for example, and the costs grow faster as the
| number of users increases.
|
| So, in order to deliver the user experience that real people
| have demonstrated their preference for, one of the key inputs
| is exceptionally large (and increasing) amounts of money.
|
| In order to be sustainable, that money has to come from
| somewhere. Targeted ads aren't the only option, but no viable
| competitor has been discovered.
|
| The reason targeted ads are so productive here is that
| the value of the network grows non-linearly (up to a point)
| with each additional user. If you charge an entrance fee to
| each user, your network will be smaller and therefore
| outrageously less valuable than any network that does not
| charge an entrance fee.
|
| Therefore, so long as any social network is willing to make
| their network free to enter, effectively no network with a
| comparable service can charge for usage.
|
| Importantly, not only will the free social network be more
| valuable than the paid social network in relative terms, it
| will also be more valuable in absolute terms - both in terms
| of value received per user and number of users receiving
| value. So, it may be possible to regulate free social
| networks out of existence, but doing so would necessarily
| destroy a lot of user value in the process.
|
| Perhaps there is some type of patronage or donation-based
| model that could both enable the free entry price and avoid
| the pitfalls of the ads-based business model. However, an
| organization like this is unlikely to attract the technical
| talent necessary to build competitive products.
|
| Of course, like both of us mentioned, other models and
| competitive ecosystems exist. Centralized social networks
| have not killed email or instant messaging or blogs or
| websites. Those things are, by every measure, actually way
| bigger than they ever have been. So in a sense, the thing
| you're hoping for is already here, it's just not the dominant
| modality chosen by other people.
|
| There are fair questions to ask about the power that
| ownership and control of a centralized social network imbues.
| Though, I think those are orthogonal concerns to the question
| of funding mechanisms.
|
| In sum, I think the world you've expressed desire for does,
| indeed, already exist. It happens to co-exist with the world
| that you have expressed distaste for. While I am empathetic
| to your distaste for social media's influence and business
| model, I think it's important to recognize both why it exists
| and the ways in which its existence is not strictly zero
| sum.
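|
| To make the scale argument above concrete, here is a minimal,
| hypothetical sketch. It assumes network value grows roughly with
| the square of the user count (a Metcalfe-style model used purely
| for illustration, not a claim from this thread or the report):
|
|         # Illustrative only: value scales with possible user
|         # pairs; sizes and per-link value are made up.
|         def network_value(users, value_per_link=0.01):
|             return value_per_link * users * (users - 1) / 2
|
|         free_users, paid_users = 1_000_000, 50_000
|         print(network_value(free_users))  # ~5.0e9 in total
|         print(network_value(paid_users))  # ~1.25e7 in total
|         # Per user: roughly 5,000 vs 250, so under this model
|         # the free network wins in both absolute and per-user
|         # terms, which is the dynamic described above.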
| morpheos137 wrote:
| I wonder if it is more that users are addicted to Twitter
| or Facebook than that they like the services. Or if it is
| more that everybody else is on there.
|
| I am confident that if a free, simple social network had
| caught on among college users around the time Wikipedia
| started, then Facebook or Myspace never would have been
| commercially viable.
|
| Almost all users prefer no ads and privacy to ads and no
| privacy.
|
| I disagree that that experience needs to be expensive to
| deliver. Imagine a decentralised social media network
| hosted on its users' compute devices, kind of like email
| used to be. Aside from large streaming videos, not that
| much processing power or storage is required per user.
| nradov wrote:
| How would a free simple social network be sustained? Who
| would pay for the software, hardware, bandwidth, and
| content moderation? Asking for donations and volunteers
| seems unlikely to scale to the level required.
| neffy wrote:
| >It's clear based on user behavior that users like the
| experience that centralized, for-profit social media
| networks provide.
|
| I don't think this is a given, and it's not that expensive
| to deliver either - the problem is more with the de facto
| monopoly of network position. But there are lags in all
| these kinds of activities, and user behaviour today is
| conditioned on user experience yesterday; it may not
| predict future user behaviour well. Social media is
| still new; the experience of having it delivered to a
| mobile phone is barely 10 years out of the box for most of
| the world.
|
| Consider a marketing team that predicted the future of the
| book by looking only at input from the reading habits of a
| kindergarten, and ignoring any correlation with how those
| would change as the kids got older. Note that none of this
| has anything to do with payment per se, and neither does
| today's social "advertising enabled" media.
| cvwright wrote:
| I don't think the monetization and business aspect is what
| makes social media so bad. It's more that they're going about
| it in the wrong way.
|
| You make the comparison to AT&T. They charge a lot of money
| for the service that they provide, and as you mention, they
| don't do all the shady things that social platforms do.
|
| Most other great tech products are provided by businesses.
| Apple is the prime example of a company that is both (a)
| massively successful and (b) not hostile to their users'
| privacy.
| egypturnash wrote:
| I very much agree with this.
|
| I run a Mastodon instance for myself and some friends. It's
| got about 250 accounts total. The hosting costs are about
| evenly split between myself and friends willing to throw me
| some money via Patreon. It is not making me money and I do
| not _want_ it to make me money because then it would probably
| require a lot of my time to deal with moderating it. I don't
| want to have to do that or hire anyone to do it.
|
| I run it to have a space that's free from the bullshit
| corporate social media does in the name of "increasing
| engagement" to make more profit off of ad views. I _like_
| that I can visit my instance, catch up on the local timeline
| and the stuff I'm personally following, and then be _done_
| with it for most of the day, versus opening up a corporate
| site that's had tons of very smart people sink a lot of time
| and effort into figuring out how to keep me there for hours
| on end, whether or not I'm better off for it.
| jasode wrote:
| _> there were blogs/personal websites, forums, email and
| instant messaging._
|
| If you're listing those as Facebook alternatives, you
| misunderstand the key ingredient that made Facebook get
| adopted _by billions of non-technical mainstream users_: the
| real name identities.
|
| The other communications platforms of dial-up BBSs, USENET,
| Geocities, personal Wordpress, vBulletin/phpBB forums, AOL
| AIM ... do not have strong _real-name identities_ for non-
| tech people to _find_ each other. Anonymous/pseudonymous
| online handles and cryptic login names have too much friction
| of discovery and are _not scalable to a billion users_.
| (Previous comment about the real Rolodex acting as database
| primary key: https://news.ycombinator.com/item?id=15294086)
|
| Facebook (possibly accidentally) bootstrapped the psychology
| of users _creating logins with their real name_ in 2004
| because Harvard students wanted to find each other on the
| service. In turn, as other schools were added on, other
| students were motivated to use their _real names_ to interact
| with friends in other schools. The _social graph of real
| names_ became so viable that old high school alumni who
| were "lost" can now find each other again and grandparents
| can easily now keep track of grandchildren's baby photos. The
| fake names of USENET and other forums prevent that from
| happening.
|
| Emails don't work for social network growth because you can't
| _derive_ an email address from a real name you know. Someone
| has to tell you the exact spelling of their email address.
| With Facebook, you can just find them by their real name (or
| their real phone number) and send a "friend request".
|
| _> All these options worked for expressing one's self online_
|
| The secret sauce to Facebook's dominance is its accumulation
| of real identities and not because it gives Facebook users a
| way to "express themselves". E.g. A lot of grandparents
| following grandchildren's photos don't post anything.
|
| Consider direction of cause & effect:
|
| - easier evolution of platform: a bunch of users login with
| real names to follow/poke each other --> easy to build on
| that network into groups for discussion, shared calendar
| events, share news articles, etc
|
| - harder to grow platform: start with discussion forum
| software with fake names --> impossible to build a giant
| sticky social network graph from that base
|
| Facebook & LinkedIn built on real names. WhatsApp's growth
| was built on real phone numbers.
|
| Platforms built on real identities will naturally centralize
| because nobody has come up with a viable decentralized
| alternative for a universal identity database. Ideas like
| web-of-trust, blockchain, keybase.io, etc don't accomplish
| the same thing as Facebook's real_id database.
|
| [To downvoters, if you have a better explanation of how
| Facebook got mass adoption by non-tech people in ways that
| didn't happen with USENET and vBulletin web discussion
| forums, please reply with your thoughts.]
| pharke wrote:
| So why doesn't someone simply replace Facebook with an
| "internet phonebook" that allows you to easily
| lookup/publish links to personal blogs, email, forum
| accounts, IM, etc.?
|
| Finding people you know on Facebook is only one aspect of
| their success and it's probably the most heavily network-
| dependent one. The other thing that drove adoption was the
| relative simplicity of publishing content. Your Facebook
| profile was/is essentially a dumbed down webpage where you
| can post text, links and photos. You also get free hosting,
| free "email" in the form of messaging, and a bag of other
| features that you would normally have to set up yourself
| and pay for. They did a good enough job of providing all of
| the killer features of the internet for free to average,
| non-technical users.
| jasode wrote:
| _> So why doesn't someone simply replace Facebook with an
| "internet phonebook" that allows you to easily lookup_
|
| Because it's hard to get people to enter that
| information. In contrast, Harvard students in 2004 were
| highly motivated and that started everything off.
|
| From 2005 to 2008, the killer feature of Facebook was
| entering your real information when signing up and then
| Facebook _showing all the real people in life that were
| connected to you_. E.g... old high school friends finding
| each other again. Likewise, signing up for Facebook meant
| you also wanted to "be found" by others. It was the
| first mainstream service to successfully accomplish this.
| Myspace didn't do it. Friendster tried but their webpage
| performance was too slow.
|
| _> The other thing that drove adoption was the relative
| simplicity of publishing content. _
|
| Most Facebook users do not "publish content". Most just
| follow their friends or read links passed on by friends
| (or the algorithm). [EDIT for clarity: I mean "publish
| content" the way that gp I replied to cited _"
| blogs/personal websites"_]
| nradov wrote:
| I don't know the overall statistics but the majority of
| my Facebook friends publish content at least
| occasionally. Mainly photos of children, pets, vacations,
| etc.
| pharke wrote:
| When I used Facebook back in the day, I joined because
| people I already knew and talked to on a regular basis
| were on there. We mostly posted photos and organized
| events. I didn't use it to find people I used to know so
| I guess my experience was different.
|
| By publish content I mean post links, photos, comments,
| rants, etc. Do people no longer do that on Facebook?
| ipaddr wrote:
| Real names are not the story.
|
| It's not like Facebook required one in the beginning
| either. You could sign up with "joe s", and many of these
| profiles still exist.
|
| The secret was the rollout (the school network connected to
| your .edu email), and other tricks like getting you to give
| your login/password so Hotmail contacts could be
| downloaded.
|
| The journey to the top has a lot of little wins. Real names
| were a step added much later and connected with their ad
| network strategy.
| jasode wrote:
| _> Real names is not the story. [...] The journey to the
| top has a lot of little wins. _
|
| I understand but my comment wasn't trying to explain
| Facebook's _competitive market value_ by reducing it
| only to real names. To beat Friendster, beat Google+, and
| get to its successful IPO, one can point to _many things_
| ... e.g. the addictive NewsFeed rollout in September
| 2006... and free unlimited disk space for photos, photo
| tagging of friends, etc.
|
| My comment to gp's quote was how Facebook as a _social
| network_ works very differently from USENET
| /Geocities/vBulletin/AOL_AIM. One can't point to those
| older communications alternatives and say they work "just
| as well". On those platforms, there's a fundamental
| difference that prevents _people from even finding each
| other_.
| altdataseller wrote:
| "they barely have a functional business..."
|
| They made almost $2 billion in the last 12 months. How is that
| not thriving?
| jayspell wrote:
| "There's always been this division between your right to speak
| and your right to have a megaphone that reaches hundreds of
| millions of people," - that is the argument totalitarians always
| use. It's the same as you have the right to free speech, but not
| the right to print it, or you can print it, but you can't
| distribute it on the street corner, or talk about it in an
| assembly of over a certain number of people. In every instance
| it's about whoever has control attempting to limit the
| expression of an idea that they happen not to agree with.
| Totalitarians will always use an example of where speech somehow
| created harm to exhibit the need for control. The truth is that
| freedom does cause harm, and will always cause harm, because you
| cannot eliminate harm completely no matter the system. We have
| plenty of historical examples of attempts to eliminate harm
| causing exactly the opposite.
| enraged_camel wrote:
| >> It's the same as you have the right to free speech, but not
| the right to print it, or you can print it, but you can't
| distribute it on the street corner, or talk about it in an
| assembly of over a certain number of people.
|
| No printing press is obligated to mass-print a book you write.
| Similarly, you cannot go to a corporation's campus and
| distribute stuff unless you have their permission -- because it
| is their private property. Just because private parties are
| refusing to give you a platform does not mean they are
| totalitarians.
|
| >> Totalitarians will always use an example of where speech
| somehow created harm to exhibit the need for control, the truth
| is that freedom does cause harm, and will always cause harm,
| because you cannot eliminate harm completely no matter the
| system.
|
| This is a strawman in the shape of an Argument from Futility.
| It's no different than saying "why have safety systems in cars
| when you cannot eliminate car accident deaths completely?" It's
| a strawman because nobody is aiming to completely eliminate
| harm from online speech, just prevent it in egregious
| situations where it can be prevented. It's an argument from
| futility because harm reduction is still a valid goal even if
| complete elimination of harm is not possible or feasible.
| gizmondo wrote:
| > No printing press is obligated to mass-print a book you
| write.
|
| This is off-topic, I feel. The article illustrates that there
| is a push to stop "printing press" from printing "your book"
| in cases when it's perfectly happy to do that. In the name of
| fighting misinformation, of course.
| AnimalMuppet wrote:
| You have the right to free speech, the right to print it (at
| Kinkos, even if a regular printer won't touch it), the right to
| distribute it on a street corner, but not the right to force
| people to take it, and not the right to force a bookstore to
| sell it.
| AbrahamParangi wrote:
| I think it is probably not possible to have both "true
| information" and "control of information" simultaneously.
|
| I suspect that this is a false choice, akin to "Those who would
| give up essential Liberty, to purchase a little temporary Safety,
| deserve neither Liberty nor Safety".
|
| If we may observe history, we see that societies where one is not
| free to speak do not have more "true beliefs", and indeed believe
| many things we think of as nuts.
| tzs wrote:
| History trivia: the "essential Liberty" Franklin was talking
| about was not any personal or individual liberty. It was the
| liberty of the government to tax property. The "purchase" he
| was talking about was literally buying things with money.
|
| The Pennsylvania Assembly wanted to raise money to defend the
| frontier during the French and Indian war by taxing land. The
| Governor kept vetoing this at the behest of the Penn family,
| which owned a lot of land that would be hit by any such tax,
| each time finding some objection that Franklin and the Assembly
| found made little sense.
|
| Franklin wrote to the governor arguing that the Assembly should
| not have to give up its liberty to exercise taxing power over
| the Penn family lands in order to raise money for defense.
| AbrahamParangi wrote:
| Yes, the common usage of the quote is more lyrical than
| Franklin's usage. I didn't quote him specifically because,
| in his sense, it's sort of a misquote.
| shadowgovt wrote:
| Without the appeal to authority of citing Franklin though,
| it's not a quote that holds as much water when inspected on
| its own merits.
|
| Trading liberty for safety is the crux of social contract
| theory, so the absolutist interpretation of the quote is
| clearly devoid of worth. We can instead focus on
| "essential" and "temporary," and then perhaps it has value
| but we must then argue about the meaning of those words.
|
| ... Is the freedom to shout falsehoods with Facebook's
| megaphone without Facebook weighing in "essential?" How
| "temporary" is the safety we gain from Facebook tossing
| fact-checks on posts?
| helsinkiandrew wrote:
| One big problem that the report doesn't (I think) cover is that
| the current social media venues encourage competition - it's not
| a discussion and understanding amongst friends or family (or god
| forbid strangers), it's a competition for likes, follows, and
| reposts.
|
| Even HN suffers from this - but far less than other places. Poor
| replies to popular comments will receive more views; there are
| few detailed conversations, just people sharing their view and
| leaving; and opinions from often very knowledgeable people can
| get lost in hundreds/thousands of other comments.
| diego wrote:
| It's broken from the point of view of the article writers. It's
| working just fine from the point of view of the companies
| operating it.
| ergot_vacation wrote:
| Bingo. Water flows downhill, systems tend toward entropy, and
| businesses do whatever makes the most money. If those actions
| (like child labor) are unacceptable to society, laws have to be
| made to restrain them, or they will keep happening.
|
| In general this reads like a sickening melange of half-true
| diagnoses (yes, of course lack of competition is a problem) and
| naive appeals to the nanny state that don't understand what
| they're inviting. To wit:
|
| "Renee Diresta, research manager at the Stanford Internet
| Observatory, said policy should also differentiate between free
| speech and free reach. The right to free speech doesn't extend
| to a right to have that speech amplified by algorithms."
|
| Yes citizen, you can say whatever you want. You just have to
| say it in this sealed, soundproof room with no windows! This is
| almost literally the author saying "You have the right to
| speak, you just don't have the right for anyone to hear you",
| which is nonsensical. Free speech is meaningless if you can't
| actually reach anyone with it. The right to yell your deepest
| beliefs in the middle of the Sahara desert isn't free speech.
|
| But of course, this ability to silence "misinformation" is
| fine, because it will only be used against the bad people!
| Surely it will never be used to silence people like the author,
| because they're good and right! Never mind that this is already
| happening (Youtube is famous for silencing certain LGBT and
| sex-positive figures, as well as any number of left-wing
| activists, for example).
| Dem_Boys wrote:
| This entirely misses the point IMO. Social media is the most
| powerful propaganda/idea pushing machine in the history of the
| world and it ain't even close.
|
| Lies, half-truths, and everything in between are presented in the
| most riveting ways possible and our monkey brains just can't help
| but soak up every last bit of it. If a person believes everything
| on social media then __they__ should be fixed, not social media.
|
| A great solution is for people to see social media for what it
| is: 99% garbage that gives you a quick dopamine hit.
|
| People should be made aware that on social media you are likely:
|
| 1. Reading misinformation
| 2. Viewing content that's created with the sole purpose of
| manipulating you into a certain view (oftentimes by a bad actor)
| 3. Reading opinions by trusted people who've been compromised by
| the above two points
|
| While not perfect, I believe that if people assumed everything on
| social media is false and worked from there, then they would be a
| lot happier.
| tobr wrote:
| > A great solution is for people to see social media for what
| it is
|
| Just like a great solution to the climate emergency is to stop
| releasing greenhouse gases, a great solution to war is peace,
| and a great solution to starvation is to eat food.
| Dem_Boys wrote:
| You're right! So simple, but I say this because I've never seen
| a well-funded campaign or organization spreading this
| message.
|
| We have large organizations with great solutions tackling
| climate change, starving children, etc....
|
| I guess maybe like war, spreading this message would hurt
| pockets that are far too deep to be challenged.
| tobr wrote:
| So your actual proposed solution is to defeat the "most
| powerful propaganda/idea pushing machine in the history of
| the world" with a campaign.
| Dem_Boys wrote:
| I honestly have no bulletproof solution and I also have
| no desire to "defeat" social media.
|
| I just think that people would be happier and social
| media would be a better place if the public were educated
| on what it truly is.
| eric4smith wrote:
| Ummm
|
| Social media behavior and design are driven by the users in no
| small fashion.
|
| The big companies merely amplify the dominant behaviors.
|
| Remember the Instagram, Facebook and Twitter linear timelines???
|
| Yeah. I thought you did.
|
| The companies merely made it more attractive by amping up what's
| most popular or what we like.
|
| Social media cannot be fixed by big tech.
|
| It takes users to fix it.
|
| It's just a mirror for our basest instincts.
| jot wrote:
| The "25 ways" were buried in a linked PDF
| (https://www.yumpu.com/en/document/read/65717082/the-
| smsmit-r...).
|
| Here they are:
|
| 1) Hold platforms accountable for designs that amplify lies.
|
| 2) Focus on and shut down prolific disinformation networks.
|
| 3) Use content interventions to nudge people toward awareness of
| falsity and accuracy.
|
| 4) Use accuracy nudges to crowdsource falsity labels so that
| algorithms can be trained to automatically identify lies.
|
| 5) Give researchers access to more data, using technologies such
| as differential privacy and institutional mechanisms, like data
| safe harbors.
|
| 6) Offer platforms incentives, like legal immunity, to share more
| data with researchers.
|
| 7) Create independent panels to oversee research funding by
| platforms.
|
| 8) Require robust legislative remedies to anticompetitive
| platform practices.
|
| 9) Create collaborative global institutions where discussions
| about governance can take place.
|
| 10) Establish independent U.S. statutory bodies--not the FTC or
| FCC--to vet compliance with guidelines.
|
| 11) Offer platforms 'earned immunity' from civil liability, e.g.
| under Section 230, if they comply with content moderation
| regulations.
|
| 12) Break up big companies, starting with Facebook.
|
| 13) Legislate interoperability as well as social network and data
| portability.
|
| 14) Emulate EU proposals for stricter anti-competitive penalties
| to deter corporate misbehavior.
|
| 15) Create programmability with the major platforms to encourage
| innovation.
|
| 16) Diversify the engineering core to include social scientists,
| cognitive scientists, and people trained in different types of
| histories.
|
| 17) Create ethics certification programs and degrees for AI
| designers.
|
| 18) Embrace friction to reduce the automatic nature of
| information diffusion and interactions with AI.
|
| 19) Regulate platforms for product safety; hold designers and
| companies accountable for the products they design and deploy.
|
| 20) Develop new business models including subscription and
| freemium models.
|
| 21) Tie consequences of legal violations directly to corporate
| executives, not just their companies.
|
| 22) Strengthen regulations by adding taxes, such as programmatic
| media taxes, to deter algorithmic amplification.
|
| 23) Ensure that users consent to data use; protect data privacy.
|
| 24) Distinguish between speech and reach--the right to speech and
| the right to amplification of that speech.
|
| 25) Limit corporate lobbying; enforce stricter campaign finance
| rules.
| isbvhodnvemrwvn wrote:
| What do they define as lie and disinformation? And how is this
| mechanism going to remain impartial if it's going to be
| enforced by the government?
| brigandish wrote:
| You'd be surprised how easy it is to detect lies. Ask most
| people (and no doubt the authors of this are included) and
| they'll tell you "they're told by my political opponents and
| those who disagree with me".
| phone8675309 wrote:
| The following will never happen (or be corrupted at inception)
| because social media is a tool of the capital and political
| classes for spreading their propaganda:
|
| > 2) Focus on and shut down prolific disinformation networks
|
| > 4) Use accuracy nudges to crowdsource falsity labels so that
| algorithms can be trained to automatically identify lies.
|
| > 21) Tie consequences of legal violations directly to
| corporate executives, not just their companies.
|
| > 22) Strengthen regulations by adding taxes, such as
| programmatic media taxes, to deter algorithmic amplification.
|
| > 23) Ensure that users consent to data use; protect data
| privacy.
|
| > 24) Distinguish between speech and reach--the right to speech
| and the right to amplification of that speech.
|
| > 25) Limit corporate lobbying; enforce stricter campaign
| finance rules.
| [deleted]
| skinkestek wrote:
| The first 10 or so sound extremely dangerous!
| ergot_vacation wrote:
| Two very different groups hate social media for two very
| different reasons.
|
| Ordinary internet users hate it because it represents a
| serious loss of autonomy for them, and a degradation of
| service. It makes the internet less free than it used to be.
|
| The wealthy and powerful (and those who love brown-nosing
| them) hate social media because it's STILL TOO FREE. There's
| way, way too much information there that threatens their
| vice-grip on the world, and it has to go. NOW.
| A4ET8a8uTh0 wrote:
| Very low on details and high on platitudes you read on various
| blogs. Since it is relatively high level, the only one I can
| kinda agree with is regulation. Right now it is still a wild west
| company town.
| high_byte wrote:
| I'm surprised that none of the things I want are mentioned. Here
| are (some of) my ideas:
|
| 1. Know who you talk to. If you want to stay anonymous, use an
| anonymous platform. HOWEVER, that guy you entered a Twitter
| shitstorm with? Might as well be some 12-year-old, broken-English
| Russian troll. That chick who messaged you? A 56-year-old Chinese
| gypsy.
|
| 2. Mark content type. Sarcasm is (IMO) very important, but in
| the wrong context, or with no context, it can be horribly
| confusing.
|
| 3. Exact timestamps. I don't want to see that a post is from "a
| few months ago" (hidden, in small text similar to the background
| color). I want posts to be color-coded by age & type.
|
| 4. Exact definitions. My language isn't the same as your
| language. Even if we both speak English, there are different
| meanings across cultures, regions, etc. If something is ambiguous
| or unknown, I'd like to get a dictionary definition as reference.
|
| 5. I don't care about likes or views on their own; I want to see
| the views/likes ratio. If a post has 1000 likes, is it good?
| Maybe, unless it has 1M impressions. Or it's exceptional if it
| only has 2000 impressions. That's a far better measure of
| quality (see the sketch below).
|
| And there's more if I just keep thinking about it. Probably a lot
| more.
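|
| A minimal sketch of the point-5 ratio with made-up numbers (the
| function name and the "weak"/"exceptional" readings are
| illustrative assumptions, not anything a platform exposes):
|
|         # Hypothetical views-per-like signal: lower is better.
|         def views_per_like(impressions, likes):
|             return impressions / likes if likes else float("inf")
|
|         print(views_per_like(1_000_000, 1000))  # 1000.0 -> weak
|         print(views_per_like(2_000, 1000))  # 2.0 -> exceptional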
| smegcicle wrote:
| 1: i can't imagine anyone trying harder at this than facebook
| is?
|
| 2: marking content as sarcasm completely undermines its
| impact. 'sarcasm tags' are unutilized due to lack of purpose,
| not lack of visibility. how does facebook's 'react emoji'
| functionality not cover this usecase better?
|
| 3: you don't want the human-readable age, you want the exact
| absolute date, but also the age, which you want to be exposed
| as color..? absolute date is generally easily grabbed with js,
| but hidden to reduce clutter.
|
| 4: a product to put cross-culture communication up front would
| be an interesting prospect, but i've not seen anything like
| that, and i'd expect a full team, many revisions, and a patient
| and involved community to get functionality and ui approaching
| helpful...
|
| 5: you don't want the exact views and likes, you want a human-
| readable measure for quality that you just came up with and can
| easily do in your head? (vs how you feel about predigested data
| re #3)
| jasfi wrote:
| There is no lack of competition, but due to the network effect
| (and probably other things), the largest social networks aren't
| going anywhere.
| CraigJPerry wrote:
| I always used to think just shining a big light was all that was
| needed to overcome falsehoods and manipulation, but it turns out
| that's not true. How naive of me. Turns out various philosophers
| realised this hundreds of years ago.
|
| Anyway, it doesn't matter; I can finally say I've found my people
| now, @thebirdsarentreal :-) (it's a satirical take on belief
| systems that vehemently denies it's satirical, which is just
| _chef's kiss_)
| heresie-dabord wrote:
| > all that was needed to overcome falsehoods and manipulation
|
| All that is needed is honest discourse and a good educational
| system.
|
| But what twaddlevision did to US society in the pre-WWW period,
| social memia is now doing to subsequent generations.
|
| The circle is unbroken. ^_^
| nabla9 wrote:
| The reflexive satire and irony is one of the problems of
| social-media age.
|
| Satire and irony used to be effective. In the last 20 years,
| cynical distance and ironic posturing have become so prevalent
| that they are no longer considered subversive. Citizens can
| consume
| outrage passively through various satirical media products,
| displacing outrage and abstaining from more active forms of
| resistance.
|
| People consume satirical content, and think they are above it
| all, while just participating in cynical ha-ha.
| mediocregopher wrote:
| Way back in the stumbleupon days I stumbled onto the flat earth
| society message boards. It was clear to me that everyone
| involved knew it was satire; just a tongue-in-cheek dig at the
| way any conspiracy theory, no matter how outrageous, could be
| made plausible if you throw enough crazy at it.
|
| Fast forward 10+ years and people actually believe it. Someone
| recently launched themselves in a homemade rocket to prove with
| their own eyes the flatness of the earth (and died for their
| efforts).
|
| I love birdsarentreal, but part of me isn't looking forward to
| finding out what happens to it as more people come across it.
| b3morales wrote:
| Indeed; Umberto Eco dramatized this phenomenon in _Foucault's
| Pendulum_ (even before the related Poe's Law was
| formulated).
|
| > People [...] sense instinctively that the [...] truths
| [...] don't go together, that [the promoter is] not being
| logical, that [they're] not speaking in good faith. But
| they've been told that God is mysterious, unfathomable, so to
| them incoherence is the closest thing to God.
|
| This tangles up with the appeal of being an "initiate", "on
| the inside", and knowing the secret truth, and the
| combination is powerful.
| nvllsvm wrote:
| Exactly why I dislike some forms of sarcasm directed at an
| unknown audience.
| jcadam wrote:
| The dream of the internet as a global medium for the free
| exchange of ideas and information died a quick death.
| blooalien wrote:
| Funny thing is ... Hacker News discussions and a certain subset
| of channels on Telegram and Discord are probably among the most
| actively _actually social_ media I've managed to find online.
| acidburnNSA wrote:
| Getting both more regulation and more competition usually is a
| tough ask. Complying with complex regulations often makes it
| harder to enter a space.
| moomin wrote:
| I think it's important to acknowledge that a) you're absolutely
| right and government contracts are a brilliant extreme example
| of this but also b) regulations are precisely the things that
| allow markets to exist at scale.
|
| We often talk about regulations in terms of volume, but the
| actual details matter.
| ericls wrote:
| Here's an easy one: just remove all numbers from social media.
| testplzignore wrote:
| Yup. When Facebook originally added likes and promoted posts
| based on metrics, that's when I feel things fell apart.
| ajmurmann wrote:
| Even if you don't show the numbers, won't more controversial
| content still be promoted more because the company behind the
| network still knows the numbers and wants to increase
| engagement because they need to sell advertisements?
| [deleted]
| newsclues wrote:
| Charge money to join.
|
| Keep the riff-raff out and have users' real identities linked to
| payments.
|
| I'd rejoin Twitter if it cost $5-10 per year
| phone8675309 wrote:
| > have users real identities linked to payments
|
| I will never use a social media platform that requires me to
| link my IRL identity. It's DoA for me.
| mordymoop wrote:
| Facebook is hardly DOA.
| phone8675309 wrote:
| I don't use Facebook. Edited the GP to clarify that any
| platform that requires tying my IRL ID to it is DoA for me.
| newsclues wrote:
| Just because it's doa for you, doesn't mean others will feel
| the same.
| fumblebee wrote:
| So, a social media site for the 0.1% of people in the world
| who'd be willing to pay for it.
|
| Count me out.
|
| Twitter isn't Twitter without the wide range of people it
| attracts, for better or worse.
| tomjen3 wrote:
| That would make it a ghost town.
|
| Social media requires that the people you are interested in are
| there for it to have any value. Not enough of my friends are
| going to be willing to join facebook for >0 usd, which means I
| wouldn't be willing to join it either.
|
| Twitter can maybe be saved here, but they would be a lot smaller.
|
| Sadly, 10 USD/year isn't enough to keep out the riff-raff; in
| fact, it would probably entice people to see it as an investment,
| meaning we would get ever more crap from influencers, "thought"
| leaders, etc.
| XorNot wrote:
| And it would be promptly irrelevant because someone would just
| make "free twitter" and dominate the market again.
| exo-pla-net wrote:
| Something Awful forums did this. Toxic users are happy to get
| banned and then pay another $10 to keep being toxic.
| ergot_vacation wrote:
| The tactic was highly successful on SA, but it can only do so
| much. Having a cover charge keeps out lazy spammers,
| children, and casual vandals/trolls pretty well.
| Unfortunately, it doesn't do anything to keep out those who
| are determined to be disruptive and have anything resembling
| real income. Even with 5 bans a month, $50 is trivial to any
| adult with a job if they really, really want to stay and
| cause havoc.
| trompetenaccoun wrote:
| More importantly, it's nothing to billion-dollar
| corporations or governments. Money should not equal speech.
| Instead, I think that first we need a "proof of
| personhood", a way to identify real individual users and
| distinguish them from bots and shills that run multiple
| accounts. As it is now the system can be easily gamed which
| works great for those who have plenty of resources. Genuine
| users get the short end of the stick.
| cvwright wrote:
| I really doubt $5-10 / year would be sufficient to replace
| their ad revenue.
|
| From what I remember from Facebook SEC filings, they make on
| the order of $50-100 per US user per year.
| arkitaip wrote:
| Link to report:
| https://www.yumpu.com/en/document/read/65717082/the-smsmit-r...
| tobr wrote:
| Real PDF rather than crummy web viewer:
| https://www.yumpu.com/en/document/download/65717082/a63ac-8c...
| venamresm__ wrote:
| I love it, it's very good. It touches on a lot of things I went
| through when writing this book:
| https://venam.nixers.net/blog/internet_communication_narrati...
|
| I think people agree on a lot of the dynamics in this space and
| what needs to be done.
| codyswann wrote:
| I still remember when I was laughed out of my Masters class on
| Mass Communication for saying that giving everyone a voice might
| result in a net-negative. I'm not sure we're there yet, but
| that's where we're headed.
| betwixthewires wrote:
| I thought from the title that this would be about actually
| changing social media - business models, different UX patterns to
| get rid of dark patterns and antisocial patterns - actually
| interesting ways to fix social media. Instead it's more of the
| same old "we need more censorship online" rhetoric. It's a shame;
| there are actually interesting discussions to be had on the
| impacts of social media and the engineering of it, but it seems
| most people are not actually interested in solving the real
| problems.
| qwerty456127 wrote:
| To me it seems the spread of misinformation is probably a self-
| fixing problem. The users have no choice but to develop critical
| thinking about what they read/watch, so they probably will, and
| this is good. I was actually glad deepfakes emerged, as they can
| make the fact that you can't just believe everything you see more
| vivid and obvious, even to those who were ignorant of it.
| sys_64738 wrote:
| Studies say social media makes people unhappy. Isn't the cure to
| make us happy pretty simple?
| trompetenaccoun wrote:
| There are also studies that say reading the news makes people
| unhappy. It's not hard to imagine why that is. I think most of
| us agree that social media, as it is, is broken. But internet
| forums and social media can also be a massively powerful tool
| for the people to connect and share information from a
| grassroots level.
|
| Just like journalism has become compromised, so has social
| media. The solution isn't to abolish both, instead we need to
| find ways to strengthen their integrity and make them
| independent of partisan corporate or political interests.
|
| https://time.com/5125894/is-reading-news-bad-for-you/
___________________________________________________________________
(page generated 2021-07-04 23:01 UTC)