[HN Gopher] Social Media Platforms as Common Carriers? [pdf]
___________________________________________________________________
Social Media Platforms as Common Carriers? [pdf]
Author : amadeuspagel
Score : 86 points
Date : 2021-07-07 15:23 UTC (7 hours ago)
(HTM) web link (www2.law.ucla.edu)
(TXT) w3m dump (www2.law.ucla.edu)
| uniqueuid wrote:
| By the way, one of the colleagues thanked in this piece is Eric
| Goldman, who has written the highly recommended paper titled
| "Search Engine Bias and the Demise of Search Engine Utopianism"
| [1]. He also frequently writes at [2]
|
| [1] https://papers.ssrn.com/sol3/papers.cfm?abstract_id=893892
|
| [2] https://blog.ericgoldman.org/archives/author/eric-goldman
|
| [edit] While I'm at it, one of the best summaries of the whole
| neutrality debate is from Laura Granka [3], and Goldman has his
| own summary here [4]
|
| [3] Granka, L. A. (2010). The Politics of Search: A Decade
| Retrospective. The Information Society, 26(5), 364-374.
| https://doi.org/10.1080/01972243.2010.511560
|
| [4] Goldman, E. (2011). Revisiting Search Engine Bias (SSRN
| Scholarly Paper ID 1860402). Social Science Research Network.
| https://papers.ssrn.com/abstract=1860402
| advisedwang wrote:
| I like the article's separation between hosting and other
| activities. I think having content available by URL and in the
| feeds of friends/subscribers, but never shown in discovery
| features to the general public, is a good compromise between
| allowing expression and preventing extremism from proliferating.
| naasking wrote:
| So far it seems like a fairly well balanced analysis, which I
| think addresses the censorship concerns without violating these
| companies' First Amendment rights:
|
| > I'll begin by asking in Part I whether it's wise to ban
| viewpoint discrimination by certain kinds of social media
| platforms, at least as to what I call their "hosting
| function"--the distribution of an author's posts to users who
| affirmatively seek out those posts by visiting a page or
| subscribing to a feed. I'll turn in Part II to whether such
| common-carrier-like laws would be consistent with the platforms'
| own First Amendment rights, discussing the leading Supreme Court
| compelled speech and expressive association precedents [...] And
| then I'll turn in Part III to discussing what Congress may do by
| offering 47 U.S.C. § 230(c)(1) immunity only for platform
| functions for which the platform accepts common carrier status,
| rather than offering it (as is done now) to all platform
| functions. On balance, I'll argue, the common-carrier model
| might well be constitutional, at least as to the hosting
| function. But I want to be careful not to oversell
| common-carrier treatment: As to some of the platform features
| that are most valuable to content creators--such as platforms'
| recommending certain posts to users who aren't already
| subscribed to their authors' feeds--platforms retain the First
| Amendment right to choose what to include in those
| recommendations and what to exclude from them.
| [deleted]
| Majromax wrote:
| To me, the question is how a common carrier status can coexist
| with content curation and promotion.
|
| Looking at the ur-example of the phone company, AT&T doesn't try
| to preemptively place calls that a household "might be interested
| in based on analytics."
|
| The modern social media landscape, however, is dominated by "the
| algorithm." Youtube lives on its recommendations; Twitter wants
| you to keep refreshing its feed; even Hacker News tries to sort
| the front page by a combination of novelty and interest. In my
| opinion, this is a continuous, editorial judgment by the outlet,
| and it is very different from the idea of a "neutral public
| square" that exists as an objective point in space.
|
| The fight isn't really about whether an individual person should
| have a right to send messages to other individual people on
| social media platforms; it's about whether the media platform has
| a responsibility to actively promote views without regard for
| their content. It's not about forcing the "shopping mall" to
| _allow_ protesters (_PruneYard_, from the article), but instead
| it's about forcing the "shopping mall" to advertise that it's
| hosting protesters and include them on its maps and business
| lists.
|
| (Edit to add:)
|
| The article discusses this point somewhat in its section E,
| related to compelled recommendations, but I think the topic
| deserves much broader analysis. A "legally viewpoint-neutral
| Twitter" would be a Pyrrhic victory for proponents if Twitter
| could still restrict users' visibility to direct
| subscribers/replies only, denying the _public_ part of its
| platform.
| temp8964 wrote:
| If Twitter hides Trump from everyone who doesn't follow him,
| but shows all his messages to everyone who does follow him,
| what is the problem then?
| wmf wrote:
| The problem _from Trump's perspective_ is that he was
| getting a ton of engagement from non-followers and he wants
| that level of engagement back.
| slownews45 wrote:
| I think you will be surprised at how DISENGAGED folks are
| vis-a-vis a phone call on their AT&T line.
|
| Why? Because a damn ton of them are spam / scams. So yes, AT&T
| doesn't filter calls, and users have learned not to answer
| them.
| amadeuspagel wrote:
| > A "legally viewpoint-neutral Twitter" would be a Pyrrhic
| victory for proponents if Twitter could still restrict users'
| visibility to direct subscribers/replies only, denying the
| public part of its platform.
|
| I don't agree. I think that would be a huge victory. It would
| certainly not be Pyrrhic in the sense that we would lose
| anything.
| klyrs wrote:
| My biggest technical issue with this "common carrier" movement is
| spam. The author mentions it, but seems to suggest that users can
| simply block or ignore spammers. As somebody who witnessed the
| commercialization of the internet, I find this extremely naive.
| We generally think of email as a "common carrier," but even
| there, spam is blocked. Blocking spam is censorship. Even if
| automatic detection misses just 1% of spam, that remainder
| will still overwhelm non-spam content.
|
| If I lose common carrier status, and thus Section 230
| protections, because I'm blocking spam, am I on the hook if I
| fail to block a fraudulent spam comment that leads to a user
| being harmed?
| Anon1096 wrote:
| Couldn't a social media site implement support for custom
| blocklists or moderators, and have users opt into them?
| Facebook can have their own Official(TM) Facebook Moderation
| Blocklist, which users can subscribe to and filter out all spam
| and hate according to Facebook's algorithm. Or users can choose
| not to and subscribe to someone else's or no blocklist at all.
| Therefore, there wouldn't be any issue with common carrier
| status since users can post without being hindered, but other
| users can tailor their feeds to only see posts they care about.
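| To make this concrete, here is a minimal sketch (in Python, with
| made-up names; not any platform's actual API) of what opt-in,
| subscribable blocklists could look like:
|
|     from dataclasses import dataclass, field
|
|     @dataclass
|     class Blocklist:
|         """A named, subscribable list of blocked author IDs."""
|         name: str
|         blocked_authors: set[str] = field(default_factory=set)
|
|     @dataclass
|     class User:
|         """Each user picks which blocklists apply to their feed."""
|         subscriptions: list[Blocklist] = field(default_factory=list)
|
|         def filter_feed(self, posts):
|             # The platform hosts every post; filtering happens only
|             # according to the lists this user has opted into.
|             blocked = set().union(
|                 *(b.blocked_authors for b in self.subscriptions))
|             return [p for p in posts if p["author"] not in blocked]
|
|     # One user subscribes to the platform's own list, another to none.
|     platform_list = Blocklist("Facebook Moderation", {"spammer42"})
|     alice = User(subscriptions=[platform_list])
|     bob = User()  # no blocklists: sees everything, spam included
|     posts = [{"author": "spammer42", "text": "buy now"},
|              {"author": "carol", "text": "hello"}]
|     print(alice.filter_feed(posts))  # only carol's post
|     print(bob.filter_feed(posts))    # both posts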
| [deleted]
| tedunangst wrote:
| I think this is a common failing for legal analysis of tech
| problems. Some years ago Lessig made the argument that radio
| bandwidth should be unregulated and belong to all, instead of
| licensed to cellphone carriers. The technical basis for the
| argument was that this is how Ethernet hubs work, and they work
| great. Of course, nobody uses a hub today.
| uniqueuid wrote:
| Could you explain what the common failing is?
|
| To underestimate how technical changes affect real-life
| usage?
|
| I often see the opposite fallacy: That tech people assume
| technically optimal solutions to be socially and legally
| optimal, which they often are not. To transfer this to your
| example: Perhaps the social and economic landscape of society
| would be better off if the spectrum belonged to us all. Maybe
| we would have developed legal and social norms to handle the
| hub nature?
| tedunangst wrote:
| Assuming some existing technical solution is optimal. Hubs
| existed for a time because they were cheap, not because
| they were good. Or assuming that some solution to an
| unsolved problem will be pulled out of the magical
| innovation hat. We'll just devise more efficient time
| sharing algorithms.
| uniqueuid wrote:
| Thanks, the first fallacy is great, I had not thought of
| this before. It's similar to how people will not deem
| change possible until it has always existed (i.e. when
| convinced, they overwrite the memory of their own past
| opinion).
| [deleted]
| uniqueuid wrote:
| Well, the common carrier approach would not necessarily mean
| that every user can publish what they want.
|
| Common carrier treatment arises when you control a unique piece
| of infrastructure or, more loosely defined, a dominant one. As
| a result you need to provide non-discriminatory access for
| competitors.
|
| In my interpretation, this requirement would also be satisfied
| by Facebook allowing publishers and companies to create pages.
| A minimal threshold of importance _that's equal for all page
| owners_ might be valid.
|
| And the paper touches on this, mentioning that there's a valid
| role in moderating comments:
|
| > There might thus be reason to leave platforms free to
| moderate comments (rather than just authorizing users to do
| that for their own pages), even if one wants to stop platforms
| from deleting authors' pages or authors' posts from their pages
| ampdepolymerase wrote:
| Simple, outsource the moderation. Provide a common access API
| for third parties to provide moderation. Whether their home
| feed content resembles 4chan or the New Yorker is entirely up
| to the user.
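| As a rough sketch of that kind of pluggable moderation API (in
| Python, with invented names; not a real spec), the platform
| would only ever call a provider chosen by the user:
|
|     from typing import Protocol
|
|     class ModerationProvider(Protocol):
|         """Third parties implement this; the platform just calls it."""
|         def allow(self, post: dict) -> bool: ...
|
|     class AnythingGoes:
|         def allow(self, post: dict) -> bool:
|             return True  # a 4chan-style, unfiltered home feed
|
|     class StrictEditorial:
|         def allow(self, post: dict) -> bool:
|             return post.get("reviewed", False)  # New Yorker-style feed
|
|     def render_feed(posts: list, provider: ModerationProvider) -> list:
|         # The platform hosts all posts; the user's chosen provider
|         # decides what appears in their home feed.
|         return [p for p in posts if provider.allow(p)]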
| temp8964 wrote:
| I don't think this is a major issue at all. All the common
| carriers (phone line, cell phone, public square in shopping
| mall, etc.) designated by law have (or can have) spam problems.
| This does not change a thing. No serious common carrier will
| challenge the court with the argument: I can ban spammers, so
| I can ban anyone I want to.
| hpoe wrote:
| I guess part of my question with spam lately is why can't we
| just go after the spammers? It seems to me that acting as a
| public nuisance would be enough for the government to take
| action.
| chrisco255 wrote:
| Because most of the spam is coming from other countries and
| the cost of tracking down every spammer is prohibitive.
| chr1 wrote:
| Users have control over what is marked as spam, and can see
| spam messages if needed. If you block some content the way
| gmail blocks spam, it won't be censorship.
| thaumasiotes wrote:
| > If you block some content the way gmail blocks spam, it
| won't be censorship.
|
| Are you thinking of the spam folder? Because the primary way
| that gmail blocks "spam" is by never delivering the email at
| all. Not to your spam folder, not anywhere.
|
| I learned about this when I was unable to receive email from
| a personal friend.
| amadeuspagel wrote:
| Phone companies are regulated as common carriers, but there's
| an exception allowing them "to block by default illegal or
| unwanted calls based on reasonable call analytics before the
| calls reach consumers."[1]
|
| [1]: https://www.fcc.gov/consumers/guides/stop-unwanted-
| robocalls...
| labcomputer wrote:
| But that puts us back where we started:
|
| FB et al. will argue that they are blocking unwanted
| messages based on reasonable analytics. So we will continue
| to see exactly the same type of feed curation as today.
|
| The only difference (maybe) is that a platform (probably)
| can't just wholesale ban a particular person... but I bet
| that even the phone company is allowed to ban a person who
| is deemed to be sufficiently abusive or harassing.
|
| The fundamental problem is that there is a vocal minority who
| really, _really_ don't want to believe that they are the
| vocal minority who everyone else wants to ignore. There isn't
| some magic set of laws and regulations you can create to
| force the silent majority to listen to you when they think
| you're dumb and annoying.
|
| The First Amendment grants you the right to speak and the
| right for people (who wish to) to listen, but it doesn't
| grant you the right to an audience for your speech.
| naasking wrote:
| > There isn't some magic set of laws and regulations you
| can create to force the silent majority to listen to you
| when they think you're dumb and annoying.
|
| You're not applying the rule correctly. Twitter and FB are
| preventing, say, Trump from _posting at all_, even to
| those who _want_ to receive his messages, so they aren't
| just blocking his messages to recipients who don't want to
| receive them. That goes beyond simply "blocking unwanted
| messages".
| amadeuspagel wrote:
| If you really believe that applying common carrier laws to
| social media companies, with the same exception for spam
| that phone companies have, wouldn't make a difference, you
| have no reason to oppose that, right?
| naasking wrote:
| > We generally think of email as a "common carrier," but even
| there, spam is blocked. Blocking spam is censorship. Failure to
| automatically detect 1% of spam will still overwhelm non-spam
| content.
|
| These aren't quite the same. The paper recommends common
| carrier protections only for the distribution of content that
| people explicitly seek out or subscribe to. If you're
| subscribed to spam, the solution is obvious and technically
| trivial (unsubscribe).
|
| The spam problem for email is not trivial because email is, by
| design, a public addressing system which permits unsolicited
| messaging.
| cameldrv wrote:
| In a decentralized system, users could have various "vouches"
| for the authenticity of their messages.
|
| A trusted user could vouch for a new user, and/or the user
| could go through various types of third party verification,
| such as phone number, ID verification, or other verification,
| and then the third party could vouch for them on that basis.
|
| You could even have monetary vouches, where a third party
| vouched that the user paid a certain amount of money, solved a
| captcha, or solved some computational problem.
|
| The user could be in control of what measures would be required
| to reach their screen. As it stands, social media essentially
| incorporates a combination of trusted user vouching (you can
| see who is a friend of a friend), and monetary vouching
| (advertising).
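| As a very rough sketch (in Python, with invented names and
| weights), the key point is that each recipient configures
| which vouches count and how much total weight is needed before
| a message reaches their screen:
|
|     # Recipient-chosen weights for each kind of vouch, and the
|     # total weight a sender needs to reach this recipient.
|     my_weights = {"friend": 3, "phone": 1, "id": 2,
|                   "payment": 2, "captcha": 1}
|     my_threshold = 3
|
|     def reaches_screen(sender_vouches: list) -> bool:
|         score = sum(my_weights.get(v, 0) for v in sender_vouches)
|         return score >= my_threshold
|
|     print(reaches_screen(["captcha"]))           # False: 1 < 3
|     print(reaches_screen(["phone", "payment"]))  # True: 1 + 2 >= 3
|     print(reaches_screen(["friend"]))            # True: a friend vouched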
| slg wrote:
| I think the biggest problem is less spam and more harassment.
| Simply blocking harassers isn't as effective as blocking spam
| due to the way social networks generally work. The harasser can
| just move on to your connections, which is potentially even
| worse than them harassing you. Imagine an ex-partner sending
| revenge porn to all your contacts. Fighting that requires the
| ability for users to report potential bad actors and for the
| platform to have a centralized moderation mechanism to stop
| those bad actors.
| timoth3y wrote:
| Be careful what you wish for.
|
| If social media platforms are considered common carriers they
| will have a "duty to serve." They will need to provide their
| service to anyone not judged by a court to be breaking the law.
|
| That includes spam, pornography, and many forms of online abuse
| and harassment. There is a strong case to be made that ISPs
| should be common carriers, since they don't (and should not) know
| what is contained in the packets they transport.
|
| But treating Twitter or Facebook or Hacker News as common
| carriers makes no sense. They are not "carriers" who are
| neutrally transmitting data they have no control over.
|
| That's not what they want to be, and that's not what their users
| want them to be. 4chan has its place, but most people don't want
| the whole web to look like that.
| Buttons840 wrote:
| If social media becomes a common carrier, but ISPs do not, can
| Facebook just host their service on their own ISP and block
| unwanted users at the ISP level? "It's not Facebook blocking the
| user, it's the ISP, which happens to be the only ISP we host
| Facebook on."
|
| This is all just poking holes in the idea that social media can
| be a common carrier while the ISPs are not. Somehow the literal
| carrier avoided becoming the common carrier.
| temp8964 wrote:
| How can the ISP ban Facebook accounts, if they are separate
| entities? Or do you mean the ISP bans Facebook URLs related to
| certain accounts / posts? Not sure how this is gonna work.
| tedunangst wrote:
| Facebook allows anyone to sign up and access the site through
| user.facebook.com. All actual hosting is outsourced to
| NotFacebook ISP because the cloud. NotFacebook totally at
| their own discretion null routes baduser.facebook.com.
| Buttons840 wrote:
| Maybe Facebook's ISP of choice requires handing over the TLS
| private keys?
| supertrope wrote:
| Social media companies only invest enough in trial ISP networks
| to encourage faster speeds from incumbents. Telecoms outsource
| physical operations and desperately want to get into value adds
| like media content, low-quality app stores, e-security, and
| advertising. No one wants to move down the OSI layers into
| commodity services or capital-intensive slogs.
| rcpt wrote:
| > just host their service on their own ISP
|
| Good luck
| singlow wrote:
| That's not how laws work.
| Buttons840 wrote:
| You're right, this legal "hack" would probably be frowned upon
| by a judge, but it does demonstrate how silly it is that the
| literal carrier is not the one considered the "common
| carrier".
|
| It would be illegal for Facebook to arbitrarily prevent
| someone from accessing Facebook, and legal for Comcast to
| arbitrarily prevent someone from accessing Facebook. And back
| to my original proposal, if Facebook and Comcast came to some
| sort of deal, perhaps Comcast could block unwanted users on
| behalf of Facebook.
| amanaplanacanal wrote:
| I'm not seeing what problem is being solved here. If, as he
| points out early in the paper, common carrier status in the US
| would only apply to hosting, but not recommendation engines and
| the like, what is the gain? Anybody can already find hosting
| somewhere. It's the discoverability that the platforms provide
| that's the secret sauce.
| amadeuspagel wrote:
| > If, as he points out early in the paper, common carrier
| status in the US would only apply to hosting, but not
| recommendation engines and the like, what is the gain?
|
| There's something between hosting and recommendations, which
| the paper suggests common carrier status could also be applied
| to, and that's subscriptions. So YouTube wouldn't have to
| recommend Alex Jones' videos, but if people subscribe to his
| channel, it would have to show his videos to them.
|
| > Anybody can already find hosting somewhere.
|
| Even if that's the case, taking away their current hosting
| disrupts people's speech.
| [deleted]
| bosswipe wrote:
| So many supposedly principled free-market libertarians and
| conservatives tying themselves in knots so that Trump can
| continue to push his lies. The American conservative movement,
| like nationalists everywhere, has no principles besides power.
| And with the new permanent conservative judiciary there are many
| victories in their future. Our democracy is in peril but if you
| look at places like Russia and China they seem stable enough,
| maybe the death of the American world order won't be so bad.
| Animats wrote:
| This can be viewed as an alternative under antitrust law. If you
| get big enough, either you get broken up, or you have to become a
| regulated monopoly. You get to pick.
|
| "Big enough" by EU standards is where there are less than four
| competitors of reasonable size and reach. The US tends to
| tolerate a higher threshold. Amy Klobuchar's "Antitrust" book
| suggests 40% market share as the threshold.
| throwawaysea wrote:
| > If you get big enough, either you get broken up, or you have
| to become a regulated monopoly.
|
| Some products cannot be easily broken up without disrupting the
| actual product. For instance, Facebook could not be broken up
| since the value of the product is in large part due to having
| one significantly sized and unified user base (network
| effects). But these companies can't have it both ways - they
| can't claim that they operate in a competitive environment with
| few barriers for new competition while also claiming that
| splitting up their user base would destroy their unique product
| offering.
|
| We also need to be wary of market share arguments, especially
| given that these companies largely operate in the Bay Area and
| reflect its values/political culture/etc. This is why we
| regularly see them enact censorship in lock-step. Even if
| several companies operate with less-than-majority market share,
| they can behave as a cartel. That's why we shouldn't treat
| Twitter and Facebook and TikTok as alternatives to each other.
|
| A better alternative might be to simply envision and implement
| new regulations based on minimum user bases. If your user base
| is larger than X (to be defined) then you are subject to
| regulations. Some suggestions could be user bases larger than
| the [smallest or largest] state by population. A social media
| platform that has more influence and power than a state
| government seems like a reasonable target for regulation.
| joe_the_user wrote:
| I'm in favor of regulated monopolies under some circumstances,
| but making Facebook or any information-filter platform into a
| regulated monopoly sounds nightmarish. I'd sort of trust the US
| government, at its best, to do some things. But I would never
| trust any state to regulate content filtering.
|
| Beyond that, making sure it's always possible to host content
| on the Internet would be the most reasonable way to guarantee
| free speech. And content filters would have competition this
| way, as a content filter is a kind of content.
| uniqueuid wrote:
| A very important point. Antitrust has entirely different
| conditions to be met. The EU conditions are even more
| complicated than < 4 competitors and are evaluated in-depth in
| each case. At the same time, antitrust law has been attempted
| as a mechanism for ensuring freedom of information, although
| largely unsuccessfully (due to internal growth; EU antitrust
| law is predominantly geared towards blocking acquisitions
| afaik).
|
| Now, _media regulation_ is an entirely different regulatory
| approach: It has more normative roots, and in some cases is
| even precautionary, allowing sanctioning in the face of
| plausible threats. That's a very sharp sword and the reason
| why platforms fear it.
|
| [edit] For completeness, the definition of "big enough" in the
| EU is called dominant position and is defined in the
| Hoffmann-La Roche case. It means being so powerful that the
| company can behave largely independently of its competitors
| and customers. (https://eur-lex.europa.eu/legal-
| content/EN/TXT/?uri=CELEX:61...)
| tedunangst wrote:
| Would a regulated monopoly be subject to community decency
| rules? Would Twitter be fined every time somebody posts Janet
| Jackson's nipple?
| temp8964 wrote:
| I don't think market share is much relevant here. Can small
| cell phone carriers ban customers' text message conversations
| based on political opinion, but not AT&T?
|
| On page 8:
|
| > Why does the law preclude the companies from doing this--even
| when they're not monopolies, such as landline companies might
| be, but are highly competitive cell phone providers?
| throwawaysea wrote:
| Social media above a certain size should be treated as common
| carriers. There are no reasonable alternatives to them, and
| today, common public activities that are core to our lives are
| conducted on these large platforms. This is not a matter
| of protecting the freedom of companies to do as they wish - we
| already regulate companies and restrict their activities in many
| ways. Private power utilities cannot discriminate against their
| customers based on their speech or political viewpoints, for
| example. The same regulations can be enacted to govern these
| platforms.
|
| I often see arguments saying that someone who is
| deplatformed/demonetized on these services can just use an
| alternate service, but I find that to not be the case in
| practice. Consider that Twitter, Facebook, and YouTube have more
| users than virtually all _nations_. Their network effects are
| core to what the product is, which is why there aren't suitable
| alternatives (especially when they enact censorship in unison).
| Telling someone to just go use a different platform is like
| telling someone that they don't need their power utility, since
| they can just stick a windmill on their property instead.
|
| Finally, I am greatly concerned that these large privately-
| controlled platforms are essentially carrying out outsourced,
| government-driven censorship and also violating election laws.
| For example, when conservatives did form their own platform,
| Parler, AOC called for the Apple and Google app stores to ban
| Parler after
| the Jan 6 capitol riot (https://greenwald.substack.com/p/how-
| silicon-valley-in-a-sho...). If a sitting member of the
| government pressures private organizations to censor others, it
| should be considered a violation of the First Amendment. Even
| leaving aside the technicalities of the law, it is unethical and
| immoral, and completely in conflict with classically liberal
| values. Actions taken by these companies to suppress certain
| political speech in this manner also amount to a donation to the
| other side. This isn't recognized as "campaign funding" but it is
| probably more effective than campaign funding at this point. We
| need to do a better job of recognizing the gifts-in-kind coming
| out of Silicon Valley tech companies towards political parties
| based on the ideas they suppress/amplify/etc.
| Server6 wrote:
| Here's the thing, you don't NEED social media. It's not a
| utility. Your quality of life won't diminish if you don't have
| access to it. I suggest deleting it and not using it for a few
| months and see how you feel. I 100% guarantee your mental
| health will improve.
| throwawaysea wrote:
| I understand where you're coming from, and I probably would
| agree that mental health would improve. But the reality is
| that the majority of political discourse today happens online
| on these digital platforms. It is only going to get more
| digital as Gen-Z comes of age. So while my immediate short-
| term mental health might improve, I also feel that not being
| present on social media means that I would be participating
| less in our democratic process, my views would be less
| represented, and in the long-term my quality of life may
| change in a negative way as a result. It's also why I think
| the ban of Donald Trump is unacceptable - for all practical
| intents and purposes, these social media platforms now
| gatekeep the most fundamental (political) processes
| underlying society.
| slownews45 wrote:
| The contribution of social media to our "democratic
| process" is to poison it.
|
| Too many twitter warriors banging away on keyboards.
|
| You AND SOCIETY will benefit if you check out of it and
| just meet your neighbors.
| dane-pgp wrote:
| > You AND SOCIETY will benefit if you check out of it and
| just meet your neighbors.
|
| Not if your neighbors are staying at home consuming
| corporate/government approved social media messages.
| uniqueuid wrote:
| For context: This paper argues that online platforms constitute a
| sort of "infrastructure" similar to utilities, and that they
| should be regulated to guarantee equal access.
|
| This idea is not new - it has been discussed under different
| terms, e.g. in relation to "must-carry" rules for cable
| providers.
|
| To simplify a bit, the argument boils down to the question of
| liability and responsibility for content curation. Platforms
| either curate content and are liable, in which case they may be
| held accountable under media regulatory norms; or they don't
| curate and provide equal access, in which case they aren't
| liable and are free from media regulatory standards.
|
| Note that the US' pretty exceptional take on free speech (as an
| untouchable right) very much complicates this simplified
| description.
| no-dr-onboard wrote:
| Having only read the introduction of this paper myself, this
| seems to be an apt summary of its contents.
|
| Also worth noting that the language in this paper is very
| approachable and rife with footnotes that often point to past
| legal decisions, if not precedents. Even if you don't agree
| with the thesis of this paper, it still makes for an eloquent
| and informative read of one side of the aisle.
| uniqueuid wrote:
| Well, it is a scholarly law paper.
|
| [edit] To clarify: Being a scholarly law paper means that (1)
| it is well-documented and thoroughly researched, but also (2)
| it may employ words that seem to have a common-sense meaning
| but really don't. Things such as "fair", "bias", "access",
| "responsible" etc. have very precise legal meanings that are
| not readily apparent.
| jasode wrote:
| > _...and that they should be regulated to guarantee equal
| access._
|
| You mentioned the UK BBC in another comment. Here are some
| examples of BBC censorship:
| https://en.wikipedia.org/wiki/Censorship_in_the_United_Kingd...
|
| If the government-run BBC sometimes censors certain topics, I'm
| confused as to how that _same government_ can act as a watchdog
| & enforcer making sure private corporations designated as
| "common carriers" do not censor.
|
| In other words, what higher power over the UK government forces
| them not to censor? There's an inherent contradiction in
| enforcing an "equal access" law when the government itself
| doesn't follow it. This has unavoidable effects on government
| regulation of the private corporations it oversees.
|
| The "common carrier" designation is easy to implement when
| communication is _point-to-point_ with paid subscriptions (e.g.
| telephones) -- instead of _broadcast_ funded with ads or
| government taxes (e.g. Facebook/Twitter, BBC). There is no
| government in the world that allows _broadcast_ of any topic
| without interference.
| amadeuspagel wrote:
| > The "common carrier" designation is easy to implement when
| communication is point-to-point with paid subscriptions (e.g.
| telephones) -- instead of broadcast funded with ads or
| government taxes (e.g. Facebook/Twitter, BBC). There is no
| government in the world that allows broadcast of any topic
| without interference.
|
| The paper deals with the supposed distinction between point-
| to-point and broadcast. In brief, it's not very clear.
| Consider the postal service: If millions of people subscribe
| to a magazine, is that point-to-point communication or
| broadcasting? How is it different from millions of people
| subscribing to a youtube channel? Or following someone on
| twitter?
| uniqueuid wrote:
| This is a tangential argument, but of course government-
| provided media are not devoid of censorship/bias. That's
| because nobody is. The market is _also_ not devoid of
| censorship/bias, neither in its supply of information nor in
| the resulting consumption.
|
| Public service broadcasters are not about alleviating
| censorship; they are primarily a means of _providing_ a solid
| base level of access to information. They cannot reasonably
| provide access to all information.
|
| Your remarks on oversight are a very valid concern; that's
| why Germany, for example, has publicly funded (not state-
| funded; they get a mandatory fee from citizens over which the
| state has no control) but state-independent public service
| institutions.
|
| There's much more to the argument, but a central point here
| is the shift from regulating supply to regulating
| consumption. One can trivially argue that platforms are
| harmless because any censorship they implement is not _total_
| in the sense of absolute government censorship - you're free
| to publish your stuff elsewhere.
|
| But today, more and more countries are seeking to ensure
| healthy _consumption_ of information, and to enable this,
| they need to intervene in platforms' content curation.
|
| It's a dangerous argument, of course, because this is
| precisely what totalitarian states are doing. But - and this
| is important - regulating content is something that
| democracies have always had to do, e.g. banning libelous
| content, revenge porn etc. etc.
| HPsquared wrote:
| It's more like publishing than broadcast. Somewhere in
| between those two I think.
|
| Edit: the difference comes in where there are feeds and
| recommendations popping up to users who haven't previously
| subscribed to something. That and the volume/media.
| Causality1 wrote:
| I think this is only an issue because there is no public
| digital infrastructure. We need an "internet post office" whose
| only rules are the law, which operates at cost for users, and
| which is completely free of liability. For example, you can't
| sue the post office if someone mails a pirated DVD.
|
| Would it be used almost exclusively by extremists and lunatics?
| Probably, but I still think it's worth having if only in light
| of all the HN stories about how some account slip-up at Google
| utterly ruined someone's life. Everyone deserves an email
| address they can never lose access to.
| cvwright wrote:
| Interestingly, there are hints that some countries are
| looking into providing Matrix accounts for all of their
| citizens.
|
| I don't think there's been a formal announcement yet, but it
| was mentioned in one of their Matrix Live videos a couple of
| weeks ago.
|
| Personally, I see pluses and minuses. Yay, free Matrix for
| everyone. Boo, your government can monitor everything you do
| on that server.
| uniqueuid wrote:
| Whether this is a good idea is seen differently in different
| countries.
|
| Countries with strong public service broadcasters (e.g. the
| BBC in the UK) tend to consider state-run information
| infrastructure a good idea, at least in a dual system
| including private corporations.
|
| Countries without public service media see private market
| actors as perfectly sufficient.
|
| Needless to say, the US is squarely the second type.
| dageshi wrote:
| Isn't that sort of basically email now? If you sub to the
| mailing list of "extremists and lunatics" then you'll
| probably get your email on extreme and lunatic issues.
|
| The issue is, the extremists and lunatics don't just want
| email, they want Twitter, Facebook and all the inherent
| amplification capabilities of both. Not just 1-to-1 messages,
| not even 1-to-many messages, but full-on advertising to
| potentially interested users, the same as the cat pics/videos
| get.
| satyrnein wrote:
| Do extremists and lunatics want their email handled by the
| government?
| ergot_vacation wrote:
| The advantage of the paper is that it has a great deal of
| detail and research, done apparently by someone with actual
| knowledge of the law and case history in question. Usual
| discussions of the common carrier or public space argument tend
| to stop at a surface level because everyone involved in the
| discussion only has basic knowledge and training, so it's
| interesting to see it more carefully examined.
|
| As just one example, it argues that social media companies are
| not necessarily protected from being compelled to host content
| they disagree with simply because it would be "compelled
| speech," as stated in the rejection of recent Florida
| legislation. There are a number of existing cases where
| entities were compelled to do just that because they were
| operating a public space, even if privately owned (a shopping
| mall for example). Agree or disagree, this is information I
| wasn't aware of (and probably a lot of readers here weren't
| either), so it's interesting information to have.
___________________________________________________________________
(page generated 2021-07-07 23:01 UTC)