[HN Gopher] 'Contextualization Engines' can fight misinformation...
___________________________________________________________________
'Contextualization Engines' can fight misinformation without
censorship
Author : laurex
Score : 51 points
Date : 2021-04-26 16:38 UTC (6 hours ago)
(HTM) web link (aviv.medium.com)
(TXT) w3m dump (aviv.medium.com)
| standardUser wrote:
| This already exists. Unless I am missing something, we've already
| been seeing these on major platforms like Facebook and Instagram
| (if not others) for at least a year now. I've noticed it mostly
| on posts about the 2020 election and COVID-19 facts/news.
|
| I mostly like the approach. It seems like the least restrictive
| and least objectionable solution (or partial solution) to an
| incredibly tricky and important problem.
| JI00912 wrote:
| I may be misunderstanding this post, but Facebook's information
| about covid and the last US election has been terrible.
| standardUser wrote:
| I'm talking specifically about the little info blurbs that
| Facebook automatically attaches to posts about COVID. They
| are links to factual information. I'm not talking about
| information that other people share about COVID on Facebook.
| the_lonely_road wrote:
| I spent a good amount of time trying to build a contextualization
| engine for a different reason (auto correct, sick to death of it
| being so bad when I'm clearly talking in a very specific context
| bubble using these same words that everyone else is using over
| and over), and man, can I say it is extremely difficult to
| get 5 people in a room to not define 5 different context buckets
| for a specific piece of text in a given situation.
|
| In the end I decided that the real world phenomenon I was trying
| to model was probably closer to Google+-style multiple identities
| than it was to a single channel like Facebook.
| adamrezich wrote:
| whenever I hear talk about "creating context" with regards to the
| vast amounts of conflicting information on the Internet, I can't
| help but recall the infamous scene near the end of 2001's Metal
| Gear Solid 2: Sons of Liberty, wherein protagonist Raiden talks
| to an AI that has kind of actually been behind everything up to
| this point (keeping in mind that this game was released long
| before the modern social media-oriented Web) (emphasis mine):
|
| ---
|
| AI: The mapping of the human genome was completed early this
| century. As a result, the evolutionary log of the human race lay
| open to us. We started with genetic engineering, and in the end,
| we succeeded in digitizing life itself. But there are things not
| covered by genetic information. [...] Human memories, ideas.
| Culture. History. Genes don't contain any record of human
| history. Is it something that should not be passed on? Should
| that information be left at the mercy of nature? We've always
| kept records of our lives. Through words, pictures, symbols...
| from tablets to books... But not all the information was
| inherited by later generations. A small percentage of the whole
| was selected and processed, then passed on. Not unlike genes,
| really. That's what history is [...] But in the current,
| digitized world, trivial information is accumulating every
| second, preserved in all its triteness. Never fading, always
| accessible. Rumors about petty issues, misinterpretations,
| slander... All this junk data preserved in an unfiltered state,
| growing at an alarming rate. It will only slow down social
| progress, reduce the rate of evolution. Raiden, you seem to think
| that our plan is one of censorship.
|
| Raiden: Are you telling me it's not!?
|
| AI: You're being silly! _What we propose to do is not to control
| content, but to create context._
|
| Raiden: Create context?
|
| AI: The digital society furthers human flaws and selectively
| rewards the development of convenient half-truths. Just look at
| the strange juxtapositions of morality around you. Billions spent
| on new weapons in order to humanely murder other humans. Rights
| of criminals are given more respect than the privacy of their
| victims. Although there are people suffering in poverty, huge
| donations are made to protect endangered species. Everyone grows
| up being told the same thing. "Be nice to other people." "But
| beat out the competition!" "You're special." "Believe in yourself
| and you will succeed." But it's obvious from the start that only
| a few can succeed... You exercise your right to "freedom" and
| this is the result. All rhetoric to avoid conflict and protect
| each other from hurt. The untested truths spun by different
| interests continue to churn and accumulate in the sandbox of
| political correctness and value systems. Everyone withdraws into
| their own small gated community, afraid of a larger forum. They
| stay inside their little ponds, leaking whatever "truth" suits
| them into the growing cesspool of society at large. The different
| cardinal truths neither clash nor mesh. No one is invalidated,
| but nobody is right. Not even natural selection can take place
| here. The world is being engulfed in "truth." And this is the way
| the world ends. Not with a bang, but a whimper. We're trying to
| stop that from happening. It's our responsibility as rulers. Just
| as in genetics, unnecessary information and memory must be
| filtered out to stimulate the evolution of the species.
|
| Raiden: And you think you're qualified to decide what's necessary
| and not?
|
| AI: Absolutely. Who else could wade through the sea of garbage
| you people produce, retrieve valuable truths and even interpret
| their meaning for later generations? _That's what it means to
| create context._
|
| ---
|
| pretty prophetic for a 2001 video game
| https://www.youtube.com/watch?v=C31XYgr8gp0
| marcodiego wrote:
| There is something that can fight misinformation without
| censorship: a mix of culture + education. The scientific method,
| critical thinking, logic, and statistics, coupled with a bunch
| of historical examples of non-intuitive results and
| demonstrations of situations where correlation does not imply
| causation, should be taught in schools repeatedly and from an
| early age.
|
| Most adults don't even know what a double blind study is. It is
| not surprising that so many people are falling prey to covid
| quackery.
| keiferski wrote:
| I don't think many people working in tech understand the nature
| of the Internet when it comes to information. The cat is out of
| the bag and there is no returning to a world of universal
| "authoritative sources" (term used in the article.)
|
| Framed historically, we are experiencing a similar situation to
| the invention of the printing press. All attempts to reel in
| "misinformation" by appealing to legacy media outlets are doomed
| to fail, precisely because these outlets have lost the trust they
| once had. I don't see them regaining it in the near future, if at
| all.
|
| Instead, the likely outcome is a constellation of reputation-
| based media sources. "Joe Brown is always honest, so I'll see
| what he says in his podcast" is probably the future source of
| information for the average person. Fact checkers will never be
| anything other than a biased opinion of a particular institution,
| as has been shown time and time again.
| noofen wrote:
| Has anyone considered that the "rise of misinformation" and
| Qanon-type conspiracy discourse is just a symptom of elite and
| institutional decadence?
| WalterGR wrote:
| _elite and institutional decadence_
|
| What does that mean? I googled "institutional decadence" but
| am still not clear on the definition. Could you give some
| examples of elite and institutional decadence?
| keiferski wrote:
| Not the OP, but this just refers to the gradual erosion of
| many institutions in American society. Universities, media,
| political structures, etc. have all lost the trust of the
| populace over the last ~30 years, if not longer.
|
| Whether this is from an actual lowering of standards or
| just an inevitable consequence of new media is a different
| question. I'd suggest reading works by a critic named John
| Simon for the former opinion.
| WalterGR wrote:
| Hmm... When I think of decadence I think of Marie
| Antoinette allegedly asking "Why don't they just eat
| cake?" Like, actual decadence. In this context is
| "decadence" being used as a loaded word? At university I
| never saw professors or administrators popping Cristal
| and making it rain hundos, for example.
|
| (Edit: For the record, I didn't downvote you. One can't
| downvote a comment one is responding to.)
| keiferski wrote:
| I haven't really followed the Q-Anon thing well enough to
| know much about it, but it definitely does strike me as a
| consequence of institutional decadence.
|
| I'm not sure the content of the conspiracy is actually
| relevant. Most people _want_ to trust the information they're
| told, so the popularity of such things does seem like a
| symptom and not a cause.
| barbazoo wrote:
| Education, to me, seems like the perfect way to "fight
| misinformation without censorship". Teach our kids how to read,
| how to understand, how to contextualize and think critically. We
| don't need big tech to come up with a solution to a problem that
| big tech caused to some degree.
| potatoman22 wrote:
| As much as I hated it as a kid, I think literary analysis and
| semiotics is really beneficial in this manner.
| narag wrote:
| I agree but you also need to teach them that neither side of
| the political divide is right on everything. Good luck getting
| support for that.
| barbazoo wrote:
| Believing in there being 2 sides is probably a good indicator
| that we've already failed.
| narag wrote:
| There are more than two in my country, I was just
| "translating"...
| amalcon wrote:
| Let's suppose I am Google. I am in arguably the best position in
| the world to build this system, because I have PageRank. PageRank
| is (for the sake of argument) the closest thing yet invented to
| an unbiased measure of authoritativeness (though still not _that_
| close). I also have the most comprehensive map of the Web ever
| made, because that's the input to PageRank.
|
| I could imagine building a contextualization machine out of this.
| Oversimplified version: Look for cases where an authoritative
| link frequently appears next to a link to the page in need of
| context, and surface those authoritative links.
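The "oversimplified version" above can be sketched in a few lines. This is an illustrative toy only: the crawled pages, the authoritative set, and the co-occurrence scoring are assumptions made up for the example, not anything Google actually does.

```python
# Toy sketch: given a link needing context, count how often each
# trusted link appears alongside it on crawled pages, and surface
# the most frequent co-occurring trusted links.
from collections import Counter

# Hypothetical crawl: each page is the set of URLs it links to.
crawled_pages = [
    {"example.com/claim", "cdc.gov/facts", "blog.example/post"},
    {"example.com/claim", "cdc.gov/facts"},
    {"example.com/claim", "who.int/briefing"},
    {"other.site/page", "cdc.gov/facts"},
]

# Links we (somehow) already trust; choosing this set is the hard part.
authoritative = {"cdc.gov/facts", "who.int/briefing"}

def contextualize(target, pages, trusted, top_n=3):
    """Return trusted links that most often co-occur with `target`."""
    counts = Counter()
    for links in pages:
        if target in links:
            counts.update(links & trusted)  # count trusted co-links
    return [url for url, _ in counts.most_common(top_n)]

print(contextualize("example.com/claim", crawled_pages, authoritative))
# ['cdc.gov/facts', 'who.int/briefing']
```

The hard problems the sketch hides are exactly the ones debated in this thread: who picks the trusted set, and how co-occurrence is weighted.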
|
| The good news / problem is that you can already use Google this
| way. If you google a given URL, you'll get results of people
| talking about it! E.g. the current top article on HN is:
|
| https://www.apple.com/newsroom/2021/04/ios-14-5-offers-unloc...
|
| If I google this, I get a Reddit thread discussing this, two
| misses (MacRumors frontpage and a CNBC index page), and a Korean
| page. The last might be fine, but I can't read it so I don't
| know. Maybe the Apple announcement is just too new, but this
| doesn't seem super useful.
|
| Of course, maybe there are refinements one could make for this
| use case. Maybe this technique is just more useful for political
| or science topics. I don't know. I tend to doubt it, though.
| ergot_vacation wrote:
| The problem of course is that a certain crowd runs most large
| sites, and as seen on Youtube, if things they dislike become
| too popular they will do whatever they can to suppress that
| popularity.
|
| Put another way, there are two definitions of "Authoritative":
| what does the raw data suggest, and what does the echo chamber
| I've surrounded myself with and become convinced is reality
| say. I think we all know which standard would be used if such a
| system was implemented (and of course, it's already been
| attempted in small ways).
| motohagiography wrote:
| Interesting that it's possible, but it's probably not ethical to
| misdirect people about the intent of this. Some people experience
| what we might see as misinformation as their culture and
| identity, and pretending we aren't building technologies to
| detect and exterminate it is pretty dodgy. Unless we think these
| tools will only ever be in the hands of 'good' people?
|
| Being able to detect the tribal flavor of information so that you
| can filter it out is what got us into this mess in the first
| place because it indexed on creating outrage bubbles for clicks.
| Growth comes from opportunity, which usually comes from something
| being shitty, so militating and cleansing the world of badness
| isn't doing anyone any favors except consolidating the role of
| the cleaners.
|
| When you ask the question, "whose contextualization engine?" and
| the whole premise falls apart, it becomes clear they have earned
| any mockery these engines are designed to shield them from.
| lindy2021 wrote:
| The real test of a fair system is to build your
| "contextualization engine" and then hand it over to your
| ideological enemies to run.
| MayeulC wrote:
| Would you hand the wheel over to your enemies? Agriculture?
| Writing?
|
| Probably not, if they are assured to remain your enemies.
| Wars have almost always been decided on technological
| capabilities alone, with a few counter-examples. You asked an
| interesting question, if only a bit narrow.
|
| Going further, most technologies can be directly used for
| ill.
| max-ibel wrote:
| I cut -- you choose?
|
| I like it ;) But it will probably never happen.
| toss1 wrote:
| When your "culture and identity" is based on relying on blatant
| disinformation, whether it is that the earth is flat, the govt
| is poisoning you with chemtrails, or NASA is fake, that is
| already getting corrosive to society.
|
| When it extends to anti-vaccine disinformation, it is an active
| threat not only to public health but national security. Many of
| these types of disinformation are actively weaponized and
| spread by adversaries.
|
| It is right to treat those kinds of threats to society and the
| nation as we would treat any physical threat -- actively work
| to neutralize it.
|
| Just because war is moving out of the physical and into the
| information sphere does not make it less serious.
| motohagiography wrote:
| The principle I'm appealing to is that we have to ask whether
| it's appropriate or desirable to have that technology applied
| to us(*). It's not disinformation if it's just outgroup
| culture.
|
| It seems unwise to imagine oneself engaged in a war because
| it just licenses your opponent. The idea of war is that it is
| the extension of policy, which in every other circumstance
| can be negotiated, but if instead of negotiating policy,
| these same people are engaged in a deception to exterminate
| or subordinate an opposition, that's not policy, that's just
| a power struggle for its own end, and it's not something to
| be reconciled.
|
| If you want to treat fellow citizens as an enemy force, say
| so, and they will come to the table as one. But if you want
| to just deceive them, I'd wonder whether we were the baddies.
| We probably don't want to make civilian technologists legit
| targets in this war either.
|
| Surely the people developing or advocating these techs are
| capable of enough self awareness to recognize how every
| example of "blatant disinformation," has an equivalent worthy
| of targeting, so introducing them as anything other than
| abstractions doesn't really yield perspective.
| visarga wrote:
| > If you want to treat fellow citizens as an enemy force,
| say so
|
| Dividing people is a bad idea because we have much more in
| common than is different between us. We should always try
| to be more inclusive and tone down the class wars and
| cancellations. We probably won't make any social progress
| while throwing stones into the opposing tribes instead of
| working together.
|
| But you can't be inclusive of falsehoods; you can only be
| open to legitimate concerns raised by the various groups,
| because any solution we find needs to be viable in the real
| world.
|
| I don't trust revolutionaries - I've been burned before,
| and also read history. Revolutionaries, social warriors and
| activists love revolution more than peace itself. If we
| want to succeed we need to win the peace, not the war.
| freeflight wrote:
| _> It's not disinformation if it's just outgroup culture._
|
| Just because wrong information can become outgroup culture
| does not suddenly make the wrong information true.
|
| Particularly when it stipulates world views that vilify
| whole other groups as allegedly being responsible for
| everything wrong.
|
| The end result is not just a culture that might endorse
| violence against others, but something that could probably
| be described as a cargo cult where wrong information
| becomes dogmatic tradition.
|
| Antisemitism has already taken such a shape, most if not
| all of the theories voiced by some modern outgroups are
| just reboots of allegations that are centuries old and have
| justified violence for just as long.
|
| Case in point: The whole "elites abducting children to
| sacrifice them" theory, which regained popularity trough
| QAnon, is just a more modern take on the William of Norwich
| blood libel [0].
|
| That has been around for close to a millennium because it's
| an idea that survives in certain religious outgroup
| cultures. Do we really have to tolerate the intolerance to
| be considered tolerant? Is there really no rough line that
| can be drawn?
|
| [0] https://en.wikipedia.org/wiki/William_of_Norwich
| motohagiography wrote:
| I'd say that Popper's Paraphrase isn't a useful argument
| when it reduces to just a first-accusers advantage. It's
| a small mercy that he died before seeing what his work
| would be reduced to.
|
| Tolerance is meaningful when it is mostly unbearable,
| otherwise it's just preference. That idea is elaborated
| well in SSC's "I can tolerate anything but the outgroup."
| The relative truth of wrong information is a glass house
| I don't think advocates of censorship are intellectually
| equipped to defend. Their only tool is deception, and
| inevitably force, and the points we can score on them
| have no value.
|
| The deeper controversy is whether on the internet we will
| accept a rules based order, or discretion. It's the old
| "rule of law vs. rule of men" issue. What has changed in
| the last 140 years or so is of course, total, instant
| globalization, with the internet, but more subtly, that
| the language used to reason about these things has itself
| become unmoored.
|
| What I think we should all reflect on is how as
| individuals we can become a bit less enthusiastically
| murderable to one another, before worrying about how we
| can change what others think. It's the one problem we can
| all make progress on.
| gedy wrote:
| This sounds "right" but then how would you feel applying this
| things like native American tribal beliefs, e.g. regarding
| genetics and migration. Those may be scientific facts but
| many reject them due to fearing further dispossession of
| their land, etc.
| pessimizer wrote:
| You're clearly an information warrior, but what if I think
| that you're the one spreading misleading information?
|
| It's weird that you fully expect that you, or someone else
| who agrees with you, will be in control of this.
| toss1 wrote:
| Whether you think that I'm spreading disinformation, or
| whether it is someone else is easily determined by looking
| to the external facts.
|
| The fact is that the earth is (nearly) round, contrails are
| not chemical sprays, Capricorn One was a C-grade movie
| while Apollo 11 went to the moon, and that the anti-vax
| movement was started by a fraud attempting to discredit
| existing formulas to promote a formula in which the paper's
| author had a vested interest (and which paper was
| retracted, along with his MD and license) and that vaccines
| do work.
|
| That is the point. A society cannot remain cohesive if
| there are no shared truths.
|
| You may not want to take it seriously, but I'd suggest you
| consider the following quotes, and why and how
| disinformation can be weaponized.
|
| "Anti-intellectualism has been a constant thread winding
| its way through our political and cultural life, nurtured
| by the false notion that democracy means that 'my ignorance
| is just as good as your knowledge.'" -- Isaac Asimov
|
| "You are entitled to your opinion. But you are not entitled
| to your own facts.". -- Daniel Patrick Moynihan.
| motohagiography wrote:
| > A society cannot remain cohesive if there are no shared
| truths.
|
| On this we agree. This truth needs to be on a scale of
| one, some, or all truths that are shared. Let's
| disqualify all, and simplify to finding an example of one.
|
| The other thing about the truth is that it probably
| shouldn't be falsifiable, as it's a bit of an all-in bet
| on whatever that is. It shouldn't be a complicated
| mystery because then you are back to the relative problem
| of who has discovered more and how do you know. The
| perfect example truth is the one that is not knowable
| either way, but something that can be believed.
|
| We should probably recognize that destabilizing that
| example shared truth is an attack on social cohesion as
| well. Can't think of an example at the moment, but I
| certainly agree it would be the One Important Thing you
| would need to believe, or you're basically left with a
| kind of hellscape.
| BitwiseFool wrote:
| Everyone needs to pick and choose which
| information/disinformation battles they want to take part in.
| Flat-Earthers, Chemtrail People, Moonlanding Hoaxers, these
| people aren't really worth anyone's time. How does someone
| else believing the world is flat impinge on your ability to
| live your life?
|
| Now, I can understand wanting to combat vaccine hesitancy
| because that does tangibly affect society at-large (recent
| measles outbreaks). But most of the disinformation boogeymen
| you listed are just people with kooky ideas who aren't
| persuading society.
|
| Lastly, it's okay to be wrong about things. The body of human
| knowledge is so vast that everyone is bound to be more wrong
| than right.
| toss1 wrote:
| There is a difference between 1) simply being wrong by
| mistake vs 2) deliberate and maintained ignorance vs. 3)
| deliberately deceiving people and poisoning the agora.
|
| Mere ordinary wrongness is a big enough problem. And to the
| degree that it is mere cultural ignorance, it can be merely
| annoying.
|
| But intentionally spreading disinformation, fear porn, and
| cultivating false outrage to inflame divisions in society
| is much more serious, and should not be conflated with mere
| buffoonery.
| keiferski wrote:
| This line of thought will both be extremely ineffective (and
| further incentivize misinformation) and highly unjust. At
| some point you need to stop and realize everything isn't a
| conflict that necessarily has losers to be "neutralized."
| toss1 wrote:
| Agree, it only becomes a conflict when some parties are
| deliberately weaponizing disinformation.
|
| Mere ignorance is something we must unfortunately tolerate.
|
| Actively working to spread disinformation, however, will
| literally kill people - see [1] (or just search for it): a
| mere dozen people are responsible for 65% of the spread of
| disinformation about the COVID-19 vaccines, and are working
| to deliberately undermine vaccination efforts.
|
| This isn't mere ignorance, it will get people killed. And
| before you say something like "well the fools that believe
| that idiocy deserve what they get", consider that while
| they may (sort of their choice), the family, friends,
| neighbors, first responders, and healthcare workers do not
| deserve to be exposed unnecessarily to those biohazards.
|
| [1] https://www.theguardian.com/world/2021/jan/06/facebook-
| insta...
| kodah wrote:
| > Agree, it only becomes a conflict when some parties are
| deliberately weaponizing disinformation.
|
| Agree, it only becomes a conflict when both major parties
| are deliberately weaponizing disinformation.
|
| I think other commenters have warned about this. It's
| fairly easy to tell what political allegiances you have.
| Just understand that your parties spread a lot of
| disinformation as well. As an independent, it makes me
| cringe to watch liberals act like they aren't responsible
| for loads of disinformation, hyperbole, fear mongering,
| and misinformation too. Much less that it doesn't have
| Russian origins.
| keiferski wrote:
| Every single revoking of civil liberties (e.g.
| censorship) has been done in the name of safety. That
| doesn't make it acceptable.
|
| If you want to combat misinformation, pick a better
| solution. And as I said above, censorship is pretty much
| impossible, both legally and technologically. It simply
| cannot be done.
| toss1 wrote:
| Did you even read the article?
|
| The ENTIRE POINT of the idea is to NOT censor anything,
| but to add context -- adding information, not censoring
| it. Letting people make their own decisions.
|
| The problem here is that there is a massive asymmetry.
| This has been noticed for centuries with the maxim "A Lie
| Can Travel Halfway Around the World While the Truth Is
| Putting On Its Shoes" (some interesting history on that
| at [1]).
|
| This is because our brains are tuned to seek out novelty,
| and to seek out information on threats.
|
| This enables those who want to manipulate us and
| especially the masses -- creating fake news and fear porn
| will get the attention of pretty much everyone, and
| anyone who is not both actively better informed at the
| time, and actively thinking, will be inflamed and
| motivated.
|
| Voltaire pointed out that "Those who can make you believe
| absurdities, can make you commit atrocities."
|
| The people spreading disinformation treat that as an
| instruction instead of a warning.
|
| Adding this kind of contextual information seems a far
| better solution than censoring.
|
| [1] https://quoteinvestigator.com/2014/07/13/truth/
| keiferski wrote:
| Yes I read the article, but I was replying to your
| comment, not the article's argument. I replied to that
| directly in another comment.
|
| You seemed to be arguing for censorship, which is what I
| was replying to.
| toss1 wrote:
| Please read more carefully.
|
| I am absolutely not arguing for censorship, especially
| since censorship creates its own blowback, which is
| often even more helpful to the disinformation.
|
| I am merely arguing that disinformation needs to be
| fought, especially actively weaponized disinformation.
|
| It looks like this contextualization could be a very good
| method.
| andrewjl wrote:
| > Being able to detect the tribal flavor of information so that
| you can filter it out is what got us into this mess in the
| first place because it indexed on creating outrage bubbles for
| clicks.
|
| What you're referring to as outrage bubbles has come about due
| to a lack of contextualization. One topic where this is commonly
| seen is discussion of GMOs, where both pro-GMO and anti-GMO
| activists provide misleading facts as supporting evidence in
| their discussion. In this example and also when it comes to
| politics, a contextualization engine would be very effective in
| spurring dialogue and getting people to see multiple sides of
| an issue.
|
| > When you ask the question, "whose contextualization engine?"
| and the whole premise falls apart, it becomes clear they have
| earned any mockery these engines are designed to shield them
| from.
|
| Where I personally differ from some of the social networks on
| this is that I think these should be on by default but also come
| an off switch. Meaning the user always has the ability to turn
| them off, but they need to take an active step in order to do
| so. IMO it's a good compromise.
| gweinberg wrote:
| There are pro-GMO activists?
| groby_b wrote:
| Can we stop saying it's due to "lack of contextualization" as
| if it's something that just oddly happened?
|
| The reason we lack context is that the world moved from
| chronological feeds to "engagement" based feeds. Anything
| that could have served as context gets optimized out, because
| more clicks. We're not naturally polarized - we have created
| an environment that favored the more outrageous claims,
| slowly removing the middle ground as if it didn't exist.
|
| What we need isn't a "contextualization engine", it's getting
| rid of active efforts to decontextualize.
| wmf wrote:
| Yeah, people won't react well when the fact checker tells them
| that _everything they read_ is fake news.
| endymi0n wrote:
| "If conspiracy theorists become convinced that their facts
| are fake, they will not abandon conspiracy theories. They
| will reject the authority of the fact checkers."
|
| - Loosely paraphrased after David Frum
|
| https://www.reddit.com/r/LeopardsAteMyFace/comments/mmy39q/o.
| ..
| JI00912 wrote:
| It would be great if we could all just turn to an objective
| entity with all the facts. But there's no such person.
| Certainly not the average journalist. There's just no such
| thing as a fact checker.
| sofixa wrote:
| Everyone has their biases, but most are able to look past
| them in the face of facts. And the good thing about facts
| is that they're objective and indisputable.
|
| Moon landing ? Fact. Trump being US president? Fact.
| France winning the 2018 World Cup? Fact.
|
| So no, fact checkers do exist and are very important.
| generalizations wrote:
| As long as the fact checkers stick with facts that are as
| obvious as the ones in your examples, then sure: they're
| objective and indisputable.
|
| However, people do dispute facts, and anything someone
| disputes is no longer in the purview of the fact checker,
| _by your definition_.
| argvargc wrote:
| All of those examples could be invalidated with new
| information.
|
| As inconvenient as it may be, truth is transitory and
| progress toward it is frequently made through
| invalidation. It's more prudent and useful to assume
| facts don't exist.
|
| As for "fact-checkers", who fact-checks _them_? (Etc...)
| The notion of visiting a webpage authored by God-knows-
| who to check whether what's on another webpage authored
| by God-knows-who is correct or not is absurd.
|
| And we haven't even gotten into subjectivity or nuance.
| msla wrote:
| Is the idea that Jews are literal monsters part of someone's
| culture and identity?
|
| Is that culture and identity something the rest of the world
| can live with?
| lindy2021 wrote:
| Yes. Jews have been treated as literal monsters for millennia
| and the rest of the world has lived with it.
|
| https://en.wikipedia.org/wiki/Expulsions_and_exoduses_of_Jew.
| ..
| giantg2 wrote:
| "Is the idea that Jews are literal monsters part of someone's
| culture and identity?"
|
| _The idea that some people may believe that_ is central to
| some cultural identities. For example the Hasidic community,
| or the defense/military posture of Israel.
| ketzo wrote:
| For such a long piece, this is disappointingly light on any
| details or discussion about _how_ one would go about building a
| "Contextualization Engine."
|
| I mean, we all saw what happened with Microsoft's Tay Bot [1] --
| any system that attempts to factor in user input is ripe for
| abuse. Who would control what context is valid, and what context
| is discarded? If you decide to factor in "all" content, how do
| you rank it? Is it simply a factor of popularity, or are trusted
| sources given more weight? Who is a trusted source?
|
| I don't mean to sound overly pessimistic: I agree with the author
| that context is the biggest thing lacking from the internet
| today, and is a valuable tool for fighting misinformation. But
| any discussion like this needs to pretty immediately grapple with
| the intricacies of how to implement context -- and what context
| really _is._
|
| [1] https://en.wikipedia.org/wiki/Tay_(bot)
| throwawayboise wrote:
| Sounded like it was mainly reliant on some pre-selected
| whitelist of approved sources. Of course that raises the
| question of who approved those sources, and what were their
| motivations?
| giantg2 wrote:
| "The contextualization engine compares the content being shared
| with that from authoritative sources and provides articles or
| other media results that are sufficiently related."
|
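The mechanism the quote describes (comparing shared content against authoritative sources and surfacing "sufficiently related" results) could be sketched, purely as an illustration, with a bag-of-words cosine similarity. The sources, the threshold, and the similarity measure below are all assumptions for the sake of example, not the article's actual design.

```python
# Illustrative sketch: compare a shared post against a corpus from
# hypothetical "authoritative sources" and return the sufficiently
# related ones, using word-count cosine similarity.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def related_context(post: str, sources: dict, threshold: float = 0.3):
    """Return titles of source articles sufficiently similar to `post`."""
    post_vec = Counter(post.lower().split())
    return [title for title, text in sources.items()
            if cosine(post_vec, Counter(text.lower().split())) >= threshold]

# Made-up source corpus for the example.
sources = {
    "CDC vaccine FAQ": "covid vaccines are safe and effective",
    "Election audit report": "the 2020 election results were audited",
}
print(related_context("are covid vaccines safe", sources))
# ['CDC vaccine FAQ']
```

Even in this toy form, the open questions giantg2 raises are visible: the output is only as good as the chosen `sources` dict and the arbitrary `threshold`.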
| How are these authoritative sources chosen? This sounds like it
| would be ripe for manipulation. Plus, just relying on an appeal
| to authority as the validation of the information will not weed
| out mistakes or errors in the content.
|
| What are the standards for these other media results it will
| provide? News articles, blogs, studies, etc. Some might be better
| than others depending on the topic. The news and journalists are
| supposed to be the old school contextual engine. How do we see an
| automated version being any better quality?
| Fellshard wrote:
| It already is. Watch the Twitter 'trending' pane over time, and
| you'll notice that they add context in exactly one worldview
| direction, incurring massive editorialized anchor bias.
| ergot_vacation wrote:
| Exactly. The fundamental problem with the phrase
| "misinformation" is that what is or isn't "misinformation" is
| _subjective_. Truth and reality aren't subjective, but
| people's beliefs about the nature of them are, very much so.
| It's impossible to solve the problem by asking "The Experts" to
| adjudicate what is and isn't true, because The Experts have
| become too mired in corruption, greed, political intimidation
| and so on. They no longer represent anything even close to an
| impartial party, and thus are not useful tools for discerning
| reality anymore, at least in the traditional way of asking a
| question, getting an answer and accepting it.
| tryonenow wrote:
| >Systems built specifically for contextualization might not only
| support media literacy; they could also provide the data needed
| for fact-checkers to determine what to focus on, and could even
| help support the emotional literacy relevant to avoid harmful
| reactions to misinformation (from lashing out at loved ones, to
| terrorist radicalization).
|
| All I'm hearing is techno-authoritarians building a system to
| further suppress dissenting views. Even "authoritative" sources
| are supposed to disagree; that's how science works.
|
| I think the uncertain nature of reality is such that any
| "authoritative" source is likely to eventually make mistakes, and
| building rigid systems to hide alternative viewpoints is a form
| of social disenfranchisement.
| drngdds wrote:
| Why would this be any more trusted than the misinformation
| warnings Youtube and Twitter already put up automatically for
| certain subjects?
|
| And also, I don't know how anyone can even float the possibility
| of making something like this a legal requirement. The ways it
| could be abused are really obvious.
___________________________________________________________________
(page generated 2021-04-26 23:02 UTC)