[HN Gopher] Control as Liability
___________________________________________________________________
Control as Liability
Author : timdaub
Score : 30 points
Date : 2021-12-24 13:00 UTC (2 days ago)
(HTM) web link (vitalik.ca)
(TXT) w3m dump (vitalik.ca)
| Animats wrote:
| Well, yes, in the money area. It's generally accepted in finance
| that custody implies responsibility. It's taken a while for that
| to penetrate to the crypto sector. The earlier players, lacking
| assets, desperately tried to evade their responsibilities for
| other people's money. Now there are some players you can actually
| find and sue if they screw up.
| saurik wrote:
| This same kind of thought process--that one should strive not
| for "do no evil" but for "can't do evil"--applies everywhere:
| amassing control over other people and their resources (money,
| data, whatever) is always going to be dangerous.
|
| Maybe you are good today, but in the future you might start to be
| swayed by changing incentives or situations due to forces such as
| "absolute power corrupts absolutely".
|
| Or maybe you manage to always be good; but, as humans have fixed
| life spans, you eventually retire, die, or simply move on, and
| are replaced by someone who is less good than you are.
|
| Or maybe you are good but the power you manage to concentrate
| gets stolen by someone (in the digital world, maybe you get
| hacked) and used without your permission to do bad things.
|
| Or maybe you want to be good, but your power is seen as an asset
| for something external--such as a government--and you end up
| being required to do bad things that make you sad.
|
| We see all of these issues play out constantly with large tech
| companies, with control techniques such as curated application
| markets getting abused as anti-competitive measures, or getting
| regulated by authoritarian governments as a tool for their
| regime.
|
| In 2017, I gave a talk at Mozilla Privacy Lab that looked at many
| of these issues, citing tons of situations--every slide is a
| screenshot of a news source, as somehow people always want to
| believe these situations are far-fetched--where having control
| has gone badly:
|
| https://youtu.be/vsazo-Gs7ms
| meheleventyone wrote:
| But there is also "can only do evil" where you accidentally set
| up a system you can't fix.
| saurik wrote:
| If users agree that the thing you have built is broken and
| want to opt into using a new, less-broken thing, they can
| always do that--as would be the case with, say, software they
| download from you and run on their own computers (rather than
| software you host on your server, which you can change at
| will). You don't need the power to "reach your grubby mitts"
| --to put it bluntly--into their lives and fix what you built
| on their behalf in order for broken things to be fixed; and
| your definition of what is "broken" can easily be at odds
| with the user's preferences or even needs (which of course
| raises the entire question of how to avoid the incentive to
| be evil in the first place).
| pphysch wrote:
| Power as Responsibility
| chubot wrote:
| Similar idea to "big data" as a toxic asset.
|
| https://www.schneier.com/blog/archives/2016/03/data_is_a_tox...
|
| Once you collect data, it can become very attractive to various
| parties, like nation states and snooping employees. Google found
| this out the hard way multiple times!
___________________________________________________________________
(page generated 2021-12-26 23:00 UTC)