[HN Gopher] Florida House of Representatives approves bill to ba...
       ___________________________________________________________________
        
       Florida House of Representatives approves bill to ban social media
       for kids < 16
        
       Author : vinnyglennon
       Score  : 276 points
       Date   : 2024-01-29 13:20 UTC (9 hours ago)
        
 (HTM) web link (abcnews.go.com)
 (TXT) w3m dump (abcnews.go.com)
        
       | akerl_ wrote:
       | This makes me think of how, as a child, every site asked "are you
       | over 13", and I diligently clicked "yes". Some more clever sites
       | asked for my birth year... forcing me to do the arduous work of
       | taking the current year and subtracting 14.
       | 
       | Though I suppose the real plan here is to pass the law and then
       | have the government selectively prosecute social media companies
       | for having users under 16.
        
         | ratg13 wrote:
         | Just a guess .. but I get the feeling the 'law' is more to get
         | this in the mind of the parents.
         | 
         | It gives parents tools and guidelines that can help them direct
         | their children.
         | 
         | Whether this is a good approach or not, is a whole other
         | argument.
        
           | Consultant32452 wrote:
           | It also gives the parents an excuse to limit their children.
           | 
           | "I'm not being mean, it's the law."
        
             | mrguyorama wrote:
             | Anyone who turned on a dome light in a car during the night
             | knows you don't need an actual law on the books to do that
        
           | akerl_ wrote:
           | The legislature is claiming out loud that the law is because
           | parents don't have the ability to do this on their own.
        
         | tnbp wrote:
         | Some more clever sites asked for my birth year... forcing me to
         | do the arduous work of taking the current year and subtracting
         | 14.
         | 
         | But why? You could have just picked a year that worked, and
          | stuck with it. Obviously, there's no way of telling _which_
         | year works, but you could have bruteforced that just once.
        
           | m2fkxy wrote:
           | probably the thrill of living on the edge.
        
             | shermantanktop wrote:
             | Arithmetic under stress is a helluva drug.
        
           | justinclift wrote:
           | > Obviously, there's no way of telling which year works
           | 
           | Um, it's simple maths. Guessing you're meaning something else
           | though?
        
         | Dalewyn wrote:
         | > have the government selectively prosecute social media
         | companies for having users under 16.
         | 
         | The US government is already legally mandated to prosecute
         | companies known to harbor information, collected online,
         | concerning minors less than 13 years old without consent from
         | their parents or legal guardians.[1]
         | 
         | It's why Youtube blocks comments and doesn't personalize ads on
         | videos published for kids, to pick out a prominent example.
         | 
         | [1]:
         | https://en.wikipedia.org/wiki/Children's_Online_Privacy_Prot...
         | 
         | Obligatory IANAL.
        
           | lupire wrote:
           | Laws are getting stricter. Around the world, there is
           | increasing regulatory requirement for businesses to actively
           | investigate user behavior (tracking!) to identify and exclude
           | underage users who are concealing their age.
        
         | idontwantthis wrote:
         | I think it's better to give the media companies a requirement
          | and let them figure it out rather than have the government
          | mandate how they do it.
        
           | akerl_ wrote:
           | What if we didn't give them a mandate and then they didn't
           | figure it out? The fix for social media is not "wait til 17
           | to expose kids to it".
        
         | AlecSchueler wrote:
         | I don't think enforcement actually needs to be very tightly
         | controlled. The barriers that are put in place like the one you
         | describe are already enough to create a social milieu where
          | parents and kids will think twice about these things and
         | understand that there is a recognised harm potential.
         | 
         | There's nothing stopping you pouring your youngsters a glass of
         | wine with dinner, but as a society we've made the dangers of
         | alcohol and similar things so well understood that no parent
         | wants to.
        
           | WarOnPrivacy wrote:
           | > as a society we've made the dangers of alcohol and similar
           | things so well understood that no parent wants to.
           | 
           | Unfortunately, as a society, we have a much harder time
           | grasping social media threat data. I suppose some of that is
           | due to how news orgs consistently+bizarrely+hugely
            | overstate the actual harms in the data.
           | 
           | https://www.techdirt.com/2024/01/08/leading-save-the-kids-
           | ad...
        
             | AlecSchueler wrote:
             | Oh totally! What I'm saying is that laws like this, even
             | when not enforceable per se, can help move society towards
             | that understanding.
        
         | riffic wrote:
         | it was always appropriate to lie and make up a fake age / date
         | of birth on those signup pages.
        
         | rtkwe wrote:
          | If the year was a scroll box, I was always about a flick and a
          | half old, which often ended up in the 60s, so they had some odd
          | age metrics.
        
         | ok123456 wrote:
         | A small transaction would cover 99% of cases (e.g., pay a
         | dollar that's immediately refunded). It would stop kids from
         | casually creating accounts. The kids who can do this are
         | already precocious enough to bypass any other verification
         | steps you could come up with.
         | 
         | Maybe if they use a profile pic that you algorithmically
         | determine is someone underage, you could do some additional
         | checks. The smart ones would learn not to utilize a profile pic
         | of themselves, which would ultimately be better.
        
           | rndmnflnce wrote:
           | You could even shave a fraction of a penny off with each
           | transaction and they would never notice!
        
         | wrsh07 wrote:
         | Lol I just always added 10 to my birth year
        
         | dotnet00 wrote:
         | Yeah, similarly I had just gotten used to entering an elder
         | sibling's birthday whenever asked. Adding these arbitrary age
         | restrictions does nothing but make it increasingly obvious to
         | kids how little our leaders and other supporters of these
         | arbitrary age restrictions actually care.
        
       | pfdietz wrote:
       | I can't see how this could pass 1st amendment scrutiny.
        
         | goda90 wrote:
         | How would it be different than age restricting voting, driving,
         | or alcohol and tobacco sales? It seems there's precedent for
         | treating minors differently than adults in many ways.
        
           | nickjj wrote:
           | > age restricting voting, driving, or alcohol and tobacco
           | sales
           | 
           | I think those things are a bit more black and white.
           | 
           | How do you define social media? Is HN social media? You
            | engage with others through comments, posts are ranked, it has
            | voting elements, and your profile has a gamified score for
            | upvotes. Is a Disqus comment on any website social
           | media if that's how we broadly define social media? Where do
           | we draw the line?
           | 
            | You could make a case that leaving reviews on a site is a
           | form of social media too. You can post something there and
           | feel like you need to check back in hopes someone leaves a
           | like or replies. If it were a wearable item you might take a
           | picture of yourself and now hope people engage with it.
        
             | bhpm wrote:
             | The social media platforms the bill would target include
             | any site that tracks user activity, allows children to
             | upload content or uses addictive features designed to cause
             | compulsive use.
        
               | mccrory wrote:
               | Then they can say goodbye to most online gaming. I wonder
               | how people will feel about this when they realize their
               | kids can't play games anymore?
        
               | Ekaros wrote:
                | Pretty good. Pretty good. I would go so far as to argue
                | that online games do more harm than good to children and
                | teenagers.
               | In many ways banning them from playing them might steer
               | them to more productive use of time.
        
               | serialNumber wrote:
               | Must children's use of time be "productive"? They have
               | their whole lives to be productive - outright banning
               | video games is not the solution in my eyes
        
               | mbreese wrote:
               | At the time, Fortnite and Minecraft (and more) were my
               | son's way to socialize with friends. That was how they
               | hung out (pre Covid). I can honestly say it would have
               | been detrimental to him from a mental health perspective
               | if those outlets didn't exist or if kids were blocked
               | from them.
               | 
               | Drawing the lines between what types of media are and
               | aren't allowed is a major issue with this type of law,
                | regardless of whether you think it's a good idea or not.
        
               | Dalewyn wrote:
               | According to another comment[1], video games are
               | exempted.
               | 
               | [1]: https://news.ycombinator.com/item?id=39176380
        
               | mccrory wrote:
               | This exempts almost everyone and everything. lol
        
               | nickjj wrote:
               | That covers a massive range.
               | 
               | "I hope someone likes my review" is not that much
               | different than "I hope someone likes my tweet" or "I hope
               | someone replies to my IG post" or "I hope someone replies
               | to my HN comment".
               | 
               | All 4 scenarios trigger the same thing which is setting
               | up a future expectation that's hopefully met while you
               | wait with anticipation of the event. Is that the process
               | they are trying to get a handle on?
               | 
               | Personally I don't know how any of that could get
               | enforced. Even making the internet read-only wouldn't
               | work because that wouldn't stop people from internally
               | comparing themselves to someone else who is allowed to
               | post. Although that type of thing has been going on since
               | advertising existed.
        
               | bhpm wrote:
               | What products are children reviewing?
        
             | zo1 wrote:
             | You could come up with corner cases and odd delineations
             | for those former examples too. Yet we all know the gist of
             | the laws and manage to somehow, on the aggregate, prevent
             | kids from engaging in said activity.
             | 
              | E.g. alcohol. How do we stop kids from drinking alcohol?
             | How do you define alcohol? What about kids medicine that
             | includes alcohol, what about medical procedures that insert
             | substances with alcohol content. What about shops that get
             | conned by kids with fake ID. What if the label gets
              | scratched off and the kid doesn't know it had alcohol. What
              | if a parent gives their child a sip of beer. What about old
             | home remedies that include whiskey. What about colic
              | medicine that has alcohol? Okay, what about carving
             | exceptions for babies, etc etc etc.
             | 
             | Granted some of those are contrived, but it's not as black
              | and white as you think.
        
             | internet101010 wrote:
             | I would define it as a platform where the primary focus is
              | for users to post visual content of themselves.
        
           | erellsworth wrote:
           | I think the difference would be in enforcement and what that
           | would imply. How do you verify someone's age online without
           | exposing that person to unwarranted tracking? Do we want to
           | just say social media anonymity is dead?
        
           | pfdietz wrote:
           | There's no constitutional right to drive, smoke, or drink
           | alcohol.
           | 
           | As for voting, it's constitutionally mandated that 18 or
           | older can vote, which implies there's no constitutional
           | mandate for anyone younger.
        
             | cooper_ganglia wrote:
             | Minors do not have full constitutional rights when it comes
              | to free speech. We've had SC precedent for this for 50+
             | years, thanks to Ginsberg v. New York.
             | 
             | >"In Ginsberg v. New York, 390 U.S. 629 (1968), the Supreme
             | Court upheld a harmful to minors, or "obscene as to
             | minors," law, affirming the illegality of giving persons
             | under 17 years of age access to expressions or depictions
             | of nudity and sexual content for "monetary consideration."
             | 
             | >Judge Fuld of the Nassau County District Court had
             | convicted Sam Ginsberg, who owned a small convenience store
             | in Long Island, New York, where he sold "adult magazines"
             | and was accused of selling them to a 16-year-old boy on two
             | occasions."
             | 
             | It's not good for kids to have unrestricted access to
             | pornography in the same way that it's not good for them to
             | have access to unrestricted social media.
        
           | summerlight wrote:
            | The legal restriction needs to be very specific and
            | demonstrate a good trade-off between societal gain and harm
            | to individual rights; it's not the clean cut that many folks
            | assume.
           | 
           | Alcohol and Tobacco are pretty straightforward in the public
           | health context. And you probably don't want your toddlers to
           | drive a car given the risk. The voting age is an exception as
            | it's defined in the 26th amendment. I consider it a political
            | compromise though.
        
         | dpkirchner wrote:
         | There's no downside to passing unconstitutional bills. They
         | just need to become law and remain in effect long enough to be
         | useful in electoral campaigns. When the courts strike the law
         | down, they'll be painted as activist justices and yadda yadda
         | we've seen all this before.
        
         | Ekaros wrote:
          | The 2nd amendment seems to have already been overturned.
          | Children cannot own, for example, fully automatic weapons. And
          | in some states, firearms in general.
        
           | _dark_matter_ wrote:
            | > The 2nd amendment seems to have already been overturned.
            | Children cannot own, for example, fully automatic weapons.
           | 
           | Just as the founding fathers intended, children brandishing
           | fully automatic firearms. James Madison is turning in his
           | grave!
        
           | lsiunsuex wrote:
           | The 2nd amendment is very much alive and kicking. Those of us
           | that would like to defend ourselves will fight for 2A.
           | 
            | To your point, children cannot own weapons. Most states have
            | a minimum of 18 years old, if not 21.
           | 
            | Fully automatic weapons have been banned from civilian
            | production for decades now. Semi-automatic weapons are not
            | weapons of war; they are not the death machines non-gun
            | owners think they are.
           | 
           | I was very anti gun until we (the wife and I) had a need to
           | defend ourselves. We were literally put in a self defense
           | position. We're both all in now and actively train ourselves
           | and educate ourselves. More so, friends that are anti gun are
           | no longer anti gun once they go to the range with me and are
           | educated on what a semi automatic weapon actually is.
           | 
            | TL;DR - anyone anti gun is simply uneducated in the matter.
        
         | oblio wrote:
         | I'm confused, what does this have to do with freedom of speech?
         | 
         | Also, do kids even <<have>> freedom of speech?
        
           | alistairSH wrote:
            | Freedom of speech has ~2 definitions in the US... first, the
            | legal definition (per the Constitution and related case law).
            | Second, the general notion of being literally free to speak
            | however/whenever one wants without intrusion from anybody at
            | all (government or otherwise).
           | 
           | In this case, we're sort of at the edge of the legal
           | definition. Social media can be viewed as the modern version
           | of the town square (can be, isn't 100% proven out in law
           | yet). If one takes that statement as valid, then the
           | government cannot regulate speech through social media
           | without a very good reason.
           | 
           | But, minors don't have the exact same rights as adults for
           | various reasons (guns, alcohol, privacy at school, etc).
           | 
           | My general impression (IANAL)... the government can likely
           | limit minors' speech on media platforms, however that limit
            | would have to be very specific and tightly defined so as not
            | to deny speech to adults. The devil is in the details
           | (implementation)... the legality probably hinges on what
           | method is required to verify age.
        
         | mrstumpy wrote:
          | Children don't have the same standing under the constitution as
         | adults. I don't remember the exact term but children have their
         | rights restricted all the time. They do not have free speech in
         | school for instance. They certainly have limits to their second
         | amendment freedoms. So I don't think this bill will have
         | constitutional issues. Personally, I'm all for a federal
         | restriction on addictive social media for kids.
        
           | pfdietz wrote:
           | So, a law preventing children from receiving religious
           | instruction would also be constitutional, in your mind?
        
             | etiennebausson wrote:
              | Spirituality (and religion) are personal choices. There is
             | no "religious education" for children, only indoctrination.
             | It may be legal, but it is in no way moral.
        
               | pfdietz wrote:
               | This is not an answer to the question I asked.
        
         | 2OEH8eoCRo0 wrote:
         | I can think of a few reasons. This doesn't prevent children
         | from criticizing the government. The 1st amendment doesn't
         | guarantee a platform. Children aren't legally the same as
         | adults.
        
         | insane_dreamer wrote:
          | I'm not sure the 1st amendment legally applies to minors since
          | they are legally under their parents' jurisdiction.
        
         | perihelions wrote:
         | Here's the current status of state cases, if anyone's curious:
         | 
         | - _" FIRE's lawsuit asks Utah to halt enforcement of the law
         | and declare it invalid. Other states -- including New Jersey
         | and Louisiana -- are proposing and enacting similar laws that
         | threaten Americans' fundamental rights. In Arkansas and Ohio,
         | courts have blocked enforcement of these laws."_
         | 
         | https://www.thefire.org/news/lawsuit-utahs-clumsy-attempt-ch...
         | ( _" Lawsuit: Utah's clumsy attempt to childproof social media
         | is an unconstitutional mess"_)
         | 
          | I.e. at least four states' legislatures have passed laws like
          | this; federal injunctions have paused two. Two others (Utah's,
          | Louisiana's) aren't in effect yet.
        
         | gnicholas wrote:
         | Minors don't have the same First Amendment rights as adults.
         | For example, government-run schools can have speech codes that
         | students must obey. Although I have not read this bill, the
         | general notion blurs the line between speech and action a bit,
         | which would make it easier to pass muster.
        
           | piperswe wrote:
           | My understanding is that minors _do_ have first amendment
           | rights, and school restrictions on speech can only apply to
           | instances which disrupt the learning environment.
        
             | gnicholas wrote:
             | Yes, they do have First Amendment rights, just not to the
             | same degree as adults. This is the same as other rights --
             | they can't buy guns, vote, etc. either. The point is that
             | it's definitely not a slam dunk case as claimed. I'd love
             | to hear a constitutional lawyer weigh in, if there are any
             | here. I used to be a lawyer, but this was never my
             | specialty.
        
               | shrimp_emoji wrote:
               | No way. They're essentially property.
               | 
               | If someone could extralegally ground me with no devices
               | if I say something they don't like (provided that it's
               | speech protected under the First Amendment -- since not
               | all speech is), I essentially have no First Amendment
               | right.
        
               | gnicholas wrote:
               | I'm not sure what you mean. Can you explain?
        
       | goda90 wrote:
       | I'm still on the fence about government doing a parent's job
       | here, especially for kids under 13, but I can't stand that no one
       | pushing these bills has come up with an actually reasonable age
       | verification method.
        
         | bamboozled wrote:
          | You could say the same about smoking, I guess?
         | 
          | For kids under 13 to see any of the content, ask them to enter
          | a credit card?
        
           | LeifCarrotson wrote:
           | Adults should not have to enter a credit card to, say, read
           | HN. But kids should. And therein lies the problem...
        
             | thsksbd wrote:
             | Why not? Honestly asking here.
             | 
              | Let's assume opposition to the law is a "progressive"
             | position:
             | 
             | If there is a constitutional right to absolutely 100%
             | friction free access to information then what happens to
             | all the barriers the government has erected to access
              | covid, Trump, Russia and other "disinformation" progressives
             | pushed for?
             | 
              | (You can invert this example for a right-wing position if
              | you want)
        
               | nickthegreek wrote:
               | Not everyone has a credit card. Some people cannot obtain
               | a credit card. People under the age of 18 can also have a
               | credit card. I do not trust random sites with my credit
               | card.
        
               | thsksbd wrote:
               | So these are objections in practice, not principle.
               | Important but consider:
               | 
               | - Most states give free IDs
               | 
                | - Your safety concern is addressed by other commenters
                | here (see the comments on verifying age anonymously)
        
               | Tadpole9181 wrote:
               | How about this: I don't want every little thing I do on
               | the internet tracked and tied to my real identity?
               | 
               | Is the CCP conducting a psyops on HN right now or
               | something? Since when were we all for every tiny
               | interaction you have on the internet requiring you to
               | look in the scanner and say "My name is X and I love my
               | government and McDonalds"?
        
               | CaptainFever wrote:
               | HN seems to have become a lot more mainstream, in
               | comparison to the old cyberlibertarian "privacy and
               | piracy" days.
        
               | Tadpole9181 wrote:
               | Where's the new hangout?
        
               | djfdat wrote:
               | Gut check was that the "Most states give free IDs"
               | statement wouldn't hold up. So I checked real quick.
               | 
               | "At least eight states issue free or discounted IDs to
               | low-income or homeless residents and at least 10 states
               | waive ID fees for seniors."
               | 
               | That's far from most states, and even then it comes with
               | stipulations.
        
               | nickthegreek wrote:
               | You changed the argument from credit card to government
                | ID (an even worse idea imo).
               | 
               | You seem to want the law to be that I need to show ID to
               | enter most internet establishments. I will never, ever,
               | ever hold that opinion.
        
               | frumper wrote:
               | Growing up I remember the trope about showing papers to
               | do anything in the USSR. It was in direct contrast to the
               | display of freedom in America.
        
               | jMyles wrote:
               | > If there is a constitutional right to absolutely 100%
               | friction free access to information then what happens to
               | all the barriers the government has erected to access
               | covid, Trump, Russia and other "disinformation"
                | progressives pushed for?
               | 
               | ...those barriers go away. They never really existed in
               | the first place in any real way. Like the Great Firewall,
               | they were a polite fiction defining what people are
               | allowed to know, but were trivially circumvented from
               | minute zero.
               | 
               | This is one of the most reliable and desirable features
               | of the internet in the first place.
        
         | Goronmon wrote:
          | _I'm still on the fence about government doing a parent's job
         | here, especially for kids under 13, but I can't stand that no
         | one pushing these bills has come up with an actually reasonable
         | age verification method._
         | 
         | How do you anonymously verify someone's age?
        
           | goda90 wrote:
           | Anonymous credentials. A central authority with verified age
           | information of each person grants credentials that verify the
           | age to third parties, but the authentication tokens used with
           | the third party can't be used by the third party nor the
           | central authority to identify anything else about the
           | credential holder.
        
             | bonton89 wrote:
             | This is technically possible but politically impossible.
             | Any system you make like this will get special government
              | peeking exceptions added, making it non-anonymous and
             | probably rank corruption from industry lobbying will add
             | some sort of user tracking for sale with data that is
             | poorly anonymized. Once the sham system is in place they'll
             | probably expand the requirement to other things.
        
             | thomastjeffery wrote:
             | Then there is a data breach, and _every person in the
             | country_ is de-anonymized.
             | 
             | No thank you.
        
               | tzs wrote:
               | A data breach where?
               | 
               | The central authority should be someplace that already
               | has your non-anonymous ID data, so using your ID for age
               | verification doesn't give them any new ID information.
               | The only new thing that them doing age verification adds
               | is that they might keep a list of verification tokens
               | they have issued.
               | 
               | Someone who obtained copies of the verification tokens
               | you requested might go to various social media sites and
               | ask them who used those tokens, allowing matching up your
               | social media identities with your real identity.
               | 
               | That's fixed by making it so the token that is given to
               | the social media site is not the token that came from
               | site that checked your ID. You give the social media site
               | a transformed token that you transform in such a way that
               | the social media site can recognize that it was made from
               | a legitimate token from the ID checker but does not match
               | anything on the list of tokens that the ID checker has
               | for you.
        
               | thomastjeffery wrote:
               | Would you care to elaborate?
        
               | tzs wrote:
               | Say you have a user U who wishes to demonstrate to site S
               | that they are at least 16, and we have a site G that
               | already has a copy of U's government ID information.
               | 
               | Here's one way to do it, with an important security
               | measure omitted for now for simplicity.
               | 
               | * S gives U a token.
               | 
               | * U gives G that token and shows G their ID.
               | 
               | * G verifies that U is at least 16, and then signs the
               | token with a key that they only use for "over 16" age
               | verifications. The signed token is given back to U.
               | 
               | * U gives the signed token back to S.
               | 
               | If G saves a list of tokens it signs and who it signed
               | them for, and S saves a list of tokens it issues and what
               | accounts it issued them for, then someone who gets both
               | of those lists could look for tokens that appear on both
               | in order to match up S accounts with real IDs.
               | 
               | To prevent that we have to make an adjustment. G has to
               | sign the token using a blind signature. A blind signature
                | is similar to a normal digital signature, except that the
               | signer does not see the thing they are signing. All they
               | see is an encrypted copy of the thing.
               | 
               | With that change a breach of G just reveals that you had
               | your age verified and gives the encrypted token
               | associated with that verification. These no longer match
               | what is in the records of the sites you proved your age
               | to since they only have the non-encrypted tokens.
               | 
               | Someone with both breaches might be able to match up
               | timestamps, so even though they can't match the tokens
               | from S directly with the encrypted tokens from G they
               | might note that you had your age verified at time T, and
               | so infer that you might be the owner of one of the S
               | accounts that had a token created before T and returned
               | after T.
               | 
               | This would be something people trying to stay anonymous
               | would have to be careful with. Don't go through the full
               | signup as fast as possible--wait a while before getting
               | the token signed, and wait a while before returning the
               | signed token. Then someone who is looking at a particular
               | anonymous S account will have a much larger list of items
               | in the G breach that have a consistent timestamp.
               | 
               | Also note that to G it is just being asked to sign opaque
               | blobs. Occasionally have G sign random blobs. If your G
               | data shows that you are getting your age verified a few
               | times a month, then it is even more likely that if one of
               | those verifies is at about the same time as a particular
               | social media signup it is just a coincidence.
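                | 
                | To make those four unblinded steps concrete, here's a toy
                | Python 3.8+ sketch. The primes, the hard-coded birth-year
                | check, and the function names are stand-ins of mine, not
                | part of any actual proposal; a real deployment would use a
                | vetted RSA library, far larger keys, and a real ID check.
                | 
                |   import secrets
                | 
                |   # G's "over 16" signing key (toy RSA sizes).
                |   P, Q = 999983, 1000003
                |   M = P * Q
                |   E = 65537
                |   D = pow(E, -1, (P - 1) * (Q - 1))
                | 
                |   def s_issue_token():
                |       # Step 1: S gives U a fresh random token.
                |       return secrets.randbelow(P - 2) + 2
                | 
                |   def g_sign_if_over_16(token, birth_year):
                |       # Steps 2-3: G checks U's ID, then signs the token
                |       # with the key it uses only for "over 16" checks.
                |       if 2024 - birth_year < 16:  # stand-in ID check
                |           raise PermissionError("not over 16")
                |       return pow(token, D, M)
                | 
                |   def s_verify(token, signature):
                |       # Step 4: S checks against G's public key (E, M).
                |       return pow(signature, E, M) == token
                | 
                |   t = s_issue_token()
                |   sig = g_sign_if_over_16(t, birth_year=2000)
                |   print(s_verify(t, sig))  # True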
        
               | hn_acker wrote:
               | > The central authority should be someplace that already
               | has your non-anonymous ID data, so using your ID for age
               | verification doesn't give them any new ID information.
               | The only new thing that them doing age verification adds
               | is that they might keep a list of verification tokens
               | they have issued.
               | 
               | But the central authority, a third party, will get a
               | heads-up every time someone - whether child or adult -
               | logs into the social media site. That's a privacy
               | violation. Even if the verification system were set up in
               | such a way that the third party wouldn't be able to know
               | which exact website I'm trying to visit, the third party
               | would be able to track how frequently I visit websites
               | that require age verification. With just this law, it
               | would be "you visited social media during X, Y, and Z
               | times." With extensions of this law to other kinds of
               | websites, it would be "you visited social media or porn
               | or violent video games or alcohol sites during X, Y, and
               | Z times", which obfuscates the kind of website I visit
               | but also makes the internet into something I have to whip
               | out an ID for just to use.
               | 
               | > That's fixed by making it so the token that is given to
               | the social media site is not the token that came from
               | site that checked your ID. You give the social media site
               | a transformed token that you transform in such a way that
               | the social media site can recognize that it was made from
               | a legitimate token from the ID checker but does not match
               | anything on the list of tokens that the ID checker has
               | for you.
               | 
               | Is it possible to transform the token such that the
               | social media site would be able to link it to your
               | identity but an attacker who gains access to the social
               | media site's data wouldn't? If so, I'd appreciate an
               | example of a transformation for such a purpose. But it
               | doesn't wipe out my privacy concern, that I - or anyone
               | else - wouldn't be able to log in to a social media site
               | without letting a third party know against my will.
        
               | kaibee wrote:
               | > But the central authority, a third-party, will get a
               | heads-up every time someone - whether child or adult -
               | logs into the social media site. That's a privacy
               | violation.
               | 
               | Why would you do age verification on login? It only needs
               | to happen once on account creation.
        
               | hn_acker wrote:
               | > Why would you do age verification on login? It only
               | needs to happen once on account creation.
               | 
               | Oops. That slipped my mind. For sites that require log-
               | in, my previous comment is wrong.
               | 
               | I had unconsciously assumed that at least one site would
               | implement the age verification system without requiring
               | users to make accounts to browse the site. In this
               | comment, I will make explicitly make that assumption. For
               | sites without log-in walls but with government-mandated
               | age verification, the concerns in my previous comment
               | would apply. But sites with log-in walls have their own
               | privacy problems independent of age verification, chief
               | being that having to log in means letting the first party
               | site track how often I use the site. A different problem
               | (not necessarily privacy-related, but can be) of log-in
               | walls is that I would be forced to create accounts. If I
               | don't wish to deal with the burden of making accounts,
               | then I won't browse the website. If the website made a
               | log-in wall in response to an age verification mandate
               | from a government, then my First Amendment right to
               | access the speech the website wished to provide will have
               | been chilled.
        
               | bigfishrunning wrote:
               | > But the central authority, a third party, will get a
               | heads-up every time someone - whether child or adult -
               | logs into the social media site.
               | 
                | Why? I imagine this could be a "they've signed my key"
                | situation, no requests needing to go up the tree further
                | than necessary...
        
               | jlokier wrote:
               | > But the central authority, a third party, will get a
               | heads-up every time someone - whether child or adult -
               | logs into the social media site. That's a privacy
               | violation. Even if the verification system were set up in
               | such a way that the third party wouldn't be able to know
               | which exact website I'm trying to visit, the third party
               | would be able to track how frequently I visit websites
               | that require age verification.
               | 
               | It doesn't have to work like this.
               | 
               | It's technically possible to do verification such that
               | the authority (probably the government which already has
               | a database with your age), doesn't get _any_
                | communication when verification takes place. They'd have
               | no idea which sites you visit or join, or how often.
               | 
               | And the site which receives the verification token
                | doesn't learn anything about you other than that your age
                | is enough. They don't even learn your age or birthday.
                | They couldn't tell the government about you even if
               | subpoenaed.
               | 
               | (But if you tell them on your birthday that you are now
               | old enough, having been unable to the day before, they'll
               | be able to guess of course so it's not perfect in that
               | way.)
               | 
               | Using modern cryptography, you don't send the authority-
               | issued ID to anyone, as that would reveal too much.
               | Instead, on your own device you generate unique,
               | encrypted proofs that say you possess an ID meeting the
               | age requirement. You generate these as often as you like
               | for different sites, and they cannot be correlated among
               | sites. These are called zero-knowledge proofs.
               | 
               | They work for other things than age too. For example, to
               | show you are an approved investor, or have had specific
               | healthcare or chemical safety training, or possess a
               | certain amount of credit without revealing how much, or
                | are a citizen with voting rights, or are a shareholder with
               | voting rights, without revealing anything else about who
               | you are.
        
               | hn_acker wrote:
               | Do you mean that I can get a permanent age-verification
               | key from the third-party authority, then never have to
               | contact the authority again (unless I want a new key)? If
               | so (and assuming that zero knowledge proofs, which I'm
               | not very familiar with, work), then my privacy concerns
               | are resolved. (Well, I don't want the authority to keep a
               | copy of my verification key, but FOSS code and client-
               | side key generation should be feasible.)
        
               | tzs wrote:
               | An example of the kind of token transformation I'm
               | thinking of follows.
               | 
               | Assume RSA signatures from the site that looks at your ID
               | having public key (e,m) where e is the exponent and m is
                | the modulus, and private key d. The signature s of a blob
               | of data, b, that you give them is b^d mod m.
               | 
               | To verify the signature one computes s^e mod m and checks
               | if that matches b.
               | 
               | Here's the transformation. You generate a random r from 1
               | to m-1 such that r is relatively prime to m. Compute r'
               | such that r r' = 1 mod m.
               | 
               | Instead of sending b to be signed, send b r^e mod m.
               | 
               | The signature s of b r^e is (b r^e)^d mod m = b^d r mod
               | m.
               | 
               | You take that signature and multiply by r'. That gives
               | you b^d mod m. Note that this is the signature you would
               | have gotten had you sent them b to sign instead of b r^e.
               | 
               | Net result: you've obtained the signature of b, but the
               | signing site never saw b. They just saw b r^e mod m.
               | 
               | That gives them no useful information about b, due to r
               | being a random number that you picked (assuming you used
               | a good random number generator!).
               | 
               | For any possible b, as long as it is relatively prime to
               | m, there is some r that would result in b r^e having the
               | same signature as your b, so the signing site has no way
               | to tell which is really yours.
               | 
               | b is unlikely to not be relatively prime to m. If m is
               | the product of two large primes, as is common, b is
               | relatively prime to it unless one of those primes divides
               | b. We can ensure that b is relatively prime to m by
               | simply limiting b to be smaller than either of the prime
               | factors of m. Since those factors are likely to each be
               | over a thousand bits this is not hard. In practice b
               | would probably be something like a random 128 bits.
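                | 
                | Here's that arithmetic as a quick Python 3.8+ sketch. The
                | toy primes are mine, and the variable names (b, r, e, d, m)
                | follow the description above; real keys would be far larger
                | and you'd use a vetted cryptographic library rather than
                | raw RSA like this.
                | 
                |   import secrets
                |   from math import gcd
                | 
                |   p, q = 999983, 1000003  # toy primes
                |   m = p * q
                |   e = 65537
                |   d = pow(e, -1, (p - 1) * (q - 1))
                | 
                |   b = secrets.randbelow(p - 2) + 2  # blob, kept below p
                | 
                |   # Blind: pick r coprime to m; send b * r^e, not b.
                |   while True:
                |       r = secrets.randbelow(m - 2) + 2
                |       if gcd(r, m) == 1:
                |           break
                |   # 'blinded' is all the signing site ever sees.
                |   blinded = (b * pow(r, e, m)) % m
                | 
                |   # Sign: (b * r^e)^d = b^d * r (mod m).
                |   blind_sig = pow(blinded, d, m)
                | 
                |   # Unblind: multiply by r' where r * r' = 1 (mod m).
                |   r_inv = pow(r, -1, m)
                |   sig = (blind_sig * r_inv) % m  # equals b^d mod m
                | 
                |   # Verify: sig^e mod m should give back b.
                |   assert pow(sig, e, m) == b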
        
           | ip26 wrote:
           | Third party, perhaps. You sign up for service A. Service A
           | queries service B, which knows who you are and provides a
           | one-time ack of your age.
        
             | tnbp wrote:
             | That sounds nightmarish. I don't want the verifier to know
             | what porn sites I visit. Someone else proposed the
             | following system to me: a third party authority issues a
             | certificate which I can then use to prove I'm 18. The CA
             | cannot see where I use the certificate, though.
        
               | Tadpole9181 wrote:
               | Er... The CA needs to be used to verify the certificate
               | by the third party, ergo it will know the websites.
               | 
               | It's virtually impossible to make a verification system
               | that's anonymous. _Somewhere_ the third party and
               | authenticator will need to share a secret that you cannot
               | touch.
               | 
               | Furthermore, you would need the government to agree to
               | this system and mandate this system universally and pay
               | for the authentication services to exist. That's not what
               | Florida is doing.
        
         | zo1 wrote:
         | While people are on the fence about it, our children are having
         | their youth, innocence and brains destroyed by tiktok et al.
         | Those platforms are cancer to adults even, let alone
         | impressionable kids... yet here we are still debating it and
         | faffing around about "1st amendment yaddi yadda".
        
           | Clubber wrote:
           | >children are having their youth, innocence and brains
           | destroyed by tiktok
           | 
           | For one, ease up on the hyperbole if you want to be taken
           | seriously. I'll give you the benefit of the doubt because the
           | news is nothing but hyperbole these days, so it's easy to
           | pick up the habit. Second, most kids aren't having "their
           | youth, innocence and brains destroyed." The news takes the
           | edge cases, amplifies them and presents it as the norm to
            | peddle fear because fear sells. Nothing is ever as bad as the
           | news makes it out to be, but they gotta make a dollar, you
           | see how bad the news business is since the internet?
           | 
           | FWIW, my kid uses social media and just connects with her
           | friends. Nothing overly malicious goes on, they just goof
           | off. I've checked.
           | 
           | You really wanna protect the kids from anxiety and whatnot,
           | block the news and all the talking heads trying to manipulate
           | the next generation to their political opinions.
        
             | zo1 wrote:
             | I exaggerate for a reason, because I feel strongly about it
             | and think that it does do those things in a literal sense.
             | 
             | Besides, I've consumed tiktok and seen the content. It is
             | nowhere near appropriate for children, it's night and day.
        
             | light_hue_1 wrote:
             | Don't minimize what's going on.
             | 
             | There's a massive rise in depression in young children. The
             | teen suicide rate has almost doubled.
             | 
             | The idea that you know what's going on, on their social
             | media is pretty funny. Certainly what every adult always
             | assumed about me. And now that I have kids, I can see how
             | easily other kids fool their parents all the time.
             | 
             | And what's overly malicious? It may be social media itself
             | without anything bad driving this. Merely seeing a
             | sanitized version of people's lives over and over again,
              | without anyone bullying you, leads to depression
             | because your life isn't as good.
             | 
             | No. Don't block the news. Because then you miss important
             | things like this.
             | https://www.hhs.gov/surgeongeneral/priorities/youth-
             | mental-h...
        
               | Clubber wrote:
               | I think if you wanted to reduce teen suicide a
               | significant amount, banning social media isn't going to
               | do it. It certainly isn't responsible for half. Of course
               | banning it doesn't cost the government any money, so it's
               | top of the list as opposed to any real solutions.
               | 
               | You also cherry picked your stats. If you open a larger
               | window, the current teen suicide rate is not as abnormal
               | as you are making it out to be.
               | 
               | https://www.cdc.gov/mmwr/volumes/66/wr/mm6630a6.htm
               | 
               | Perhaps all the fear mongering is having an effect on
               | their mental state, no? Teens don't yet realize that most
               | of the world is phony bullshit.
        
               | medvezhenok wrote:
               | Umm, that chart is 10 years out of date - it ends in
               | 2015; the beginning of the social media era.
               | 
               | The current teen suicide rate is ~62 / 100K, which is
               | just about double (or triple!) the last value in that
               | chart. And is also an anomaly over the last 40 years.
        
             | ditto664 wrote:
             | > For one, ease up on the hyperbole if you want to be taken
             | seriously.
             | 
             | I disagree that this is hyperbole. It's a huge problem
             | among kids. Literacy rates are dropping. Listen to the
             | stories you hear from teachers.
        
               | mrguyorama wrote:
               | Teachers have ALWAYS made those screams. My mother, a
                | French teacher, always complained that before she could
                | teach kids French, she had to teach them how to read a
               | clock, how to do math, and how the days of the week work
               | (these were fifth graders mind you). She blamed education
               | policy, but this is nothing more than what happens when
               | 30% of your students are in poverty.
               | 
               | The reality is that some percentage of students will
               | always fall through the cracks, and the human brain loves
               | to blame whatever is "new" for problems that are "new"
               | _to you_. This has been a problem for teachers since at
               | least the No Child Left Behind policy, and even goes as
               | far back as Socrates bemoaning his students being
                | terrible because books meant they didn't have to have
               | perfect memories.
               | 
               | Students suffered because covid was both a huge
               | disruption to their education, and parents freaked out
               | instead of trying to handle it (and plenty of people
               | literally could not handle it anyway). It doesn't help
               | that half the country openly cries that education is
               | nothing more than liberal indoctrination, and openly
                | downplays the value of even basic education, like the
               | three Rs, and claims that anything higher than a high
               | school education is also liberal indoctrination, is
               | "woke", and is valueless.
               | 
               | I 100% hate tiktok, but I don't think it is (currently)
               | being used to mentally attack the US. Maybe someday if we
               | are ever at war with China, but right now they are
               | content believing that inclusivity is toxic on it's own.
               | I don't think tiktok changes people's brain
                | significantly. I do think it is an extremely low-value way
               | to spend time, and that it is addictive, two serious
               | issues when taken together, but then again I spent my
               | life watching several hours of TV a day. I especially
               | don't like how tiktok seems to purposely direct new male
               | users to what is basically softcore porn.
        
             | dang wrote:
             | > _For one, ease up on the hyperbole if you want to be
             | taken seriously._
             | 
             | That is the kind of swipe the HN guidelines ask people to
             | edit out of their comments
             | (https://news.ycombinator.com/newsguidelines.html: " _Edit
             | out swipes._ "), so please don't post like this.
             | 
             | Your comment would be fine without that sentence and the
             | one after it.
             | 
             | (I'm not saying the GP comment is particularly good--it was
             | pretty fulminatey--but it wasn't quite over the line,
             | whereas yours was, primarily because yours got personal.)
        
         | insane_dreamer wrote:
         | > government doing a parent's job
         | 
         | the govt already set the bar at 13, so what's different about
         | setting it at 16?
        
         | mbitsnbites wrote:
         | > government doing a parent's job
         | 
         | The problem here is that it's pretty much out of the hands of
         | the parents. If your kids' friends have social media, your kids
         | will absolutely need it too in order to not be left out. I've
         | witnessed the pressure, and it's not pretty. Add to that the
         | expectation from society that children shall have access to
         | social media.
         | 
         | Regulation is pretty much the only way to send the right
         | signals to parents, schools, media companies (e.g. Swedish
         | public service TV has a kids app that until recently was called
         | "Bolibompa Baby", but it's now renamed to "Bolibompa Mini"),
         | app designers, and so on.
        
         | safog wrote:
         | We're barreling towards an internet that requires an id before
         | you can use it.
         | 
         | It's a bit upsetting but I don't harbor the early 2000s naivete
         | about the free internet where regulation doesn't exist, the
         | data exchange happens over open formats and connecting people
         | from across the world is viewed as an absolute positive.
         | 
         | Govt meddling on social media platforms, the filter bubble,
         | platforms locking data in, teenage depression stats post
         | Instagram, doom scrolling on tiktok have flipped me the other
         | way.
         | 
         | Internet Anonymity is going to die - let's see if that makes
         | this place any better.
        
           | Tadpole9181 wrote:
           | > Govt meddling on social media platforms
           | 
           | And the government having unfettered knowledge of every site
           | you visit - in particular the more salacious ones - is how we
           | solve that? Surely that won't be used as a political cudgel
           | to secure power at any point, nor will it ever be used to
           | target specific demographics or accidentally get leaked.
        
         | soared wrote:
         | https://developers.google.com/privacy-sandbox/protections/pr...
         | 
         | Private State Tokens enable trust in a user's authenticity to
         | be conveyed from one context to another, to help sites combat
         | fraud and distinguish bots from real humans--without passive
         | tracking.
         | 
         | An issuer website can issue tokens to the web browser of a user
         | who shows that they're trustworthy, for example through
         | continued account usage, by completing a transaction, or by
         | getting an acceptable reCAPTCHA score. A redeemer website can
         | confirm that a user is not fake by checking if they have tokens
         | from an issuer the redeemer trusts, and then redeeming tokens
         | as necessary. Private State Tokens are encrypted, so it isn't
         | possible to identify an individual or connect trusted and
         | untrusted instances to discover user identity.
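          | 
          | Roughly, the trust hand-off looks like this (a toy Python
          | sketch with a shared-key MAC standing in for the real
          | protocol, which uses blinded tokens mediated by the browser
          | so issuance and redemption can't be linked):
          | 
          |     import hashlib, hmac, secrets
          | 
          |     ISSUER_KEY = secrets.token_bytes(32)  # held by the issuer
          | 
          |     def issue_token():
          |         # Issuer: the user passed some trust check (account
          |         # history, a transaction, a good reCAPTCHA score...).
          |         nonce = secrets.token_bytes(16)
          |         tag = hmac.new(ISSUER_KEY, nonce,
          |                        hashlib.sha256).digest()
          |         return nonce, tag  # stored by the user's browser
          | 
          |     def redeem_token(nonce, tag):
          |         # Redeemer: checks with the issuer that the token is
          |         # genuine, learning nothing else about the user.
          |         good = hmac.new(ISSUER_KEY, nonce,
          |                         hashlib.sha256).digest()
          |         return hmac.compare_digest(good, tag)
          | 
          |     nonce, tag = issue_token()
          |     assert redeem_token(nonce, tag)  # "trusted elsewhere"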
        
           | Tadpole9181 wrote:
           | This system clearly and trivially deanonymizes the internet.
           | Even worse than a centralized system, it uses a simple "just
           | trust me bro" mentality that issuers would _never_ injure
           | users for personal gain and would _never_ keep logs or have
           | data leaks, which would expose the Internet traffic of a real
           | person.
        
         | KaiserPro wrote:
         | > I'm still on the fence about government doing a parent's job
         | here
         | 
          | The issue is: how does a parent who is not very technical
          | _safely_ audit their child's social media use?
         | 
          | I am reasonably confident that I could control my kid's social
          | media habit, but only up to a point. There isn't anything
          | really stopping them from getting their own cheap phone or
          | signing in on another person's machine.
         | 
          | The problem is that safely stopping kids from getting access
          | requires strong authentication all the way down to the ISP,
          | i.e. to get an IP address you need 2FA to sign in. But that's
          | also how censorship and de-anonymisation happen.
        
       | bedhead wrote:
       | Not sure how I feel about this, but I don't hate it,
       | theoretically. I'm sure it's hopeless in practice. It might be a
       | worthwhile experiment if nothing else. But a piece of legislation
       | will never be an adequate substitute for good parenting.
        
         | stronglikedan wrote:
         | I'd like to see more science proving that social media is bad
         | for kids before I see lawmakers enacting laws around the
         | theory. That said, I think 16 is too high an age. Kids go into
         | high school at ~13, so I think that would have been a more
         | reasonable age to consider. The hard truth is that most kids
         | don't seem to have access to "good parenting" by any reasonable
            | standard; parenting is hard, and good parenting is harder
            | (not an excuse).
        
           | brightball wrote:
           | I'd imagine that 16 is the age when most kids finally get a
           | state ID.
        
           | dmm wrote:
           | > I'd like to see more science proving that social media is
           | bad for kids before I see lawmakers enacting laws around the
           | theory
           | 
           | Is there science proving that porn is harmful for children?
           | If not, do you see that as an argument for legalizing it?[0]
           | 
           | The first restrictions on tobacco purchases for children were
           | in the 1880s, long before the science was settled on the
           | harms tobacco use causes.
           | 
           | Science is slow, contentious, and produces limited
           | results[1]. If we can only ban things that are scientifically
           | proven to be harmful, what is stopping TikTok from slightly
           | modifying their app and rereleasing it?
           | 
            | I can easily buy products that are BPA-free, but that just
            | means clever chemists have come up with even worse
            | plasticizers like BPS to use as substitutes.
           | 
           | And software can be adapted way faster than chemistry.
           | 
           | [0] I don't. [1] It's definitely still worth studying.
        
           | mbitsnbites wrote:
           | > I'd like to see more science proving that social media is
           | bad for kids
           | 
           | Smoking tobacco was allowed for children for hundreds of
           | years before it was regulated. Today we can hardly fathom how
           | stupid it would be to allow children to smoke. Social media
           | is much more accessible than cigarettes, far more addictive,
           | and rather than messing with your lungs it messes with your
           | brain and personality.
           | 
           | There are many problems with "waiting for the science". One
           | is that it takes many, many decades to get reliable
           | longitudinal studies on things like addiction and how the
           | brain is affected.
           | 
           | There are so many indications that social media is bad for
           | children (including scientific studies), in many different
           | ways. It really should not be controversial to limit use for
           | children. It's not like it is something that they need.
           | 
           | This book, "Smartphone brain: how a brain out of sync with
           | the times can make us stressed, depressed and anxious" by
           | Swedish MD psychiatrist Anders Hansen, brings up one of the
           | aspects (unfortunately I don't know of an English
           | translation): https://www.amazon.se/Sk%C3%A4rmhj%C3%A4rnan-
           | hj%C3%A4rna-str...
           | 
           | Here's a short video with Anders Hansen:
           | https://youtu.be/DwAx2kRwCRI?t=112
           | 
            | Then there is of course the question of _how_ to limit the
            | use. But that's another issue.
        
         | M4v3R wrote:
         | Definitely not as a substitute, but it might be something that
         | helps push more parents to consider preventing their children
          | from using social media. It's much easier for a parent to
          | explain to the child that they're not allowed to use it
          | because it's illegal than to try to explain how social media
          | negatively affects their brains.
        
       | lsiunsuex wrote:
       | As a Floridian and someone in IT - I'm curious how this will be
       | implemented
       | 
       | I can't remember the last time I signed up for a new social
       | network; do they ask age? Is it an ask to Apple / Google to add
       | stronger parental approval? Verify drivers license #?
       | 
        | We heard about this days ago on local news, and I've been
        | struggling to figure out how this is going to get done, short
        | of an "are you 16 years or older?" prompt, and how you fine
        | someone if it's breached.
        
         | brightball wrote:
         | My guess is that it will be most easily enforced in school.
         | After school is another story entirely.
        
         | thsksbd wrote:
         | (Putting aside if the law is good or bad and the
         | constitutionality of it.)
         | 
          | Put criminal penalties on the directors if there is no
          | reasonable attempt to keep kids out.
         | 
         | Plus corporate death penalty if they purposely target kids.
         | 
         | Then how they enforce it doesn't really matter as long as there
         | are periodic investigations. The personal risks are too great
         | and the companies will figure it out.
        
           | djfdat wrote:
           | Excessive punishment with arbitrary enforcement? What could
           | go wrong?
        
             | a_victorp wrote:
             | You forgot selective punishment as well
        
           | throwaway-blaze wrote:
           | The FTC already implements a "corporate death penalty" in the
           | form of massive fines if an organization collects data on
           | kids and uses it to target advertising (see COPPA)
        
         | alistairSH wrote:
         | Yes, some (all?) ask for DOB/age at sign-up.
         | 
         | If I remember correctly, at one time Google even tried to
         | enforce it and there were usability problems with typos and
         | wrong dates and things - there was no verification and no easy
          | way to fix an error. I.e., if a mid-40s adult accidentally
          | entered 1/1/2024, they'd be locked out. And if a kid entered
          | 1/1/1977, they'd have an account (but no way to correct that
          | date when they eventually turned 18).
        
         | AlecSchueler wrote:
         | Yes, they always ask your date of birth and generally won't
         | allow sign ups for under 13s, it's been that way for almost 20
         | years.
        
           | internet101010 wrote:
           | Yep. Twitch automatically bans any chatter who says that they
           | are under that age.
        
         | WarOnPrivacy wrote:
         | > I'm curious how this will be implemented
         | 
         | The only way to determine age is to compile a database of gov-
         | issued IDs and related data. Which is an unconstitutional
          | barrier to speech. Which is why this will get struck down like
         | each similar law.
         | 
         | The part about ID data eventually being shared with 3rd
         | parties, agencies - and/or leaked - is a bonus.
        
           | tzs wrote:
           | It sounds like you are envisioning age verification that
           | involves just two parties: the user and the site that they
           | need to prove their age to. The user shows the site their
           | government issued ID and the site uses the information on the
           | ID to verify the age.
           | 
           | That would indeed allow the site to compile a database of
            | government-issued IDs and give that information (willfully or
           | via leaks) to third parties.
           | 
           | Those issues can be fixed by using a three party system. The
           | parties are the user, the site that they need to prove their
           | age to, and a site that already has the information from the
           | user's government ID.
           | 
            | Briefly, the user gets a token from the social media site,
            | presents that token and their government ID to the site that
            | already has their ID information, and that site signs the
            | token if the user meets the age requirement. The user then
            | presents the signed token back to the social network; the
            | signature tells the social network that the ID-checking site
            | vouches for the user meeting the age requirement.
           | 
           | By using modern cryptographic techniques (blind signatures or
           | zero knowledge proofs) the communication between the user and
           | the third site can be done in a way that keeps the third site
           | from getting any information about which site they are doing
           | the age check for.
           | 
           | With some additional safeguards in the protocol and in what
           | sites are allowed to be the ID checking sites it can even be
           | made so that someone who gets records of both the social
           | media site and the third site can't use timing information to
           | match up social media accounts with verifications and so
           | could work with sites that allow anonymous accounts.
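            | 
            | A toy sketch of that flow using textbook RSA blind signatures
            | (Python; the key size, the claimed_age check, and the helper
            | names are illustrative only - a real system would use a
            | vetted library and proper keys):
            | 
            |     import hashlib, secrets
            |     from math import gcd
            | 
            |     # ID-checking site: holds an RSA signing key.
            |     # Toy primes; never use sizes like this in practice.
            |     p, q = 1000003, 1000033
            |     n, e = p * q, 65537
            |     d = pow(e, -1, (p - 1) * (q - 1))
            | 
            |     def id_site_sign_blinded(blinded, claimed_age):
            |         # Signs the blinded token only after checking the
            |         # user's government ID meets the age bar.
            |         if claimed_age < 16:
            |             raise PermissionError("under 16")
            |         return pow(blinded, d, n)
            | 
            |     # User: blinds the token issued by the social media
            |     # site, so the ID site never learns which token (or
            |     # which site) it is signing for.
            |     token = secrets.token_bytes(8)
            |     digest = hashlib.sha256(token).digest()
            |     h = int.from_bytes(digest, "big") % n
            |     while True:
            |         r = secrets.randbelow(n - 2) + 2
            |         if gcd(r, n) == 1:
            |             break
            |     blinded = (h * pow(r, e, n)) % n
            |     s_blind = id_site_sign_blinded(blinded, 17)
            |     s = (s_blind * pow(r, -1, n)) % n
            | 
            |     # Social media site: a valid signature over its token
            |     # means "a trusted ID checker attested the age" and
            |     # nothing more.
            |     assert pow(s, e, n) == h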
        
             | bilsbie wrote:
             | I don't think anyone in government is smart enough to
             | enable or allow this.
        
               | throwaway-blaze wrote:
               | That is literally how the age verification for porn works
               | in Louisiana and Virginia among other states.
        
             | hn_acker wrote:
             | > With some additional safeguards in the protocol and in
             | what sites are allowed to be the ID checking sites it can
             | even be made so that someone who gets records of both the
             | social media site and the third site can't use timing
             | information to match up social media accounts with
             | verifications and so could work with sites that allow
             | anonymous accounts.
             | 
             | I'm assuming that there will be some kind of way to prevent
             | matching of logged IP addresses between the social media
             | site and the verification site. Is there really a method
             | for preventing matches of timing without requiring the user
             | to bear the burden of requesting tokens from the sites at
             | different times?
             | 
             | As I hinted at in a different comment [1] though, there
             | remains a tradeoff of letting the verification party know
             | how frequently I visit a single type of website vs.
             | avoiding the first problem but needing my ID for multiple
             | types of websites i.e. more of the internet.
             | 
             | [1] https://news.ycombinator.com/item?id=39180203
        
             | WarOnPrivacy wrote:
             | > It sounds like you are envisioning age verification that
             | involves just two parties: the user and the site that they
             | need to prove their age to. ... Those issues can be fixed
             | by using a three party system.
             | 
             | Okay. That sounds promising.
             | 
              | However, the method of collecting children's private data
             | isn't what makes these laws unconstitutional. It's a
             | government erecting broad, restrictive barriers to speech.
             | 
             | ref: https://reason.com/2023/09/19/federal-judge-blocks-
             | californi...
             | 
             | ref: https://www.theverge.com/2023/8/31/23854369/texas-
             | porn-age-v...
             | 
             | ref: https://www.techdirt.com/2023/09/13/you-cant-wish-
             | away-the-1...
             | 
             | ref: http://mediashift.org/2009/01/u-s-supreme-court-
             | finally-kill...
             | 
             | ref: https://netchoice.org/district-court-halts-
             | unconstitutional-...
             | 
              | Utah caught a glimpse of reality and stayed their own
              | unconstitutional law. They seem to be looking for a way to
             | retool it so it won't be quite so trivial to strike down.
             | 
             | ref: https://kslnewsradio.com/2073740/utahs-social-media-
             | child-pr...
        
       | newsclues wrote:
       | Government needs modern digital first ID/authentication services.
        
         | alistairSH wrote:
         | Probably, but that's a tough nut to crack. A significant chunk
         | of the American public is insanely anti-federal ID.
        
           | newsclues wrote:
            | It doesn't need to be federal; because it's digital, it can
            | be distributed and portable instead of centralized.
           | 
           | If it respects privacy, is secure and convenient and
           | optional, people would love it.
        
             | alistairSH wrote:
             | That's true, however I'm not sure I trust "Big Tech" to
             | self-manage this system. After all, "Big Tech" are the ones
             | that forced us into needing this legislation in the first
             | place. And they don't have a great track record at
             | protecting PII.
             | 
             | The federal government already has all the info it needs to
             | run an ID program.
             | 
             | But, I'd be open to debate on who should do it.
        
         | mindslight wrote:
         | First, we need privacy regulation (eg a US port of the GDPR)
         | that stops the existing widespread abuses of identification and
         | personal data, especially the abuses being facilitated by the
         | current identification systems. Only after this is fixed does
         | it make sense to talk about increasing the technical strength
         | of identification.
        
       | tzs wrote:
       | > The social media platforms the bill would target include any
       | site that tracks user activity, allows children to upload content
       | or uses addictive features designed to cause compulsive use.
       | 
        | That does not appear to be correct: the article's summary reads
        | as if any of those 3 conditions triggers it, but the bill text
        | says all of the conditions must hold (and there are 5, not 3).
        | Here is what the current text says "social media platform"
        | means:
       | 
       | < Means an online forum, website, or application offered by an
       | entity that does all of the following:
       | 
        | < a. Allows the social media platform to track the activity of
        | the account holder.
       | 
       | < b. Allows an account holder to upload content or view the
       | content or activity of other account holders.
       | 
       | < c. Allows an account holder to interact with or track other
       | account holders.
       | 
       | < d. Utilizes addictive, harmful, or deceptive design features,
       | or any other feature that is designed to cause an account holder
       | to have an excessive or compulsive need to use or engage with the
       | social media platform.
       | 
       | < e. Allows the utilization of information derived from the
       | social media platform's tracking of the activity of an account
       | holder to control or target at least part of the content offered
       | to the account holder.
       | 
       | There's also a huge list of exceptions. It says that it:
       | 
       | < Does not include an online service, website, or application
       | where the predominant or exclusive function is:
       | 
       | < a. Electronic mail.
       | 
       | < b. Direct messaging consisting of text, photos, or videos that
       | are sent between devices by electronic means where messages are
       | shared between the sender and the recipient only, visible to the
       | sender and the recipient, and are not posted publicly.
       | 
       | < c. A streaming service that provides only licensed media in a
       | continuous flow from the service, website, or application to the
       | end user and does not obtain a license to the media from a user
       | or account holder by agreement to its terms of service.
       | 
       | < d. News, sports, entertainment, or other content that is
       | preselected by the provider and not user generated, and any chat,
       | comment, or interactive functionality that is provided incidental
       | to, directly related to, or dependent upon provision of the
       | content.
       | 
       | < e. Online shopping or e-commerce, if the interaction with other
       | users or account holders is generally limited to the ability to
       | upload a post and comment on reviews or display lists or
       | collections of goods for sale or wish lists, or other functions
       | that are focused on online shopping or e-commerce rather than
       | interaction between users or account holders.
       | 
       | < f. Interactive gaming, virtual gaming, or an online service,
       | that allows the creation and uploading of content for the purpose
       | of interactive gaming, edutainment, or associated entertainment,
       | and the communication related to that content.
       | 
       | < g. Photo editing that has an associated photo hosting service,
       | if the interaction with other users or account holders is
       | generally limited to liking or commenting.
       | 
       | < h. A professional creative network for showcasing and
       | discovering artistic content, if the content is required to be
       | non-pornographic.
       | 
       | < i. Single-purpose community groups for public safety if the
       | interaction with other users or account holders is generally
       | limited to that single purpose and the community group has
       | guidelines or policies against illegal content.
       | 
       | < j. To provide career development opportunities, including
       | professional networking, job skills, learning certifications, and
       | job posting and application services.
       | 
       | < k. Business to business software.
       | 
       | < l. A teleconferencing or videoconferencing service that allows
       | reception and transmission of audio and video signals for real
       | time communication.
       | 
       | < m. Shared document collaboration.
       | 
       | < n. Cloud computing services, which may include cloud storage
       | and shared document collaboration.
       | 
       | < o. To provide access to or interacting with data visualization
       | platforms, libraries, or hubs.
       | 
       | < p. To permit comments on a digital news website, if the news
       | content is posted only by the provider of the digital news
       | website.
       | 
       | < q. To provide or obtain technical support for a platform,
       | product, or service.
       | 
       | < r. Academic, scholarly, or genealogical research where the
       | majority of the content that is posted or created is posted or
       | created by the provider of the online service, website, or
       | application and the ability to chat, comment, or interact with
       | other users is directly related to the provider's content.
       | 
       | < s. A classified ad service that only permits the sale of goods
       | and prohibits the solicitation of personal services or that is
       | used by and under the direction of an educational entity,
       | including:
       | 
       | < (I) A learning management system;
       | 
       | < (II) A student engagement program; and
       | 
       | < (III) A subject or skill-specific program.
       | 
       | I hope they add 8 more exceptions. I want to see what they do
       | when they run out of letters for labeling the exceptions.
        
         | insane_dreamer wrote:
         | a lot of loopholes there
        
       | causal wrote:
       | I might be the only one here in favor of this, and wanting to see
       | a federal rollout.
       | 
       | It is not reasonable to expect parents to spontaneously agree on
       | a strategy for keeping kids off social media- and that kind of
       | coordination is what it would take, because the kids + social
       | media companies have more than enough time to coordinate
       | workarounds. Have the law put the social media companies on the
        | parents' side, or these kids may never be given the chance to
       | develop into healthy adults themselves.
        
         | andy99 wrote:
         | I'm in favor of kids not using social media, but not of the
         | government forcing this on people nor spinning up whatever
         | absurd regulatory regime is required. And the chance of
         | actually enforcing it is zero anyway. It's no more realistic to
         | expect this to work than to expect all parents to do it as you
         | say. It's just wasted money plus personal intrusion that won't
         | achieve anything.
        
           | soco wrote:
           | Is there an alternative? Self-control - as we have now -
            | brought us here. If the government shouldn't step in, then
            | the only other option left (that I can see) is magic. And we
            | have
           | a bad record with magic.
        
           | JoshTko wrote:
            | I used to want no govt intrusion for this. Then I understood
            | that there are teams of PhDs tweaking feeds to maximize
            | addiction at each social network. I think there could even
            | be limits, or some sort of tax on gen pop.
        
           | matthewdgreen wrote:
           | Several governments have already effectively banned sites
           | like Pornhub by creating regimes where people have to mail
           | their ID to a central clearinghouse (which creates a huge
           | chilling effect.) The article talks about "reasonable age
           | verification measures" and so saying it's unenforceable seems
           | a little bit premature. Also, you can bet those measures
           | won't be in any way reasonable once the Florida legislature
           | gets through with them.
        
             | rschneid wrote:
             | >effectively banned
             | 
             | In my opinion, these governments haven't implemented
             | 'effective' bans (though maybe chilling, as you say) but
             | primarily created awkward new grey markets for the personal
             | data that these policies rely on for theatrics. Remember
             | when China 'banned' youth IDs from playing online games
             | past 10PM? I think a bunch of grandparents became gamers
             | around the same time...
             | 
             | https://www.nature.com/articles/s41562-023-01669-8
        
               | seanw444 wrote:
               | Which is exactly what happens for markets that are
               | desirable enough. We compare bans of things not enough
               | people care about, to bans of things that people are
               | willing to do crazy things for. They don't yield the same
               | results.
        
               | monkeynotes wrote:
               | No policy is 100% effective. Kids still get into alcohol,
               | but the policy is sound.
        
               | rft wrote:
               | Another example is some Korean games requiring
               | essentially a Korean ID to play. A few years ago there
               | was a game my guild was hyped about and we played the
               | Korean version a bit. You more or less bought the
               | identity of some Korean guy via a service that was
               | explicitly setup for this. Worked surprisingly well and
               | was pretty streamlined.
        
               | angra_mainyu wrote:
               | At least from personal experience, when there was a
               | period where my ISP in the UK started requiring ID
               | verification for porn, I literally ceased to watch it.
               | 
               | Making something difficult to do actually works to _curb_
               | behavior.
        
             | 55555 wrote:
             | You're proving the argument that the parent set forth.
             | Anyone who wants to visit Pornhub can just visit one of the
             | many sites that isn't abiding by the new law. However,
             | that's not due to a lack of legislation, but rather a lack
             | of enforcement, or, perhaps, enforceability. If laws always
             | worked I'd be for more of them. My argument is not that we
             | should never make laws because it's futile but rather that
              | some laws are more futile than others; having laws go
              | unenforced weakens government, and enforcing them
              | inequitably is unjust.
        
               | monkeynotes wrote:
               | Also social policy enforcement is a generational thing.
                | The UK is only just moving toward outright banning
                | cigarettes by making it illegal for anyone born after X
                | date to ever buy them. Eventually you have a whole
               | generation that isn't exposed to smoking and on the whole
               | thinks the habit is disgusting, which it is.
        
               | verall wrote:
               | Except that some people born after that date will still
               | acquire them, get addicted, and then what? Prosecute them
               | like drug possession?
               | 
                | It's infantilizing and dumb. Grown adults should be
                | allowed to smoke tobacco if they so wish, and smoking
                | rates are already way down due to marketing and
                | alternatives. No one needs to be prosecuted.
        
               | zhivota wrote:
               | You don't need to prosecute any buyers at all though. All
               | you need to do is make it illegal to sell in shops, and
               | illegal to import. There will be a black market, sure,
               | but how many people are going to go through the trouble
               | and expense to source black market tobacco? Not that
               | many. And everyone benefits because universal healthcare
               | means everyone shares the cost of the health effects that
               | are avoided.
        
               | monkeynotes wrote:
                | Should mention the govt has to find budget to fill the
                | gap left by tobacco tax revenue, but they've been slowly
                | doing this as demand has slumped since 2008.
        
               | badpun wrote:
               | > how many people are going to go through the trouble and
               | expense to source black market tobacco? Not that many.
               | 
               | Just see how many people already go through that trouble
               | to source illegal drugs...
        
               | monkeynotes wrote:
               | I think it's hyperbolic to look at tobacco like other
               | drugs. Tobacco is a lifestyle thing, it doesn't get you
               | high, it's a cultural habit. There are only upsides to
               | getting rid of the social demand for it.
               | 
               | If you think taking tobacco away from consumers is
               | infantilizing, why yes, yes it is. We are dealing with
                | children's futures. Adults get to continue smoking, while
                | children become less likely to even want to smoke as
                | social acceptance goes down. Nicotine doesn't do much
                | other than get you addicted; no one is chasing a
                | pronounced high with it. People start smoking because
                | it's perceived as cool.
               | 
               | I can't imagine an adult wanting to start smoking, most
               | adults get addicted in their teens.
               | 
                | I think you can have an import ban, and a black market,
                | and still see significant gains in eroding the demand. I
                | do not think people should be prosecuted for possession,
                | and the UK will probably make some bad decisions there,
                | but that doesn't mean the overall policy is bad.
        
             | Taylor_OD wrote:
             | Does this actually work or does it just push those same
             | people to sketchier websites?
        
               | angra_mainyu wrote:
               | Worked in my case when my ISP required ID for it in the
               | UK.
               | 
               | I just noped out of it entirely.
        
             | __turbobrew__ wrote:
             | My main issue is that the only effective way to ban access
             | to a website is to also ban VPNs and any sort of network
             | tunneling. A great firewall would have to be constructed
              | which I am very much against. Even China's firewall can be
              | bypassed, and it is questionable whether one would be worth
              | operating given the massive costs that would be incurred.
             | 
             | I think the government should invest in giving parents the
             | tools to control their child's online access. Tools such as
             | DNS blocklists, open source traffic filtering software
             | which parents could set up in their home, etc.
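              | 
              | For the blocklist idea, the core lookup a home filter might
              | do is tiny (Python sketch; the domains are made up):
              | 
              |     BLOCKLIST = {"socialapp.example", "ads.example"}
              | 
              |     def is_blocked(host):
              |         # Block a host if it, or any parent domain of
              |         # it, is on the parent-maintained list.
              |         parts = host.lower().rstrip(".").split(".")
              |         return any(".".join(parts[i:]) in BLOCKLIST
              |                    for i in range(len(parts)))
              | 
              |     assert is_blocked("www.socialapp.example")
              |     assert not is_blocked("homework.example")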
        
           | OJFord wrote:
            | We don't even have to speculate - isn't it already the case
            | for <13yo? Or is that just Europe? Anyway - yeah, of course
           | they're still on it. Expect less compliance and harder
           | enforcement the older they are, not more/easier.
        
             | samtho wrote:
              | The only protection in the US is technically against
              | "collection of personal information" via COPPA[0], which
              | you can argue
             | would kneecap social media. Any parent can provide consent
             | for their child, however. Children themselves can also just
             | click the button that says they are over 13 if it gets them
             | what they want.
             | 
             | [0]: https://www.ftc.gov/legal-
             | library/browse/rules/childrens-onl...
        
           | loceng wrote:
            | So maybe make it a law - but require parents/guardians to
            | enforce it, not the state. Does that sound like a reasonable
            | in-between?
        
           | yew wrote:
           | There are degrees of enforceability.
           | 
           | When I was first getting online, the expectation was that you
           | at least had to be bright enough to lie about your age. Now I
           | have to occasionally prune my timeline after it fills up with
           | "literally a minor." Even an annoyance tax might have some
           | positive effect. Scare the pastel death-threats back into
           | their hole...
        
           | arcticbull wrote:
           | It seems like you are in favor of something that requires
           | coordination, but don't believe in coordination. Is there a
           | different way you think this could be achieved?
        
             | kelseyfrog wrote:
              | I don't believe in the disbelief of coordination, so it
              | seems like we're at a bit of an impasse. Please expound.
        
           | ajhurliman wrote:
           | We have a ban on gambling for minors, so if you see social
           | media as more harmful than gambling (personally, I do) it
           | probably makes sense.
        
             | monkeynotes wrote:
              | Gambling is outright banned in a majority of regions in
              | the US, not just for kids. I don't think they are equally
              | bad,
             | just different bad. Gambling is addictive, and it destroys
             | people. Social media is addictive and socially toxic, on
             | the whole it erodes the very fabric of a society.
        
               | ribosometronome wrote:
               | >Social media is addictive and socially toxic, on the
               | whole it erodes the very fabric of a society.
               | 
               | It's interesting how every generation seems to decry new
               | forms of media as eroding the fabric of society. They
               | said it about video games, television, music, movies,
               | etc. I'm sure we're right this time, though.
        
               | monkeynotes wrote:
                | Yeah, none of those are comparable. TV never led to
                | children bullying each other anonymously, which then
                | leads to kids committing suicide.
        
               | bee_rider wrote:
               | Television clearly did _something_ pretty significant to
                | society. Cable news makes the country more polarized;
                | that we didn't do anything about it doesn't mean it
                | wasn't a problem. We just missed, and now the problem is
                | baked into the status quo.
               | 
               | Video games are typically fiction, so the ability to pass
               | propaganda through them is usually a little more limited.
               | It isn't impossible, just different.
               | 
               | Social media is a pretty bad news source for contentious
               | issues. We should be pretty alarmed that people are
               | getting their news there.
        
               | angra_mainyu wrote:
               | Apples-to-oranges comparison.
               | 
               | Plenty has been written already on the ravaging effects
               | of social media on society and it's pretty plain to see.
               | 
               | https://www.mcleanhospital.org/essential/it-or-not-
               | social-me...
               | 
               | > Rates of sexual activity have been in decline for
               | years, but the drop is most pronounced for adults under
               | age 25 (Gen Z).
               | 
               | > For Gen Z, a rise in sexlessness has coincided with a
               | decline in mental health.
               | 
               | > Sexual activity can boost mood and relieve stress and
               | may serve as a protective factor against anxiety and
               | depressive disorders.
               | 
               | https://www.psychologytoday.com/intl/blog/the-myths-
               | sex/2022... missing-out-the-benefits-sex
        
               | lumb63 wrote:
               | ... we've _been_ right. Television, music, video games,
               | movies, etc., have decimated social capital and
               | communication in the western world. It is not uncommon to
               | "hang out" with people who are your "friends" without
               | ever interacting with them in any meaningful way because
               | everyone is focused on, e.g., a television. That's
               | assuming everyone isn't too busy playing video games to
               | leave their houses.
               | 
               | Whether you agree or not that they're "eroding the very
               | fabric of a society" (I would argue they are), it should
               | be acknowledged that almost all the downsides predicted
               | have come to pass, and life has gone on not because these
               | things didn't happen, but in spite of them having
               | happened.
        
               | viraptor wrote:
               | People are interacting through online games quite often
               | though. You can't throw them all in one basket. You can
               | make friends / longer term connections that way (I did)
               | or keep in touch with people you don't live near to.
        
               | badpun wrote:
                | That's not community, though. You're not inhabiting the
                | same space, sharing the same problems, and trying to work
                | them out together. You're not helping each other out if
                | someone falls into trouble.
        
               | mynameisash wrote:
               | It seems that all data on the subject of social media
               | point emphatically to, Yes! It's terrible for
               | adolescents[0] (and probably society writ large?).
               | 
               | [0] https://jonathanhaidt.com/social-media/
        
               | bcrosby95 wrote:
               | Social media isn't new. It's over 20 years old by now. If
               | you count things like forums you're looking at 30+ years
               | old.
        
               | rchaud wrote:
               | Not anymore. A 2018 Supreme Court decision opened the
               | floodgates to legalized sports gambling, much of it
               | online. The only states with existing bans are Hawaii and
               | Utah, which combined have only 5 million residents.
               | 
               | https://en.m.wikipedia.org/wiki/Murphy_v._National_Colleg
               | iat...
        
               | monkeynotes wrote:
               | Oof.
        
               | alexb_ wrote:
               | This is just blatantly not true, anyone who has tried to
               | gamble on sports can tell you that companies go to quite
               | incredible lengths to make sure that nobody outside of
               | the few jurisdictions where they're legal can gamble
               | online. I live in Nebraska, and I have to go travel
               | across to Iowa before I can do anything.
        
               | ajhurliman wrote:
               | True, it's state by state, but a lot of states allow it.
               | And even if you're not in the right state you can always
               | go to sites from other countries (e.g. betonline.ag)
        
             | WindyLakeReturn wrote:
              | The ban we have on gambling seems weak. From trading card
              | games to loot boxes to those arcade games that look to be
              | skill-based but are entirely up to chance, children are
              | allowed to do all of them. The rules feel so inconsistent
              | that they appear arbitrary.
        
               | ajhurliman wrote:
                | Agreed, the gaming industry has done its damnedest to
                | undermine the restrictions on gambling.
        
           | incomingpain wrote:
           | I'm in favor of kids not using porn, but not of the
           | government forcing this on people nor spinning up whatever
           | absurd regulatory regime is required. And the chance of
           | actually enforcing it is zero anyway. It's no more realistic
           | to expect this to work than to expect all parents to do it as
           | you say. It's just wasted money plus personal intrusion that
           | won't achieve anything.
           | 
           | I only changed 2 words for 1.
        
           | ribosometronome wrote:
           | Social media's existence is predicated on their algorithms
           | being good at profiling you. Facebook's already got some
           | level of ID verification for names, where they'll
           | occasionally require people to submit IDs. No reason that
           | similar couldn't be applied to age if society agreed it was
           | worthwhile.
        
           | dylan604 wrote:
            | There is a societal problem that goes beyond just parenting.
            | The peer pressure on kids who feel left out and ostracized
            | because they are the only ones _not_ on the socials is
            | something a teen is definitely going to rebel against their
            | parents over. It's part of being a teen. I'm guessing the
            | other parents would even put pressure on the parents denying
            | the social access.
            | 
            | To me, the only way out of this is exchanging one nightmare
            | for another and giving the gov't the decision of
            | allowing/denying access. Human nature is not a simple thing
            | to regulate, since the desire for that regulating is part of
            | human nature.
        
             | diputsmonro wrote:
             | Is talking to other people online really so bad that we
             | need the government to step in and tell us who we can and
             | can't talk to? How quickly will that power expand to what
             | we can and can't talk about?
             | 
             | I agree that neither solution is perfect, but exchanging an
             | imperfect but undoubtedly free system of communication for
             | one that is explicitly state-controlled censorship is an
             | obvious step backwards.
             | 
             | "Thinking about the children" should also involve thinking
             | about what kind of a society you want to build for them. A
             | cage is not the answer, especially not with fascism
             | creeping back into our politics.
        
               | dylan604 wrote:
               | I think you are willingly playing this down as "talking
               | to people online" to make some point. However, it is
               | beyond what one kid online says to another online. It is
               | what predators say to those kids online. I don't just
                | mean Chester and his panel van. I'm talking about anyone
                | who is attempting to manipulate that kid, regardless of
                | the motive; they are all predators.
               | 
                | Social media has long since passed just being a means of
               | communicating with each other, and you come across as
               | very disingenuous for putting this out there.
        
               | diputsmonro wrote:
               | I think people are being incredibly disingenuous when
               | they imagine that the government won't abuse this power
               | to censor and harm marginalized communities. Many states
               | are trying to remove LGBT books from school libraries for
               | being "pornographic" right now, for example. All it takes
               | is some fancy interpretations of "safety" and "social
               | media" for it to become a full internet blackout, for
               | fear of "the liberals" trying to "trans" their kids.
               | 
               | I don't deny that kids can get into trouble and find
               | shocking or dangerous things online. But kids can also
               | get in trouble walking down the street. We should not
               | close streets or require ID checks for walking down them.
               | Parents should teach their kids how to be safe online,
               | set up network blocks for particularly bad sites, and
               | have some kind of oversight for what their kids are
               | doing.
               | 
               | Maybe these bills should mandate that sites have the
               | ability to create "kid" accounts whose history can be
               | checked and access to certain features can be managed by
               | an associated "parent" account. Give parents better tools
               | for keeping their kids safe, don't just give the
               | government control over all internet traffic.
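                | 
                | To make that concrete, a minimal sketch of such a linked
                | parent/kid account model (Python; the fields and feature
                | names are hypothetical, not any platform's real API):
                | 
                |     from dataclasses import dataclass, field
                | 
                |     @dataclass
                |     class ParentAccount:
                |         email: str
                |         # features the parent has opted the kid into
                |         allowed: set = field(default_factory=set)
                | 
                |     @dataclass
                |     class KidAccount:
                |         username: str
                |         parent: ParentAccount
                |         history: list = field(default_factory=list)
                | 
                |         def visit(self, url):
                |             self.history.append(url)  # parent-reviewable
                | 
                |         def can_use(self, feature):
                |             return feature in self.parent.allowed
                | 
                |     mom = ParentAccount(email="guardian@example.com")
                |     kid = KidAccount(username="kid123", parent=mom)
                |     kid.visit("videos/123")
                |     print(kid.can_use("public_posting"))  # False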
        
           | reaperducer wrote:
           | _I 'm in favor of kids not using social media, but not of the
           | government forcing this on people nor spinning up whatever
           | absurd regulatory regime is required._
           | 
           | People said the same thing about age restrictions for
           | smoking, alcohol, movies, and on and on and on.
           | 
           | It's not some unsolvable new problem just because it's the
           | ad-tech industry.
        
         | WarOnPrivacy wrote:
         | You can be in favor but in the US it is unconstitutional for a
         | gov to broadly restrict speech. It's why each of these age
         | verification + social media laws eventually get tossed.
         | Legislators know this (or are too dumb to) but it's not their
         | own dollars that are getting burned during this vote-baiting
         | performance.
        
           | bilsbie wrote:
            | The FCC seems to get around that pretty easily by not
            | allowing nipples on TV.
        
             | jcranmer wrote:
              | Only on broadcast TV, and the decision fundamentally relies
              | on the fact that RF spectrum is a finite resource to
              | justify the restriction.
             | 
             | SCOTUS has routinely struck down prohibitions against the
             | same things in other media, including explicitly the
             | internet.
        
           | carabiner wrote:
           | Deprecating the Constitution is long overdue.
        
           | spogbiper wrote:
           | honest question, is an american child granted the right of
           | free speech? or do you get that right at a certain age?
        
         | brightball wrote:
         | I'm normally anti-regulation as well, but as a parent I'm fully
         | on board with this. The amount of peer pressure to be on social
         | media is insane.
        
           | WarOnPrivacy wrote:
            | Are you asserting that parents should not have the right to
           | determine this for their own children?
        
             | brightball wrote:
             | Parents don't have the right to get their kids a tattoo,
             | vote or buy alcohol before a regulated age. Is this
             | different?
        
               | WarOnPrivacy wrote:
               | So your answer is yes?
               | 
               | Your position is that decisions about youth access to
               | social media should be fully taken from the parents and
               | made by govs instead. Penalties can be assumed from your
               | examples.
               | 
               | This is the reality that you want imposed on parents and
               | children - yes?
        
               | brightball wrote:
               | Is your position that no age limits should exist for
               | anything?
        
               | samus wrote:
               | This is already the reality for alcohol and plenty of
               | other things. Maybe not everywhere. Reality check:
               | parents giving unrestricted access to these things are
               | usually perceived as irresponsible.
        
               | somenameforme wrote:
               | There's actually a really simple and elegant penalty -
               | forfeiture of the device used to access the social media.
               | With all seized devices to then be wiped and donated to
               | low income school districts/families. This gets more
               | complex when using something like a school computer, but
               | I think it's a pretty nice solution outside of that.
               | That's going to be a tremendous deterrent, yet also not
               | particularly draconian.
        
               | runsWphotons wrote:
               | Yep thats what low income families need: more cell
               | phones.
        
               | foobarian wrote:
               | Now they too can get on social media!
        
               | 6yyyyyy wrote:
               | Actually, in many states there is no minimum age to get a
               | tattoo with parental consent.
               | 
               | https://en.wikipedia.org/wiki/Legal_status_of_tattooing_i
               | n_t...
        
               | brightball wrote:
               | That is...shocking
        
               | lupire wrote:
               | Likewise for cosmetic surgical modification of the penis
               | by a doctor, or puncturing the earlobe by a non-medical-
               | professional.
        
               | 20after4 wrote:
               | Missouri law allows minors to consume alcohol if
               | purchased by a parent or legal guardian and consumed on
               | their private property.
               | 
               | edit: Apparently Missouri is not the only state. I had
               | trouble finding a definitive list though. There are also
               | other exceptions such as wine during religious service.
        
               | WindyLakeReturn wrote:
                | While there are exceptions, and in general exceptions
                | seem pretty common, the rules still require businesses to
                | officially get approval, and they give parents the power
                | to enforce rules that would otherwise be hard to enforce.
                | Even with the exceptions that let children legally drink,
                | I would be surprised if they led to more kids drinking
                | than alcohol obtained illegally, which means the question
                | comes back to how well the law works (obviously not
                | perfectly, but there is a large gap between perfect and
                | so poor that it is useless purely from an efficiency
                | perspective).
        
             | giarc wrote:
             | Are you a parent? It's not as easy as saying "no social
             | media" to your kids. In this day and age, it's basically
             | equivalent to saying "you can't have friends". Online is
              | where kids meet, hang out, converse, etc. I'd LOVE to go
              | back to the days before phones and social media, when kids
              | played with neighbours and rode their bikes to their
              | friends' houses, but that's slowly slipping away.
        
               | foobarian wrote:
               | We try pretty hard to get our kids to play with their
               | friends in person (we invite them or give rides to
               | playdates) but what do they do when they meet up? Sit on
               | the couch with their tablets and play virtually in Roblox
               | :-)
        
               | amalcon wrote:
               | When my friends would meet up in the 80's-90's, quite a
               | bit of Nintendo happened. Is it really that different?
               | The proportion of video games should eventually drop (not
               | to zero), in favor of (if you're lucky) talking and
               | whatever music bothers you the most.
        
               | eitally wrote:
               | You organize playdates for middle & high schoolers?
        
             | somenameforme wrote:
             | I'm almost invariably anti-regulation, but in this case -
             | absolutely!
             | 
             | There's extensive evidence that social media is
             | exceptionally harmful to people, especially children. And
             | in this case there's also a major network effect. People
             | want to be on social media because their friends are. It's
             | like peer pressure, but bumped up by orders of magnitude,
             | and in a completely socially acceptable way. When it's
             | illegal that pressure will still be there because of course
             | plenty of kids will still use it, but it'll be greatly
             | mitigated and more like normal vices such as
             | alcohol/smoking/drugs/etc. It'll also shove usage out of
             | sight which will again help to reduce the peer pressure
             | effects.
             | 
             | This will also motivate the creation/spread/usage of means
             | of organizing/chatting/etc outside of social media. This
             | just seems like a massive win-win scenario where basically
             | nothing of value is being lost.
        
             | amerkhalid wrote:
             | Yes, parents should not have unlimited rights to determine
             | what is good/bad for their children.
             | 
              | Social media is powerful, addictive, and dangerous. Pretty
              | much anywhere on this earth, parents will end up in jail or
              | lose custody of their kids if they give them harmful
              | substances like drugs. Social media should be regulated the
              | way drugs, alcohol, and cigarettes are regulated.
        
               | 20after4 wrote:
               | Because drug regulations are so effective with no
               | collateral damage at all. /s
        
             | lupire wrote:
              | Parents can make accounts to use in collaboration with
              | their children.
             | 
             | The law prevents the corporation from directly engaging
             | with the child without parental oversight.
        
             | ryandvm wrote:
             | Do you also think that children should be allowed to buy
             | cigarettes? I'll be honest, I am not certain that social
             | media is any less deleterious than tobacco.
             | 
             | I'm a pretty pro-market guy, but there are times when the
             | interests of the market are orthogonal to the interests of
             | mankind.
        
               | tstrimple wrote:
               | I can pretty confidently say that the half a million
               | deaths a year attributable to smoking is a little more
               | deleterious than getting bullied online and the suicides
               | which follow. Many orders of magnitude more.
        
               | nkohari wrote:
               | Just because one thing is worse than the other doesn't
               | mean the less-bad thing is suddenly good.
        
               | tstrimple wrote:
               | The original statement was:
               | 
               | > "Do you also think that children should be allowed to
               | buy cigarettes? I'll be honest, I am not certain that
               | social media is any less deleterious than tobacco."
               | 
               | To which I pointed out that cigarettes kill far more
               | people than social media. And your response was somehow
               | that I'm implying that a less-bad thing is good? Are you
               | sure you're following the conversation? It's really not
               | clear that you're addressing anything I said, and it's
               | unclear what your point is.
        
           | 1shooner wrote:
           | Because regulation worked so well in eliminating peer
           | pressure for drinking, smoking, and drugs.
        
             | scythe wrote:
             | Drinking, smoking and drugs don't depend on a central point
             | of control. Social media companies of any significance can
             | be counted on two hands, and are accountable to corporate
             | boards.
        
             | ceejayoz wrote:
             | Regulation worked _remarkably_ well on smoking.
        
               | anshumankmr wrote:
               | How so?
        
               | ceejayoz wrote:
               | A massive, decades-long decline in the habit? Stemming
               | from ad restrictions, warning labels, media campaigns,
               | taxation, legal action by states/Feds, etc.
        
               | akerl_ wrote:
               | This must be why there's not 50 vape shops in my town
               | with big neon signs.
        
               | ceejayoz wrote:
               | Don't get me wrong, I'd love to see vaping turn into a
               | prescription-only smoking cessation aid, but it's not
                | _smoking_. I'm 100% happy with even a one-for-one
               | replacement of smoking for vaping, even in kids, given
               | the dramatically lower risk of resulting health problems.
        
               | LynxInLA wrote:
               | Have you not seen people vaping at a young age now?
        
               | ceejayoz wrote:
               | Vaping isn't the same thing as smoking.
        
               | lbhdc wrote:
               | They are consuming nicotine, and tobacco companies are
               | invested in / are the companies producing products in
               | that space. It seems functionally to be the same.
        
               | ceejayoz wrote:
               | Only if you ignore... a lot. Cancer rates, smoking
               | sections in restaurants, the smell, the yellow grime and
               | used butts sprinkled everywhere, the impact on
               | asthmatics... Smoking a cigarette gets you a lot more
               | than just the nicotine.
               | 
               | A smoker moving to vaping is an _enormous_ benefit to
               | health and society.
        
               | lbhdc wrote:
               | That sounds like you are being disingenuous. Smoking
               | sections haven't been a thing in the US for a long time
               | (I went to the last one I could find around 2009). Waste
               | from single use vapes is also a huge problem. Similarly
               | there are health effects specific to vaping, time will
               | tell if cancer is among them.
        
               | ceejayoz wrote:
               | > Smoking sections haven't been a thing in the US for a
               | long time...
               | 
               | Yes, the regulations that made this happen are good.
               | That's my point.
               | 
               | > Waste from single use vapes is also a huge problem.
               | 
               | Nothing like the cigarette butts that used to be
               | everywhere.
               | 
               | > Similarly there are health effects specific to vaping,
               | time will tell if cancer is among them.
               | 
               | We've plenty of data to safely conclude vaping is safer
                | than smoking tobacco. That doesn't make it _safe_, but
                | it's absolutely safe_r_.
        
               | lbhdc wrote:
               | You aren't exactly disproving my point. It seems like
               | vaping is close enough to smoking to say that it is
               | functionally close enough to be equated.
               | 
               | But you have raised my curiosity about your relationship
               | with vaping. Do you work in the industry?
        
               | ceejayoz wrote:
               | > It seems like vaping is close enough to smoking to say
               | that it is functionally close enough to be equated.
               | 
               | Bullshit. Both are nicotine delivery methods. One is far
               | better for both individual and societal reasons. Water
               | and whiskey are both wet, but that doesn't make them the
               | _same_.
               | 
               | > But you have raised my curiosity about your
               | relationship with vaping. Do you work in the industry?
               | 
               | No, nor do I vape/smoke. I'm just old enough to remember
               | how shitty it was to have smokers everywhere, in a way
               | that isn't the case for vapers... and I've seen the
               | multi-decade decline in lung cancer incidence stats.
        
               | sneak wrote:
               | No, they are not remotely the same. Nicotine isn't really
               | that harmful, but combustion byproducts very much are.
               | Also the effects on bystanders are orders of magnitude
               | better.
               | 
               | Most of the harm from smoking comes from the smoke, not
               | the nicotine or associated addiction.
        
               | huytersd wrote:
               | Nicotine by itself is harmless besides the addictiveness.
               | A nicotine addiction is not going to drastically affect
               | your mental state or cause socially disruptive behavior
               | like domestic violence or armed robbery so it's really
               | nothing to be concerned about.
        
               | LynxInLA wrote:
               | I agree it is different, but the jury is out on whether
               | it is better. Banning "social media" is likely to push
               | users to a "lite" version of it. I'm not convinced that
               | will be better.
        
               | bigfishrunning wrote:
               | I would call IRC "social media lite", and it is indeed
               | better.
        
               | huytersd wrote:
                | Vaping non-flavored vapes has a basically negligible
                | impact on your health.
        
               | brodouevencode wrote:
               | It's more influence than actual regulation enforcement.
               | Smokers these days are seen as social pariahs in some
               | circles.
        
               | ScoutOrgo wrote:
                | You also had to go into a store to buy cigarettes, so the
               | application of regulation would work a little differently
               | in the case of social media.
        
             | DiggyJohnson wrote:
             | It absolutely did for smoking and drunk driving.
        
             | CJefferson wrote:
             | I'm fairly sure more 13 year olds are on social media, than
             | are drinking, smoking, or on drugs.
        
             | huytersd wrote:
              | Absolutely. Almost no kids smoke cigarettes (vaping
              | non-flavored varieties carries almost no risk associated
              | with it) and drunk driving is a shadow of what it used to be.
             | Getting alcohol for someone under 21 is not child's play
             | either.
        
         | tedajax wrote:
         | Answering a problem with unenforceable garbage like this
         | doesn't seem like a very sound strategy.
        
           | soco wrote:
            | This discussion can be seen every time the EU decides on
            | some regulation against the tech industry. A lot of people
            | will jump in saying it won't be enforced; then when we see
            | the first fines, those people will jump in saying it won't
            | move the needle; then when the tech giants do change their
            | course a bit... well, the tech bros will always find a
            | reason to jump against doing anything to curb tech.
        
             | lupire wrote:
             | This is not a "tech bro" thing. It's not particular to tech
             | nor bros. This a business thing. Phillip Morris weren't
             | "tech bros".
        
               | soco wrote:
                | You are right. I had the HN crowd in mind when I
                | commented.
        
           | brodouevencode wrote:
            | Imperfect enforcement is a foregone conclusion, and when it
            | comes to things like this it's somewhat expected. The same
            | can be said for pornography, drugs, alcohol, and tobacco
            | (remember Joe Camel?), and anything else that would fall
            | under blue laws.
           | 
           | The goal of this is to bring attention to the fact that it's
           | a problem and should be seen as undesirable, like pornography
           | or Joe Camel. The cancellation of Joe didn't prevent kids
           | from getting cigarettes but it did draw attention to the
           | situation and there has been a marked decline in youth
           | smoking since the late 90s when the mascot was removed. It's
           | correlative, for sure, but the outcomes are undeniable. The
            | same happened with the DARE program and Schedule I drugs
            | (except for marijuana, iirc).
        
             | 20after4 wrote:
              | Multiple studies have determined that the DARE program is
             | entirely or almost entirely ineffective.
             | 
             | Here's just one:
             | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1448384/
        
           | o11c wrote:
           | It doesn't have to be perfectly enforceable to have a
           | positive effect.
           | 
           | Even just making illegal the _promotion_ of social media
           | toward children would have a huge effect.
        
         | wonderwonder wrote:
         | I find myself agreeing with this. Me 15 years ago would have
         | raged at this. I have kids now and they are pressured to join
         | all sorts of social media platforms. I still don't allow them
         | to have it but I know they take a slight social hit for it.
         | 
          | There is zero upside to giving kids the ability to access
          | social media sites designed to be addictive when they don't
          | have the mental faculties to tell real from not real. Many
          | adults seem to suffer from this as well. Plus, kids don't
          | understand that the internet is forever; there's really no
          | need for an adult looking for a job or running for office to
          | be crippled by a questionable post they made as an edgy teen.
          | 
          | I'm against a lot of government regulation, but in this case
          | I am even more against feeding developing kids to an
          | algorithm.
          | 
          | Just remove the temptation and pressure altogether.
        
           | runsWphotons wrote:
           | The internet is just as real as anything else. Out of touch.
        
         | lukev wrote:
         | But the only way to do this is to require ID checks,
         | effectively regulating and destroying the anonymous nature of
         | the internet (and probably unconstitutional under the First
         | Amendment, to boot.)
         | 
         | It's the same problem with requiring age verification for porn.
         | It's not that anyone _wants_ kids to have easy access to this
         | stuff, but that any of these laws will either be (a)
         | unenforceable and useless, or (b) draconian and privacy-
         | destroying.
         | 
         | The government doesn't get to know or regulate the websites I'm
         | visiting, nor should it. And "protecting the children" isn't a
         | valid reason to remove constitutional rights from adults.
         | 
         | (And if it is, let's start talking about gun ownership
         | first...)
        
           | thinkingtoilet wrote:
           | > effectively regulating and destroying the anonymous nature
           | of the internet.
           | 
           | Not at all. Just the social media sites, which are
           | objectively bad for kids. As an adult, you do what you want
           | on the internet.
        
             | lukev wrote:
             | And what makes a site a social media site? Anywhere you can
             | post interactive content?
             | 
             | You do realize that laws like this would apply to sites
             | like HN, Reddit, the comment section of every blog, and
             | every phpBB forum you ever used? It's not just Instagram
             | and Tiktok.
        
               | anonym29 wrote:
               | Trying to force independently owned and operated forums
               | to enforce laws that might not even be applicable in the
               | country that the owners / admins live and work in is
               | going to be about as effective as trying to force foreign
               | VPS/server/hosting providers to delete copyrighted
               | content from their server using laws that don't apply in
               | their jurisdiction.
        
               | thinkingtoilet wrote:
               | I think a perfectly clear line could be drawn that would
               | separate out phpBB from TikTok very easily. I genuinely
                | don't understand this comment: we shouldn't do it because
               | it's hard or the results might be imperfect?
        
               | lukev wrote:
                | Kids _want_ to communicate. Whether it's TikTok,
               | Discord, phpBB, chatting in Roblox or Minecraft, they
               | will if they can.
               | 
               | If we want to "ban social media" we'll need a consistent
               | set of guidelines about what counts as social media and
               | what doesn't, and what exactly the harms are so they can
               | be avoided.
               | 
               | I don't believe that's as easy as you think.
        
               | CalRobert wrote:
               | Any clear line would be gamed pretty quickly I imagine.
        
               | woodruffw wrote:
               | I think your comment would be much stronger if you laid
               | out _precisely_ what you think that line would be.
               | 
               | Laws do not have to be perfect to be good, but they _do_
                | have to be workable. It's not clear that there's a
               | working definition of "social media" that includes both
               | TikTok and Reddit but doesn't include random forums.
        
               | thinkingtoilet wrote:
               | So if a random person on the internet doesn't have a
               | perfect solution then it shouldn't be considered?
        
               | lazyasciiart wrote:
               | Their opinion that it would be straightforward shouldn't
               | be considered.
        
           | JeremyNT wrote:
           | Yes, this isn't the right solution. The power needs to be
           | given to the users.
           | 
           | A better solution is more robust device management, with
           | control given to the device owner (read: the parent). The
           | missing legislative piece is mandating that social media
           | companies need to respond differently when the user agent
           | tells them what to send.
           | 
            | I should be able to take my daughter's phone (which _I_
            | own), set an option somewhere that indicates "this user is
            | a minor," and have it set e.g. an OMIT_ADULT_CONTENT header
            | on every HTTP request it makes. Site owners simply respond
            | differently when they see this.
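            | 
            | A minimal sketch of what the site's side of that could look
            | like, assuming a hypothetical OMIT_ADULT_CONTENT header
            | (nothing here is an existing standard, and the Flask route
            | is purely illustrative):
            | 
            |   # Sketch only: "OMIT_ADULT_CONTENT" is a made-up header.
            |   from flask import Flask, request, jsonify
            |   
            |   app = Flask(__name__)
            |   
            |   SAFE_ITEMS = ["craft tutorial", "homework help"]
            |   ALL_ITEMS = SAFE_ITEMS + ["age-restricted item"]
            |   
            |   @app.route("/feed")
            |   def feed():
            |       # The parent-locked device sets this header on
            |       # every request it makes.
            |       if request.headers.get("OMIT_ADULT_CONTENT") == "1":
            |           return jsonify(SAFE_ITEMS)
            |       return jsonify(ALL_ITEMS)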
        
             | notsound wrote:
             | Another option that allows for better privacy and
             | versatility is the website setting a MIN_USER_AGE header
             | based on IP geolocation.
        
               | Terr_ wrote:
                | Geolocation to that degree is not that reliable and not
               | necessarily 1:1 with jurisdiction or parental intent.
               | 
               | If we're already trusting a parental-locked device to
               | report minor-status, then it's trivial to also have it
               | identify what jurisdiction/ruleset exists, or some finer-
               | grained model of what shouldn't work.
               | 
               | In either case, we have the problem of how to model
               | things like "in the Flub province of the nation of
               | Elbonia children below 150.5 months may not see media
               | containing exposed ankles". OK, maybe not quite that bad,
               | but the line needs to be drawn somewhere.
        
             | Terr_ wrote:
              | Exactly, any design for this stuff _requires_ parental
              | involvement, because every approach without it is either
              | (A) uselessly weak or (B) creepy and Orwellian.
             | 
             | If we assume parents are involved enough to "buy the thingy
             | that advertises a parental lock", then a whole bunch of
             | less-dumb options become available... And more of the costs
             | of the system will be borne by the people (or at least
             | groups) that are utilizing it.
        
             | cmiles74 wrote:
              | Emancipation of children is also a thing, where a minor
              | may petition the court to be treated as an adult. A
              | blanket age restriction also runs afoul of this.
             | 
             | https://jeannecolemanlaw.com/the-legal-emancipation-of-
             | minor...
        
             | PH95VuimJjqBqy wrote:
             | then propose the RFC.
             | 
             | I haven't read the legislation myself but I don't see why
              | this couldn't still be done; I doubt the legislation
             | specified _how_ to do it.
        
               | Izkata wrote:
               | > "Reasonable age verification method" means any
               | commercially reasonable method regularly used by
               | government agencies or businesses for the purpose of age
               | and identity verification.
               | 
               | So no, that wouldn't work right now.
        
             | cesarb wrote:
              | Sounds like you want PICS (though it works in the opposite
             | direction, with the web site sending the flag, and the
             | browser deciding whether to show the content based on it).
        
             | basil-rash wrote:
             | Already exists, simply include the header
             | 
             | Rating: RTA-5042-1996-1400-1577-RTA
             | 
             | in HTTP responses that include adult content, and every
              | parental-controls package in existence will block it by
             | default, including the ones built into iPhones/etc and
             | embedded webviews. As far as I know all mainstream adult
             | sites include this (or the equivalent meta tag) already.
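              | 
              | For illustration, a server could attach that label roughly
              | like this (a sketch; real deployments typically also add
              | the equivalent meta tag in the HTML):
              | 
              |   # Sketch: tag adult-content responses with the RTA
              |   # label so client-side parental controls block them.
              |   from flask import Flask, Response
              |   
              |   app = Flask(__name__)
              |   RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"
              |   
              |   @app.route("/nsfw-page")
              |   def nsfw_page():
              |       resp = Response("<html>adult content</html>")
              |       resp.headers["Rating"] = RTA_LABEL
              |       return resp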
             | 
             | In general, I don't think communicating to every site you
             | visit that you are a minor and asking them to do what they
             | will with that information is a good idea. Better to filter
             | on the user's end.
        
               | JeremyNT wrote:
               | I honestly wasn't aware of this, and it sounds like a
               | great solution for "adult content." Certainly, the site
               | specifying this is better than the user agent having to
               | reveal any additional details about its configuration.
        
           | jlokier wrote:
           | > But the only way to do this is to require ID checks,
           | effectively regulating and destroying the anonymous nature of
           | the internet
           | 
           | That seems intuitive, but it's not actually true. I suggest
           | looking up zero-knowledge proofs.
           | 
           | Using modern cryptography, it is easy to send a machine-
           | generated proof to your social media provider that your
           | government-provided ID says your age is >= 16, without
           | revealing anything else about you to the service provider
           | (not even your age), and without having to communicate with
           | the government either.
           | 
           | The government doesn't learn which web sites you visit, and
           | the web sites don't learn anything about you other than you
           | are certified to be age >= 16. The proofs are unique to each
           | site, so web sites can't use them to collude with each other.
           | 
           | That kind of "smart ID" doesn't have to be with the
           | government, although that's often a natural starting point
           | for ID information. There are methods which do the same based
           | on a consensus of people and entities that know you, for
           | example. That might be better from a human rights
           | perspective, given how many people do not have citizenship
           | rights.
           | 
           | > (and probably unconstitutional under the First Amendment,
           | to boot.)
           | 
           | If it would be unconstitutional to require identity-revealing
           | or age-revealing ID checks for social media, that's all the
           | more reason to investigate modern technical solutions we have
           | to those problems.
        
             | lukev wrote:
             | It'd be cool if any of the proposed bills actually
             | suggested something like this. They do not. They specify an
             | ID check.
        
             | emporas wrote:
              | Definitely, we can use a government-issued ID, or we can
              | create our own. Social graphs, I call them. Zero-knowledge
              | proofs have so many groundbreaking applications. I have
              | made a comment in the past about how a social graph could
              | be built without the need for any government [1]. We can
              | effectively create one million new governments to compete
              | with existing ones.
             | 
             | [1] https://news.ycombinator.com/item?id=36421679
        
               | paulryanrogers wrote:
               | Governments monopolize violence. At least at the
               | foundational level. When too many of them compete at once
               | it can get very messy very quickly.
        
               | thomastjeffery wrote:
               | I've been thinking a lot lately about decentralized
               | moderation.
               | 
               | All we need to do is replace the word "moderate" with
               | "curate". Everything else is an attestation.
               | 
               | We don't really need a blockchain, either. Attestations
               | can be asserted by a web of trust. Simply choose a
               | curator (or collection of curators) to trust, and you're
               | done.
        
             | satellite2 wrote:
             | I'm not a cryptographer so I might miss something but I
             | have the impression that
             | 
              | - either a stolen card can be reused thousands of times,
              | meaning it's so easy to get a fake that it's not worth the
              | implementation cost,
              | 
              | - or there is a way to uniquely identify a card, and then
              | it becomes another identifier like tracking IDs.
        
               | neom wrote:
               | It would be neat if some authority like the passport
               | office or social security office also provided a virtual
               | ID that includes the features OP described and allowed
               | specific individual attributes to be shared or not
               | shared, revoked any time, much like when you authenticate
               | a 3rd party app to gmail or etc.
        
               | ndriscoll wrote:
               | Assuming you can make active queries to the verifier, you
               | could do something like
               | 
               | - Have your backend generate a temporary AES key, and
               | create a request to the verifier saying "please encrypt a
               | response using AES key A indicating that the user coming
               | from ip X.Y.Z.W is over 16". Encrypt it with a known
               | public key for the verifier. Save the temporary AES key
               | to the user's session store.
               | 
               | - Hand that request to the user, who hands it to the
               | verifier. The verifier authenticates the user and gives
               | them the encrypted okay response.
               | 
               | - User gives the response back to your backend.
               | 
               | Potentially the user could still get someone to auth for
               | them, but it'd at least have to be coming from the same
               | IP address that the user tried to use to log into the
               | service. The verifier could become suspicious if it sees
               | lots of requests for the same user coming from different
               | IP addresses, and the service would become suspicious if
               | it saw lots of users verifying from the same IP address,
               | so reselling wouldn't work. You could still find an
               | over-16 friend and have them authenticate you without
               | raising suspicions though, much like you can find an
               | over-21 friend to buy you beer and cigarettes.
               | 
               | Since you use a different key with each user request, the
               | verifier can't identify the requesting service. Both the
               | service and the verifier know the user's IP, so that's
               | not sensitive. If you used this scheme for over-16 vs.
               | over-18 vs. over-21 services, the verifier _does_ learn
               | what level of service you are trying to access (i.e. are
               | you buying alcohol, looking at porn, or signing up for
               | social media). Harmonizing all age-restricted vices to a
               | single age of majority can mitigate that. Or, you could
               | choose to reveal the age bucket to the service instead of
               | the verifier by having the verifier always send back the
               | maximum bucket you qualify for instead of the service
               | asking whether the user is in a specific bucket.
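                | 
                | A rough Python sketch of that handshake, purely
                | illustrative: Fernet stands in for the "temporary AES
                | key", RSA-OAEP for the verifier's published public key,
                | and the claim format is made up.
                | 
                |   import json
                |   from cryptography.fernet import Fernet
                |   from cryptography.hazmat.primitives import hashes
                |   from cryptography.hazmat.primitives.asymmetric import (
                |       padding, rsa)
                |   
                |   oaep = padding.OAEP(
                |       mgf=padding.MGF1(hashes.SHA256()),
                |       algorithm=hashes.SHA256(), label=None)
                |   
                |   # Verifier's long-term keypair; the public half
                |   # is published for services to use.
                |   ver_priv = rsa.generate_private_key(
                |       public_exponent=65537, key_size=2048)
                |   ver_pub = ver_priv.public_key()
                |   
                |   # Service backend: temporary key, encrypted request.
                |   tmp_key = Fernet.generate_key()
                |   req = ver_pub.encrypt(json.dumps({
                |       "key": tmp_key.decode(), "ip": "203.0.113.7",
                |       "claim": "age>=16"}).encode(), oaep)
                |   # 'req' is handed to the user, who forwards it.
                |   
                |   # Verifier: decrypt, authenticate the user out of
                |   # band, then answer under the temporary key.
                |   r = json.loads(ver_priv.decrypt(req, oaep).decode())
                |   ok = Fernet(r["key"].encode()).encrypt(json.dumps({
                |       "ip": r["ip"], "claim": r["claim"],
                |       "ok": True}).encode())
                |   
                |   # Service backend: user returns 'ok'; check it.
                |   res = json.loads(Fernet(tmp_key).decrypt(ok).decode())
                |   assert res["ok"] and res["ip"] == "203.0.113.7"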
        
             | johnhenry wrote:
             | > I suggest looking up zero-knowledge proofs.
             | 
             | Sure, but is the Florida legislature actually looking into
             | stuff like this?
        
             | thomastjeffery wrote:
              | You can send a proof that _someone's_ government-provided
             | ID says that their age is >= 16.
             | 
             | That's not enough proof to levy a requirement.
        
             | Aurornis wrote:
             | > The government doesn't learn which web sites you visit,
             | and the web sites don't learn anything about you other than
             | you are certified to be age >= 16.
             | 
             | If the zero-knowledge proof doesn't communicate _anything_
             | other than the result of an age check, then the trivial
             | exploit is for 1 person to upload an ID to the internet and
             | every kid everywhere to use it.
             | 
             | It's not sufficient to check if someone has access to an ID
             | where the age is over a threshold. Implementing a 1:1
             | linkage of real world ID to social media account closes the
             | loophole where people borrow, steal, or duplicate IDs to
             | bypass the check.
        
           | einpoklum wrote:
           | > destroying the anonymous nature of the internet
           | 
           | Aren't the really problematic social networks the ones where
           | you've lost your privacy and anonymity long ago and are being
           | tracked and mined like crazy?
        
             | lukev wrote:
             | That's like saying "80% of the internet has gone to shit,
             | might as well destroy the remaining good 20%".
        
               | jf22 wrote:
               | I don't think it's like saying that at all.
        
           | paulddraper wrote:
           | > privacy-destroying
           | 
           | We ARE talking about social media.
           | 
           | Like, the least private software on the planet.
        
           | blitz_skull wrote:
           | You know, I'm not really sure that requiring IDs for access
           | to porn / social media is a terrible idea. Sure it's been
           | anonymous and free since the advent of the internet, but
           | perhaps it's time to change that. After all, we don't allow a
           | kid into a brothel or allow them to engage in prostitution
           | (for good reasons), and porn is equally destructive.
           | 
           | But with the topic at hand being social media, I think a lot
           | of the same issues and solutions apply. It's harmful to allow
           | kids to interact with anyone and everyone at any given time.
           | Boundaries are healthy.
           | 
           | Aaaaand, finally there's much less destruction of human
           | livelihood by guns than both of the aforementioned topics if
           | we measure "destruction" by "living a significantly
           | impoverished life from the standard of emotional and mental
           | wellbeing". I doubt we could even get hard numbers on the
           | number of marriages destroyed by pornography, which yield
           | broken households, which yield countless emotional and mental
           | problems.
           | 
           | So, no, guns aren't something we should discuss first. Also,
           | guns have utility including but not limited to defending
           | yourself and your family. Porn has absolutely zero utility,
           | and social media is pretty damn close, but not zero utility.
        
             | tempestn wrote:
             | You think watching porn is equally destructive to engaging
             | in prostitution? I'd hate to see what kind of porn you're
             | watching.
        
             | diputsmonro wrote:
             | The biggest problem with this is how we would define
             | "porn". Some states are currently redefining the existence
             | of a transgender person in public as an inherently lewd act
             | equivalent to indecent exposure.
             | 
             | I have no doubt that if your proposal were to pass that
             | there would be significant efforts from extremist
             | conservatives to censor LGBT+ communities online by
             | labeling sex education or mere discussion of our lives as
             | pornographic. How are LGBT+ people supposed to live if our
             | very existence is considered impolite?
             | 
             | Nevermind the fact that the existence of a government
             | database of all the (potentially weird) porn you look at is
             | a gold mine for anyone who wants to blackmail or pressure
             | you into silence.
             | 
             | The horrors and dangers of porn are squarely a domestic and
             | family issue. The government does not need to come into my
             | bedroom and look over my shoulder.
        
           | paulddraper wrote:
           | > destroying the anonymous nature of the internet (and
           | probably unconstitutional under the First Amendment, to
           | boot.)
           | 
           | The First Amendment guarantees free expression, not
           | _anonymous_ expression.
           | 
           | For example, there are federal requirements for
           | identification for political messages. [1] These requirements
           | do not violate the First Amendment.
           | 
           | [1] https://www.fec.gov/help-candidates-and-
           | committees/advertisi...
        
             | o11c wrote:
             | In particular, "anonymous speech is required if you want to
             | have free speech" is actually a very niche position, not a
             | mainstream one. It just happens to be widely spammed in
             | certain online cultures.
        
               | paulddraper wrote:
               | Correct.
               | 
                | I am a staunch believer in the moral and societal good of
               | free speech.
               | 
               | Anonymous speech is far more dubious.
               | 
               | Like, protests seem valuable. Protests while wearing
               | robes and masks however...
        
               | TillE wrote:
               | > Protests while wearing robes and masks however
               | 
               | This is absolutely permitted in America. It is illegal in
               | countries like Germany which have no strong free speech
               | protections.
        
               | jonathankoren wrote:
               | No. It is not "absolutely permitted in America".
               | 
               | They're Klan Acts, because a bunch of guys in masks and
               | robes marching through is obviously a threat of violence.
               | 
               | https://en.wikipedia.org/wiki/Anti-mask_law
        
               | TillE wrote:
               | America has a very long tradition of anonymity being part
               | of free speech, going back to the Federalist Papers. This
               | is not some new online issue.
        
           | Terr_ wrote:
           | > But the only way to do this is to require ID checks
           | 
            | Not _necessarily_, consider the counterexample of devices
            | with parental controls which--when locked--will always send
            | a "this person is a minor" header. (Or "this person hits
            | the following jurisdictional age categories", or some blend
            | of enough detail to be internationally useful and little
            | enough to be reasonably private and not insane to obey.)
            | 
            | That would mostly put control into the hands of parents, at
            | the expense of sites needing some kind of code library that
            | can spit out a "block or not" result.
        
           | RandallBrown wrote:
           | The definition of "social media" in this bill actually seems
           | to exempt anonymous social networks since it requires the
           | site "Allows an account holder to interact with or track
           | other account holders".
        
           | giantg2 wrote:
           | "probably unconstitutional under the First Amendment, to
           | boot"
           | 
           | Probably not. Minors have all sorts of restrictions on
           | rights, including first amendment restrictions such as in
           | schools.
           | 
           | "(And if it is, let's start talking about gun ownership
           | first...)"
           | 
           | Are you advocating for removing ID checks for this? If not,
           | it seems that this point actually works against your
           | argument.
           | 
           | Not saying that I agree with a ban, but your arguments
           | against it don't really stand.
        
           | qwertox wrote:
            | Lately I'm repeatedly reminded of how in Ecuador citizens,
            | when interviewed during a protest, see it as a normal thing
            | to state their name as well as their personal ID number on
            | camera while speaking about their position with regard to
            | the protest. They stand by what they are saying without
            | hiding.
           | 
            | For about half a year now I've noticed the German Twitter
            | section getting sunk in hate posts, people disrespecting
            | each other, ranting about politicians or ways of thinking,
            | but being really hateful. It's horrible. I've adblocked the
            | "Trending" section away, because it's the door to this
            | horrible place where people don't have anything good to
            | share anymore but disrespect and hate.
           | 
            | This made me think that what we're really in need of, at
            | least here in Germany, is a Twitter alternative where people
            | register using their eID and can only post under their real
            | name. Have something mean to say? Say it, but attach your
            | name to it.
           | 
           | This anonymity in social media is really harming German
           | society, at least as soon as politics are involved.
           | 
           | I don't know exactly how it is in the US but apparently it
           | isn't as bad as here, at least judging from the trending
           | topics in the US and skimming through the posts.
        
             | Kye wrote:
             | People have zero qualms about being absolute ghouls under
             | their wallet names. The people with the most power in
             | society don't need anonymity. The people with the least
             | often can't safely express themselves without it.
             | 
             | Also:
             | 
             | https://theconversation.com/online-anonymity-study-found-
             | sta...
             | 
             | >> _" What matters, it seems, is not so much whether you
             | are commenting anonymously, but whether you are invested in
             | your persona and accountable for its behaviour in that
             | particular forum. There seems to be value in enabling
             | people to speak on forums without their comments being
             | connected, via their real names, to other contexts. The
             | online comment management company Disqus, in a similar
             | vein, found that comments made under conditions of durable
             | pseudonymity were rated by other users as having the
             | highest quality. "_
        
               | qwertox wrote:
               | There are two points which matter:
               | 
               | - No more bots or fake propaganda accounts.
               | 
               | - Illegal content, such as insults or the like, will not
               | get published. And if it does, it will have direct
               | consequences.
               | 
                | I'm also not leaning towards a requirement to have all
                | social networks ID'd, but I think that a Twitter
                | alternative which enables more serious discussion should
                | exist. A place where politicians, journalists, or just
                | citizens can post their content and get comments on it,
                | without all that extreme toxicity from Twitter.
        
               | belval wrote:
                | The thing is, the political climate is very toxic, and
                | the absence of anonymity can have a real impact for
                | things that are basically wrongthink.
                | 
                | Say, for example, I held the opinion that the immigration
                | threshold should be lower. No matter how many non-
                | xenophobic justifications I can put on that opinion, my
                | colleagues, possibly on H1Bs, could and would look up my
                | opinion on your version of Twitter, and it would have a
                | real impact on my work life.
                | 
                | There is a reason we hold voting in private: when boiled
                | down to their roots, the principles that guide your
                | opinions are usually irreconcilable with someone else's,
                | and we preserve harmony by keeping everyone ignorant of
                | their colleagues' political opinions. It's not a bad
                | system, but it's one that requires anonymity.
        
               | DylanDmitri wrote:
               | Or, to post on a political forum you must have an ID. You
                | can have and post from multiple accounts, but your ID and
               | all associated accounts can be penalized for bad
               | behavior.
        
             | pohuing wrote:
             | Plenty of hate under plain names on Facebook, been that way
             | for a decade and I doubt it will change with ID
             | verification.
        
             | simmerup wrote:
             | Attach their pictures too, so you can see the ghoul
             | spouting hate is a basement dweller
        
             | Zak wrote:
              | The algorithm powering the trending section, which rewards
              | angry replies and accusatory quote-tweets, is at least as
              | good a candidate as anonymity when looking for a source of
              | harm to political discourse.
        
               | adaptbrian wrote:
                | Take it a step further: ban engagement-based algorithmic
                | feeds. I've said this and I'll continue to say it: this
                | type of behavioral science was designed at FB by a small
                | group of people and needs to be outlawed. It never
                | should have been allowed to take over the new-age
                | monetization economy. There's so much human potential in
                | the internet, and it's absolutely train-wrecked at the
                | moment because of Facebook.
        
             | claytongulick wrote:
             | I wonder what percentage of the hate stuff is bots.
        
           | giancarlostoro wrote:
           | > But the only way to do this is to require ID checks
           | 
           | COPPA has entered the building. If you're under 13 and a
           | platform finds out, they'll usually ban you until you prove
           | that you're not under 13 (via ID) or can provide signed forms
           | from your parent / legal guardian.
           | 
           | I've seen dozens of people if not more over the years banned
           | from various platforms over this. We're talking Reddit,
           | Facebook, Discord and so on.
           | 
            | I get what you're saying, but it kind of is a thing already;
            | all one has to do is raise the age limit from 13 to, say, 16
            | and voila.
        
             | woodruffw wrote:
             | "Finds out" is the operative part. COPPA is not a
             | _proactive_ requirement; it 's a _reactive_ one. Proactive
             | legislation is a newer harm that can 't easily be predicted
             | based on past experiences with reactive laws.
        
               | giancarlostoro wrote:
                | Indeed, nothing is stopping said companies from scanning
                | and assessing the age of a user uploading selfies,
                | though. This is allegedly something that TikTok does. My
                | point being, the framework is there, and when people
                | actually report minors, the companies have to take it
                | seriously or face serious legal consequences.
        
           | NoMoreNicksLeft wrote:
           | >But the only way to do this is to require ID checks,
           | effectively regulating and destroying the anonymous nature of
           | the internet
           | 
           | Ban portable electronics for children. Demand that law
           | enforcement intervene any time it's spotted in the wild. If
           | you still insist that children be allowed phones, dumb flip
           | phones for them.
           | 
           | It could be done if there was the will to do it, it just
           | won't be done.
        
           | slily wrote:
           | People can post all kinds of illegal things online and no one
           | is suggesting that content should be approved before it can
           | be visible on the Internet. It doesn't have to be strictly
           | enforced to act as a deterrent. How effective of a deterrent
           | it would be has yet to be seen.
        
           | 1vuio0pswjnm7 wrote:
           | How many social media users who create accounts and "sign in"
           | are "anonymous". How would targeted advertising work if the
           | website did not "know" their ages and other demographic
           | information about them. Are the social media companies lying
           | to advertisers by telling them they can target persons in a
           | certain age bracket.
        
           | huytersd wrote:
           | Ah so be it. I don't care much for the things that come from
           | anonymous culture. I want gatekeepers. This tyranny of the
           | stupid online is pretty tiresome.
        
           | spogbiper wrote:
           | I propose the Leisure Suit Larry method. Just make users
           | answer some outdated trivia questions that only olds will
           | know when they sign up for an account.
        
             | badpun wrote:
             | In the Internet era, the answers will just be googleable.
              | People will quickly compile a page with all possible
             | questions and with answers to them.
        
               | jonathankoren wrote:
               | But with ChatGPT, all the answers will be wrong.
        
           | onion2k wrote:
           | _effectively regulating and destroying the anonymous nature
           | of the internet_
           | 
           | Social media is _on_ the internet. It is not the internet.
        
           | dustedcodes wrote:
           | > The government doesn't get to know or regulate the websites
           | I'm visiting, nor should it.
           | 
           | They already do (e.g. gambling), and in my opinion they
           | should. Federal regulation is IMHO desired in cases where
           | there is a huge power imbalance between corporations and
           | consumers to the detriment of the nation. Social media
           | companies wield more power than the average parents can cope
            | with, and it's hugely detrimental to have entire generations
            | of kids growing up as completely useless, insecure,
            | brainwashed, unconfident, depressed virgins who can't even
            | articulate the difference between man and woman. That's a
            | sick, dying society, and it's not good, so it makes sense to
            | regulate the thing which is undoubtedly the root cause of it.
        
         | kcrwfrd_ wrote:
          | Yes. Rather than mandating verification, can we just mandate
          | that there be a registry, or that websites be legally required
          | to include a particular HTTP header, combined with opt-in
          | infrastructure for parents to use?
         | 
         | e.g. You could set up a restricted account on a device with a
         | user birthdate specified. Any requests for websites that return
         | an AGE_REQUIRED header that don't validate would be rejected.
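          | 
          | A device-side sketch of that check (the AGE_REQUIRED header is
          | hypothetical, and the birth year would come from the
          | restricted profile the parent set up):
          | 
          |   # Drop any response advertising an age above the profile's.
          |   from datetime import date
          |   import requests
          |   
          |   PROFILE_BIRTH_YEAR = 2011  # set by the parent
          |   
          |   def fetch_if_allowed(url):
          |       resp = requests.get(url)
          |       required = int(resp.headers.get("AGE_REQUIRED", 0))
          |       age = date.today().year - PROFILE_BIRTH_YEAR
          |       if age < required:
          |           return None  # block: site requires an older user
          |       return resp.text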
        
         | blitz_skull wrote:
         | You're not the only one!
         | 
         | I also believe that this is a Big Deal(tm) that we need to take
         | seriously as a nation. I have yet to see any HN commentator
         | offer a robust pro-social media argument that carries any
         | weight in my opinion. The most common "they'll be isolated from
         | their peers" argument seems pretty superficial and can easily
          | be worked around with even a tiny amount of effort on the
         | parents' part.
         | 
         | As an added bonus, this latest legislation removes the issue of
         | "everyone is doing it". I mean, sure, a lot still will be--but
         | then it's illegal and you get to have an entirely separate
         | conversation with your kid. :)
        
         | ben_w wrote:
         | > I might be the only one here in favor of this, and wanting to
         | see a federal rollout.
         | 
         | I'm not American, I think it's perfectly reasonable to ban kids
         | from the internet just by applying the logic used for film
         | classification. Even just the thumbnails for YouTube (some)
         | content can go beyond what I'd expect for a "suitable for all
         | audiences" advert.
         | 
         | This isn't an agreement with the specific logic used for film
         | classification: I also find it really weird how 20th century
          | media classification treated murder as a perfectly acceptable
          | subject in kids' shows while the mere existence of functioning
         | nipples was treated as a sign of the end times (non-functional
         | nipples, i.e. those on men, are apparently fine).
         | 
         | Also, I find it hilariously ironic that Florida also passed the
         | "Stop Social Media Censorship Act". No self-awareness at all.
        
         | chankstein38 wrote:
          | No, I definitely agree. I'm a little skeptical of how they'll
          | enforce this, but ultimately I think fewer kids on the internet
          | and social media will be a positive, and I agree it doesn't
         | seem like parents have managed to figure out how to address
         | this.
        
         | dmitrygr wrote:
         | > I might be the only one here in favor of this
         | 
         | Not even close. I am with you
        
         | robotnikman wrote:
          | Same here. With the way the internet is nowadays, it's probably
          | best to keep kids off the internet until they are older. One
          | just has to look at what's on places like Youtube 'Kids' to see
         | all the stuff that is not kid friendly and probably detrimental
         | to their mental health.
        
         | huytersd wrote:
         | Right there with you. This is probably the only thing I'm on
         | board with the republicans about.
        
         | jijijijij wrote:
         | I am an adult and I wish someone would take social media away
         | from me. Honestly, I think social media has done more harm than
         | good and I wish it would just cease to exist.
         | 
         | However, especially in Florida, social media may be the only
         | way for some teens to escape political and religious lunacy and
         | I fear for them. I think it's not wise to applaud them taking
         | away means of communication to the "outside", in the context of
         | legislation trends and events there.
        
         | EasyMark wrote:
         | I can't agree. This teaches kids that the government is the
          | answer to everything. This should 100% be the responsibility
         | and decision of parents. Kids are different, and these one-
         | size-fits-all authoritarian tactics that have become a
         | signature of the current GOP Floridian government are just the
         | beginning of the totalitarian christofascist laws that they
         | want to implement. Before you ask, I am a parent, and my kid's
         | devices all have this crap blocked and likely will remain that
         | way until he's at least 15, give or take a year depending on
         | what I determine when he gets to that age. He knows that there
         | are severe ramifications if he tries to work around my
         | decision, and will lose many, many privileges if such a thing
         | happens.
        
         | iouwhldkfjgklj wrote:
         | I'm very anti-giving-kids-screens and am also against this
         | bill.
         | 
         | Land of the free - let people do what they want. I know _my_
          | kid isn't getting anywhere near a screen though.
        
         | tootie wrote:
         | I have two teens and have yet to see the negative effects of
         | social media for them or any of their peers. Not to say it
         | doesn't exist, but I sincerely doubt it's as awful as the
         | doomsayers think. My personal observation of being raised in he
         | 80s is that kids were far more awful to each other then than
         | now.
        
       | insane_dreamer wrote:
       | As the parent of an 11 year old, I wholeheartedly agree. The
       | science is pretty clear that social media has had a very
       | detrimental effect on teens' mental health. We should treat it
       | like we do other substances that are harmful for teens; once
        | they're older they are better able to make wiser decisions as to
        | whether, when, and how they want to consume social media.
       | 
       | It may be impossible to enforce outside school, but so is the
       | 13-year old limit on opening accounts (my kid's classmates all
       | have accounts; they just lie about their age). But that's not a
       | reason not to have it on the books, as it sets a social standard,
       | and more importantly puts pressure on social media companies.
        
         | jeffbee wrote:
         | The evidence is not all that solid. The most demonstrable link
         | is between use of portable devices at bedtime and poor sleep
         | quality. Everything else has mixed evidence.
        
         | coolbreezetft24 wrote:
          | Seems like it does even more harm to adults; so much of the
          | "content" is just a cesspool of conspiracies and vitriol.
        
       | k12sosse wrote:
       | Does this mean Florida will be the first state off the Internet?
       | 
       | If traffic source is Florida, redirect to null
       | 
       | Easier than implementing an ID verification platform that isn't a
       | massive tracking anklet for a 3rd party or government.
        
       | spacebacon wrote:
       | I would like to see regulation on notifications to address
       | reaction driven addictions as opposed to an outright ban.
       | Classical conditioning is clearly the issue at hand but opponents
       | are not referencing the proven science enough.
       | 
       | If we don't teach children how to use these platforms in
       | moderation now they will certainly not be educated on how to use
        | them responsibly in adulthood. I'm not totally against an
        | outright ban, but we are missing educational opportunities with
        | what is likely an unenforceable attack on the problem.
        
       | jshaqaw wrote:
       | Does YouTube count? Yes I think TikTok is largely a hellscape for
       | my daughters around this age. But one of them learns all sorts of
       | crafting projects via YouTube and the other has taught herself an
        | incredible amount on how to draw. Would be a shame to throw
        | out access to resources like this with the bathwater.
        
         | pavlov wrote:
         | The law has a list of applications that are specifically
         | excepted. User tzs posted it in this thread.
         | 
         | It seems like YouTube would be covered by the ban because it
         | doesn't fall under any of the exceptions. The closest one is
         | this:
         | 
         |  _" A streaming service that provides only licensed media in a
         | continuous flow from the service, website, or application to
         | the end user and does not obtain a license to the media from a
         | user or account holder by agreement to its terms of service."_
         | 
         | But of course YouTube does "obtain a license to the media from
         | a user or account holder", so it's not covered by this
         | exception.
        
         | AlecSchueler wrote:
         | Does she need her own account in her name where she can upload
         | though?
        
       | Ekaros wrote:
       | I would prefer it to be a straight 18. That's a much more
       | reasonable limit, and I think 16-18 is a very vulnerable group
       | when it comes to the effects of social media.
        
       | captainmuon wrote:
       | I guess for me it depends on what the law considers "social
       | media".
       | 
       | Is something like the bulletin boards we used to have around the
       | late 90s/early 2000s social media? What about chat rooms? Local
       | social web sites for the school or your city? I think a lot of
       | these things can even be beneficial, if I think about my own
       | experiences as a somewhat introverted teenager.
       | 
       | And what about things like Netflix, YouTube, podcasts? They can
       | be just as harmful as TikTok and Instagram. Especially on YouTube
       | you have a lot of similar content.
       | 
       | I've found accounts that claim to be official accounts of
       | children's shows - maybe they even are - and which are full of
       | nonsensical videos, just action scenes randomly cut together from
       | multiple episodes. It's like crack for children. Of course
       | YouTube doesn't do anything; they want you to pay for YouTube
       | Kids. And the rights holders want you to buy the content, so they
       | leave the poor quality stuff up.
       | 
       | The thing is, exploitative content is always going to be created
       | as long as there are incentives to do so. You can ban stuff, but
       | it's whack-a-mole, and you are going to kill a lot of interesting
       | stuff as collateral damage. The alternative is much harder:
       | change the incentives so we can keep our cool technology and
       | people are not rewarded for making harmful stuff with it. But
       | that would require economic and political changes, and people
       | don't like to think about it.
        
         | scythe wrote:
         | > I guess for me it depends on what the law considers "social
         | media".
         | 
         | It's a bill written by the Florida House of Representatives, so
         | there's a definition there. Mind you, it's the Florida House,
         | which has put out some extremely bad laws in its current
         | session -- from "Parental Rights in Education" to the Disney
         | speech retaliation. But given that this is a less ostensibly
         | partisan issue, there are reasons for hope.
         | 
         | The definition seems narrowly tailored. I think that part (d)1d
         | is a questionable choice, since most social media platforms
         | will probably argue that they are not really "designed" to be
         | addictive (for various definitions of "designed" and
         | "addictive"). It appears that specific exemptions were made for
         | YouTube, Craigslist and LinkedIn (without mentioning those
         | companies by name), and algorithmic content selection is part
         | of the definition. This is one of the better versions of this
         | law I could imagine being written by a state legislature,
         | though it isn't without its faults. It's nice to see my home
         | state in the news for something good for once.
         | 
         | I agree that YouTube is a particularly difficult case. But part
         | of the problem comes from using it as a digital pacifier,
         | rather than peer pressure. There's no particular reason why the
         | technology market should produce a free stream of child-
         | appropriate videos. Ad-supported media has its ups and downs,
         | but when the targets of those ads are young children, it's much
         | harder to defend. And parents have more control over the
         | behavior of their 4-year-olds than their 14-year-olds.
         | 
         | Here's the definition:
         | 
         | >(d) "Social media platform:"
         | 
         | >1. Means an online forum, website, or application offered by
         | an entity that does all of the following:
         | 
         | >a. Allows the social media platform to track the activity of
         | the account holder.
         | 
         | >b. Allows an account holder to upload content or view the
         | content or activity of other account holders.
         | 
         | >c. Allows an account holder to interact with or track other
         | account holders.
         | 
         | >d. Utilizes addictive, harmful, or deceptive design features,
         | or any other feature that is designed to cause an account
         | holder to have an excessive or compulsive need to use or engage
         | with the social media platform.
         | 
         | >e. Allows the utilization of information derived from the
         | social media platform's tracking of the activity of an account
         | holder to control or target at least part of the content
         | offered to the account holder.
         | 
         | >2. Does not include an online service, website, or application
         | where the predominant or exclusive function is:
         | 
         | >a. Electronic mail.
         | 
         | >b. Direct messaging consisting of text, photos, or videos that
         | are sent between devices by electronic means where messages
         | are shared between the sender and the recipient only, visible
         | to the sender and the recipient, and are not posted publicly.
         | 
         | >c. A streaming service that provides only licensed media in a
         | continuous flow from the service, website, or application to
         | the end user and does not obtain a license to the media from a
         | user or account holder by agreement to its terms of service.
         | 
         | >d. News, sports, entertainment, or other content that is
         | preselected by the provider and not user generated, and any
         | chat, comment, or interactive functionality that is provided
         | incidental to, directly related to, or dependent upon provision
         | of the content.
         | 
         | >e. Online shopping or e-commerce, if the interaction with
         | other users or account holders is generally limited to the
         | ability to upload a post and comment on reviews or display
         | lists or collections of goods for sale or wish lists, or other
         | functions that are focused on online shopping or e-commerce
         | rather than interaction between users or account holders.
         | 
         | > f. Interactive gaming, virtual gaming, or an online service,
         | that allows the creation and uploading of content for the
         | purpose of interactive gaming, edutainment, or associated
         | entertainment, and the communication related to that content.
         | 
         | > g. Photo editing that has an associated photo hosting
         | service, if the interaction with other users or account holders
         | is generally limited to liking or commenting.
         | 
         | > h. A professional creative network for showcasing and
         | discovering artistic content, if the content is required to be
         | non-pornographic.
         | 
         | > i. Single-purpose community groups for public safety if the
         | interaction with other users or account holders is generally
         | limited to that single purpose and the community group has
         | guidelines or policies against illegal content.
         | 
         | > j. To provide career development opportunities, including
         | professional networking, job skills, learning certifications,
         | and job posting and application services.
         | 
         | > k. Business to business software.
         | 
         | > l. A teleconferencing or videoconferencing service that
         | allows reception and transmission of audio and video signals
         | for real time communication.
         | 
         | > m. Shared document collaboration.
         | 
         | > n. Cloud computing services, which may include cloud ...
         | 
         | > o. To provide access to or interacting with data
         | visualization platforms, libraries, or hubs.
         | 
         | > p. To permit comments on a digital news website, if the news
         | content is posted only by the provider of the digital news
         | website.
         | 
         | > q. To provide or obtain technical support for a platform,
         | product, or service.
         | 
         | > r. Academic, scholarly, or genealogical research where the
         | majority of the content that is posted or created is posted or
         | created by the provider of the online service, website, or
         | application and the ability to chat, comment, or interact with
         | other users is directly related to the provider's content.
         | 
         | > s. A classified ad service that only permits the sale of
         | goods and prohibits the solicitation of personal services or
         | that is used by and under the direction of an educational
         | entity, including:
         | 
         | > (I) A learning management system;
         | 
         | > (II) A student engagement program; and
         | 
         | > (III) A subject or skill-specific program.
        
           | jcranmer wrote:
           | > The definition seems narrowly tailored
           | 
            | The fact that there are well over a dozen exceptions carved
            | out strongly suggests that the definition is anything but
            | narrowly tailored, and that the authors of the bill preferred
            | to add exceptions for everyone who objected rather than
            | rethink their broad definitions.
           | 
           | 1a-c will be trivially satisfied by anything that "has user
           | accounts" and "allow users to comment". 1e is clearly meant
           | to cover "algorithmic" recommendations, but it's worded so
           | broadly that a feature that includes "threads you've
           | commented on" would satisfy this prong. 1d is problematic; it
           | can be interpreted so narrowly that nothing applies, or so
            | broadly that everything applies. IANAL, but I think you'd
            | have a decent shot at challenging this prong as
            | unconstitutionally vague.
           | 
            | Discounting 1d, this means that virtually every website in
            | existence qualifies as a social media site, at least before
           | you start applying exceptions. Not just Facebook or Twitter,
           | but things like Twitch, Discord, Paradox web forums, Usenet,
           | an MMO game, even news sites and Wikipedia are going to
           | qualify as social media platforms.
           | 
           | Actually, given that it's not covered by any of the
           | exceptions, Wikipedia is a social media platform according to
           | Florida, and I guess would therefore be illegal for kids to
           | use. Even more hilariously, Blackboard (the software I had to
           | use in school for all the online stuff at school) qualifies
           | as a social media platform that would be illegal for kids to
           | use.
        
             | soared wrote:
             | Agreed - they're effectively banning most commonly used
             | websites and then carving out exceptions.
        
             | scythe wrote:
             | >Discounting 1d, this means that virtually every website in
             | existence qualifies as social media sites
             | 
             | Most websites would not satisfy 1e. Hacker News, for
             | example. Traditional forums do not satisfy 1e.
             | 
             | >1e is clearly meant to cover "algorithmic"
             | recommendations, but it's worded so broadly that a feature
             | that includes "threads you've commented on" would satisfy
             | this prong.
             | 
             | There could be some haggling over this, but I don't think
             | that reading it in the least reasonable possible way is
             | likely to fly in court. In particular, 1e stipulates
             | "content offered". If "threads you've commented on" is
             | content that the user has to _request_ , e.g. by viewing a
             | profile page or an inbox, that might not be considered
             | "offering". It also says "control or target", but content
             | with a simple bright-line definition like that is probably
             | not controlled and certainly not targeted.
             | 
              | >The fact that there are well over a dozen exceptions
              | carved out strongly suggests that the definition is
              | anything but narrowly tailored
             | 
             | >at least before you start applying exceptions.
             | 
             | Yes, the definition is excessively broad if you ignore the
             | majority of the text in the definition. This is a circular
             | argument.
             | 
             | >Actually, given that it's not covered by any of the
             | exceptions, Wikipedia is a social media platform
             | 
             | Exception 2m, shared document collaboration. But I don't
             | think Wikipedia satisfies 1e either.
             | 
             | >Blackboard (the software I had to use in school for all
             | the online stuff at school) qualifies as a social media
             | platform
             | 
             | Probably qualifies under 2s or 2m. I'm not familiar enough
             | with the platform to know if it satisfies 1e.
        
               | jcranmer wrote:
               | > Most websites would not satisfy 1e. Hacker News, for
               | example. Traditional forums do not satisfy 1e.
               | 
               | Hacker News has a page that lets me see all of the
               | replies from comments I've posted. Posting is clearly
               | "activity of an account holder", and that means there is
               | "at least part of the content" being "control[led]" by
               | that activity.
               | 
               | > If "threads you've commented on" is content that the
               | user has to request, e.g. by viewing a profile page or an
               | inbox, that might not be considered "offering".
               | 
               | You're the one who's criticizing me for "least reasonable
               | possible way", and you're trying to split hairs like
               | this? (FWIW, an example that would qualify under your
               | more restrictive definition is that the downvote button
               | is not shown until you receive enough karma.)
               | 
               | But at the end of the day, it doesn't matter what you
               | think, nor what I think, nor even what a court will
               | think. What matters is how much it will constrain the
               | government using this law to harass a website. And 1e
               | isn't going to provide a barrier for that.
               | 
               | > Yes, the definition is excessively broad if you ignore
               | the majority of the text in the definition. This is a
               | circular argument.
               | 
                | No, it's not. The definition boils down to "everything on
                | the internet is social media, except for these categories
                | we've thought of" (or, more likely, lobbyists pointed out
                | that the definition included them and so they were thrown
                | into the list of exceptions). That the
               | majority of the text of the definition is a list of
               | exceptions doesn't make it a narrow definition; indeed,
               | it just highlights that the original definition sans
               | exceptions is overly broad.
               | 
               | > Probably qualifies under 2s or 2m
               | 
               | Definitely not 2s, it's not "a classified ad service".
                | I'm not sure there are any classified ad services that
                | are "used by or under the direction of an educational
                | agency", but that's what you have to be to qualify under
               | 2s. (This makes me think someone did a bad job of
               | splicing in the exception, and the second half of 2s is
               | supposed to be 2t. Just goes to show you the level of
               | quality being displayed in the drafting of this bill, I
               | suppose.)
        
               | scythe wrote:
               | >Hacker News has a page that lets me see all of the
               | replies from comments I've posted. Posting is clearly
               | "activity of an account holder", and that means there is
               | "at least part of the content" being "control[led]" by
               | that activity.
               | 
               | It doesn't say "control" by the activity, it says control
               | by the _platform_. Control is a little bit difficult to
               | define, but one reasonable necessary condition is that if
               | someone is in control of something, then someone else
               | might do it differently.  "Show me threads I've commented
               | on" should produce the same result regardless of
               | platform. "Show me a random page" should at least have
               | the same probability distribution. But "my
               | recommendations" is fully under the platform's _control_.
               | 
               | I grant that trying to interpret "offer" was a bit of a
               | reach on my part. On the other hand, "control" can be
               | interpreted in a pretty reasonable way to imply some form
               | of meaningful choice.
               | 
               | >You're the one who's criticizing me
               | 
               | I'm not criticizing you, I'm criticizing your argument.
               | It's important to keep a safe emotional distance in this
               | kind of discussion.
               | 
               | >it doesn't matter [...] even what a court will think.
               | What matters is how much it will constrain the government
               | using this law to harass a website. And 1e isn't going to
               | provide a barrier for that.
               | 
               | It certainly does matter what a court will think. Once a
               | couple of precedents are set, it should be possible to
               | identify what legal actions are meritless, and
               | governments bringing frivolous suits may find themselves
               | voted out of office. Unfairly targeted websites can bring
               | a claim of malicious prosecution.
               | 
               | Now, if you don't trust the voters and the courts, that's
               | a different issue, but it's going to affect every law,
               | good or bad. That's just how government works.
               | 
               | >That the majority of the text of the definition is a
               | list of exceptions doesn't make it a narrow definition;
               | indeed, it just highlights that the original definition
               | sans exceptions is overly broad.
               | 
               | If we assume that ad targeting and algorithmic content
               | recommendation are profitable, the original definition
               | clearly constrains the ability of sites to make money
               | while offering user accounts for minors. Lobbyists
               | probably don't want profits constrained for their
               | employers, even if the targeting aspects aren't
               | necessary.
               | 
                | But just because it targets _features that can be added_
                | to every website doesn't mean it targets every website.
                | Most of the Internet functions
               | just fine without needing to create accounts, and when
               | accounts are necessary, they're for buying stuff.
        
           | ramblenode wrote:
           | Thanks for posting the details.
           | 
           | > f. Interactive gaming, virtual gaming, or an online service
           | 
           | This bill is already out of date. The new generation's social
           | media are games like Roblox. And these are as addictive as
           | the old social media.
           | 
           | Good luck with this whack-a-mole. A comprehensive bill would
           | stop this at the source: kids owning smartphones. But
            | addressing smartphones would upset too many parents and too
           | much business, so it won't get done.
        
       | tokai wrote:
       | It's impressive how people here, who should know better, are
       | cheering this on - just because they are parents themselves. Even
       | disregarding the legality, it's not going to work technically.
       | The perceived safety of one's kids really crosses some wires in
       | parents.
        
         | efields wrote:
         | It's good to have this conversation. I don't think anyone here
         | wants the bill enacted as-is (did anyone actually read it? I
         | didn't, and I don't live in FL).
         | 
         | Of course this is technically unenforceable. There will always
         | be workarounds. You could smoke as an 11-year-old if you were
         | determined enough.
         | 
         | But we need more pushback and dialogue on social media's role
         | in the common discourse. For a while, nobody talked about
         | smoking being bad and it became normalized while killing a lot
         | of people. Seatbelts. Dumping waste in the river... most people
         | go with the flow of common consensus until that consensus is
         | scrutinized.
        
         | syrgian wrote:
         | It would definitely work if a parent could report the accounts
         | of all the kids involved and their tiktok/whatever accounts got
         | deleted + their phone numbers and emails got block-listed for
         | the service.
         | 
         | Even more so if the system worked in a centralized way: this
         | email/phone is used exclusively by a kid, so now all on-boarded
         | companies must delete their accounts and not allow them to
         | register again.
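         | 
         | For what it's worth, the per-service check at sign-up could be
         | tiny. Everything below (the registry endpoint, the hashing, the
         | field names) is a hypothetical design, not something in the
         | bill:
         | 
         |     # Hypothetical sketch: before creating an account, ask a
         |     # shared registry whether this contact point was reported
         |     # as belonging to a minor.
         |     import hashlib
         |     import requests
         | 
         |     REGISTRY_URL = "https://registry.example/v1/minor-contact"
         | 
         |     def is_reported_minor(email):
         |         digest = hashlib.sha256(
         |             email.strip().lower().encode()).hexdigest()
         |         resp = requests.post(REGISTRY_URL,
         |                              json={"sha256": digest}, timeout=5)
         |         resp.raise_for_status()
         |         return resp.json().get("reported", False)
         | 
         |     def register(email, password):
         |         if is_reported_minor(email):
         |             raise PermissionError("reported as a minor's contact")
         |         # ... normal account creation would go here ...
         | 
         | The catch is that hashing doesn't really protect the registry's
         | contents (emails are low-entropy and easy to brute-force), so a
         | centralized list like this is itself a sensitive database of
         | children's contact details.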
        
           | voidwtf wrote:
            | Herein lies the rub: every time I've seen this happen in the
            | past, companies just applied blanket bans to accounts,
            | sometimes retroactively, for accounts that were illegal at
            | the time of their creation regardless of the user's current
            | age. Both Google and Twitter did this to users who created
            | accounts before they were 13 but who were adults by then.
           | 
            | If you're going to introduce legislation like this, then it
            | needs to include provisions so that it will not permanently
            | bar those users from the services once they are of age. I
            | manage my children's social media interaction (near zero,
            | with the exception of YouTube); if their accounts get
            | permanently disabled, that would be unfortunate in the
            | future when they are old enough.
        
         | semiquaver wrote:
         | I'm a parent and think this is bullshit. It won't work and it
         | isn't designed to work, it's just political posturing destined
         | to be struck down by courts.
        
       | exabrial wrote:
       | I fail to see how anyone under 18 can legally agree to any kind
       | of contract without a parental co-signature. This should be
       | enforceable without new laws, but I'm glad to make it explicit:
       | progress over perfection.
        
       | rpmisms wrote:
       | This is good. Seriously, the damage it does is massive, kids
       | should not be on Instagram, Snapchat, or Tiktok.
        
         | ibejoeb wrote:
         | I have a hunch that you're right, but there are a few other
         | things that coincided with the rise of social media, so it's
         | hard to tell what has driven the change in kids' psyches. Among
         | them is the explosion of pharmaceuticals that are routinely
         | prescribed to kids.
        
           | rpmisms wrote:
           | I think these things are linked. Either way, kids are having
            | serious issues, so let's roll back the clock and see which
            | innovation caused it. Kids were fine before social media,
           | arguably better.
        
           | owisd wrote:
           | The pharmaceuticals thing is unique to the US, but every
            | country has seen a simultaneous rise in kids' mental health
           | issues, so that rules out pharmaceuticals.
        
         | anshumankmr wrote:
         | What about an Instagram for Kids type thing?
         | 
         | Like there is YouTube for kids,
         | 
         | Make it ad-free (I know this is a pipe dream, but at the very
         | least it should only have ads from companies that do not sell
         | gambling apps), add scrolling time limits that actually work
         | (which should be a bare minimum), and use a chronological feed.
        
           | rpmisms wrote:
           | No. I'm not talking about the content itself, I'm talking
           | about the firehose of useless garbage, appropriate or not,
           | that's frying their dopamine receptors.
        
           | ryandvm wrote:
           | Great idea. We could also give them clove cigarettes until
           | they're old enough to smoke!
        
           | dartharva wrote:
           | NO. The base concept of Social Networks by itself is harmful.
           | It does not matter how much you "improve" those things.
        
       | yieldcrv wrote:
       | control behavior by _regulating the intermediary_
       | 
       | this is a strategy that works under any governance system on the
       | planet. so to actually make an enforceable law, don't try to
       | impose restrictions on the action, provider, or users of the
       | action; instead, think about the things they rely on and restrict
       | those.
       | 
       | this law doesn't do that. but it would be fun to think of things
       | that would. can we take something away from social media users?
       | can we incentivize non-social media users? maybe we can leverage
       | our disdain of data brokers into a partnership where data brokers
       | can't serve companies that have children's data
       | 
       | just spitballing, I don't actually care about this law or any of
       | those proposals, just noticing the current state of the
       | discussion lacks... _inspiration_.
        
       | tapoxi wrote:
       | Is Hacker News considered social media?
        
         | riffic wrote:
         | yes
        
         | tzs wrote:
         | For this bill, no. It does not meet conditions (d) and (e) of
         | the 5 conditions a site must meet in order to be a social media
         | platform. I've quoted the relevant part of the latest bill text
         | in this prior comment [1].
         | 
         | [1] https://news.ycombinator.com/item?id=39176380
        
       | dvngnt_ wrote:
       | i don't know why people are cheering for this?
       | 
       | the government shouldn't take the place of the parents.
       | 
       | I think the real reason is they don't like how tiktok is creating
       | activists out of the next generation of voters.
       | 
       | we should demand better from big tech but banning does nothing to
       | improve platforms.
       | 
       | so many adults here were on forums, 4chan, mmos, ventrilo growing
       | up, even Facebook, which was useful for keeping in touch as a kid.
        
         | causi wrote:
         | All those things were horrible for us. I can't imagine the
         | nightmare of combining that with the use of my real name. Like
         | yeah, I _want_ a free-for-all, but how bad does something have
         | to be before we say no way?
        
           | tstrimple wrote:
           | > All those things were horrible for us.
           | 
           | Except that they weren't. My life would actually be
           | substantially worse now without most of those apps /
           | websites. I was just another loser growing up in a rural
           | trailer park with no real prospects before I got interested
           | in programming and taught myself employable skills in those
           | online forums and chat apps. It's insane to me that anyone
           | could call themselves a part of a "hacker" community and
           | complain about kids having access to information and wanting
           | legislation to restrict it.
        
         | mminer237 wrote:
         | I agree it's the parents' responsibility, but I don't know that
         | the government can't prevent things it deems harmful just
         | because parents are okay with it. It also often makes things
         | easier for parents by requiring service providers to work with
         | parents and limit their sales to children.
         | 
         | I'm fine with the government enforcing curfews, smoking, and
         | drinking laws on children even if their parents disagree.
         | 
         | I don't think any of the things you listed, except probably
         | Facebook, would qualify as a social medium under this law. I'm
         | honestly very thankful more manipulative social media didn't
         | exist when I was a teen. My life probably would have been even
         | better if I hadn't had access to as many as I did.
        
       | joshuaheard wrote:
       | I support this. Normally, I think government intervention is bad
       | and parents should be in control. But, as a parent myself, it was
       | hard not to allow my child to have a phone when she kept saying,
       | "Everyone has one. I'll be the only one without a phone". No one
       | wants their child left out or left behind. This will remove that
       | rationale.
        
         | frumper wrote:
         | This wouldn't stop your kid from wanting a phone. They'll all
         | still want phones for games, chatting, cameras, videos, comment
         | sections, forums. Of course many parents won't care if their
         | kids sign up to social media sites through some workaround, so
         | the pressure will still be there to join social media sites.
        
         | tstrimple wrote:
         | Just to be clear, because parenting is hard you want to
         | legislate how other parents are able to raise their children so
         | that it's easier for you to get the behavior you want? I think
         | all parents should take this approach.
         | 
         | * Having trouble getting your children to go to church because
         | their friends don't? Let's just legislate mandatory church
         | attendance so that will remove the rationale for kids whose
         | friends don't attend!
         | 
         | * Having trouble getting your kids to eat healthy because all
         | their friends get to eat and drink whatever they want? Let's
         | outlaw sodas so kids won't have to feel peer pressure!
         | 
         | * You think your kid is playing too many video games? Why not
         | just pass legislation that restricts all video game usage so
         | your kid doesn't feel left out!
         | 
         | Telling your kids no is part of being a parent. Explaining to
         | your children why they aren't allowed to do some things that
         | other kids do is part of being a parent. It seems we have an
         | abundance of parents who don't want to actually be a parent and
         | would rather legislation was passed so they don't have to say
         | no to their kids.
        
           | joshuaheard wrote:
           | It's not my deficiencies as a parent that make me support
           | this law. It's the need for a reverse network effect. That's
           | how social media works. If everyone else has it, your kid
           | wants it. If no one else has it, your kid doesn't want it.
           | Social media has been found harmful to children, like smoking
           | or alcohol. For many reasons, it should be limited for
           | children.
        
       | TheCaptain4815 wrote:
       | I don't agree with any type of "outright ban". However, having
       | EXTREME restrictions on these social media sites for children
       | seems so obvious. I'd prefer a complete restriction on any
       | content outside of friend groups, restrictions on any algorithms,
       | etc.
        
       | bilsbie wrote:
       | I don't know where I saw this idea but instead of banning just
       | force these companies to make their feed algorithms open source.
       | 
       | It would be much less heavy-handed, and it would increase freedom
       | instead of decreasing it.
       | 
       | It would work because there would be enough outrage over seeing
       | nefarious topics being pushed that the companies would refrain.
        
         | mindslight wrote:
         | I agree with where you're coming from, but publicly documenting
         | their feed algorithms (which is what a call for "open source"
         | effectively is) wouldn't change much. What is actually needed
         | are open API access to the data models, so that competitive
         | non-user-hostile clients can flourish.
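         | 
         | For illustration only, a non-hostile third-party client under
         | that model could be this small. The endpoint and response shape
         | are entirely hypothetical, since no such mandated API exists
         | today:
         | 
         |     # Hypothetical client against an imagined open feed API:
         |     # strictly chronological, no recommendations, no ads.
         |     import requests
         | 
         |     API = "https://graph.example/v1"  # invented endpoint
         | 
         |     def chronological_feed(token, limit=50):
         |         resp = requests.get(
         |             f"{API}/feed",
         |             headers={"Authorization": f"Bearer {token}"},
         |             params={"order": "chronological", "limit": limit},
         |             timeout=10,
         |         )
         |         resp.raise_for_status()
         |         return resp.json()["posts"]
         | 
         |     for post in chronological_feed("user-token-here"):
         |         print(post["ts"], post["author"], post["text"][:80])
         | 
         | The point isn't this particular shape; it's that once the data
         | model is reachable outside the first-party app, the engagement-
         | optimized feed stops being the only way to read it.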
         | 
         | I believe this would be legally straightforward by regulating
         | based on the longstanding antitrust concept of _bundling_ -
         | just because someone chooses to use Facebook's data
         | hosting product, or their messaging products, does not mean
         | they should also be forced to use any of Facebook's proprietary
         | user interface products.
         | 
         | This would not solve the collective action problem where it's
         | hard for an individual parent(/kid) to individually reject
         | social media, but I also don't see this bill doing much besides
         | making it so that kids have to make sure their fake birthday is
         | three years earlier than the one they already had to make up
         | due to COPPA. Of course the politicians pushing this bill are
         | likely envisioning strict identity verification to stop that,
         | but such blanket totalitarianism should be soundly rejected by
         | all.
         | 
         | Unfortunately the larger problem here is that the digital
         | surveillance industry has been allowed to grow and fester for
         | decades with very few constraints. Now it's gotten so big and
         | its effects so pervasive that none of the general solutions to
         | reining it in (like, similarly, a US GDPR) are apparent to
         | politicians. It's all just lashing out at symptoms.
        
         | dartharva wrote:
         | Is the feed algorithm the only problem that is harming
         | children? Not the concept of a social network in general, the
         | entire point of which is to publicise lives and keep its users
         | glued to their screens for as much time as possible?
         | 
         | The entire fault of social networks is that they hamper
         | children's development by keeping them online. Trying to
         | improve those services by making the algorithms better will
         | _worsen_ the situation. You want social media to lose its
         | appeal for children, not increase it!
        
       | manicennui wrote:
       | Perhaps a better solution is to allow children on social media,
       | but drastically limit how companies are allowed to interact with
       | such accounts. Random ideas: no ads, no messages from people they
       | don't follow, limit ability to follow other accounts in some way.
        
         | djfdat wrote:
         | Can I have that as an adult please?
        
         | dartharva wrote:
         | Children are not so much in danger from falling for online
         | advertisements as from the overall detrimental effects
         | of social media in general. Social networks are _inherently_
         | bad for kids; they are addictive and directly harm children by
         | constantly dosing them with dumb entertainment and cutting off
         | their attention spans.
        
       | tamimio wrote:
       | This is idiotic; they will find a way to watch and be on social
       | media regardless. Also, I don't see how social media is bad but
       | MSM that brainwashed generations is any good. Are they going to
       | enforce the same rules on other forms of media? Or is it because
       | "we can't censor XY social media" that we are gonna ban them all?
        
       | mullingitover wrote:
       | Florida House essentially breaking out the classic: "We're from
       | the government and we're here to help."
        
       | HumblyTossed wrote:
       | > The social media platforms the bill would target include any
       | site that tracks user activity, allows children to upload content
       | or uses addictive features designed to cause compulsive use.
       | 
       | So, is ClassLink exempt? This seems pretty broad.
        
       | ecocentrik wrote:
       | This might be the first bill approved by the Florida legislature
       | in the last 8 years that I agree with. I like it in spite of all
       | the reasons these people voted for it. And I like it in spite of
       | the absolute horror show of an enforcement dilemma it's going to
       | impose on the residents of Florida. Will it force all Florida
       | residents including children to use VPNs to use the internet? Yes
       | it will.
        
       | arrosenberg wrote:
       | Social media exclusively uses a predatory pricing model and the
       | companies should be forced to stop subsidizing their products
       | with targeted ads. The algorithms drive max engagement because
       | that drives max impressions and CPCs. The algorithm doesn't care
       | that it's making people angry and divisive - it's optimizing
       | revenue!
       | 
       | All of the other evils stem from this core issue - Meta, et al.
       | make money off of their customers' misery. It should hardly
       | surprise anyone that children are affected much more strongly by
       | industrialized psychology.
        
       | brocklobsta wrote:
       | This is a very complex problem. It's not just social media, but
       | porn and other content that is not intended for young eyes.
       | 
       | One issue I have is with the age verification system. This will
       | either be totally ineffective or overly effective and invasive. I
       | feel legislation is drifting towards the latter with the
       | requirement of ID.
       | 
       | One idea I had is a managed DNS blacklist of inappropriate
       | content. The government could require that a website register
       | itself in this list to operate; otherwise it would be subject to
       | litigation for its content. At the same time, have ISPs and
       | network gear support this list in a one-click fashion. I have
       | multiple DNS blacklists I use at home. I know this may be a
       | little too technical for some parents and guardians, but that is
       | the world we are living in.
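       | 
       | The resolver-side check is the easy part. A toy sketch of the
       | kind of lookup those home blocklists do (the list format and
       | behavior here are illustrative, not anything the bill defines):
       | 
       |     # Toy sketch: refuse to resolve a name if it, or any parent
       |     # domain, appears in a hosts-style blocklist file.
       |     def load_blocklist(path):
       |         with open(path) as f:
       |             return {line.strip().lower() for line in f
       |                     if line.strip() and not line.startswith("#")}
       | 
       |     def is_blocked(name, blocklist):
       |         labels = name.lower().rstrip(".").split(".")
       |         return any(".".join(labels[i:]) in blocklist
       |                    for i in range(len(labels)))
       | 
       |     blocked = load_blocklist("blocklist.txt")  # hypothetical file
       |     print(is_blocked("cdn.adult-site.example", blocked))
       | 
       | The hard part is everything in the limitations below, plus who
       | curates the list in the first place.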
       | 
       | Limitations being:
       | 
       | Section 230 - user posts explicit content and the site isn't in
       | the blacklist.
       | 
       | Network scope - This blacklist will have to be added to all
       | networks accessed by children. What about public wifi? coffee
       | shops?
       | 
       | IDK, I love being able to be anonymous online, but I do see the
       | negative effects of social media, porn, and explicit content on
       | our youth. I don't really trust the government to solve this
       | effectively.
        
         | cheschire wrote:
         | What qualifies for the blacklist? It's a moral question. What
         | happens when the blacklist maintainer's morals differ from your
         | own? Sure, in the U.S. it seems fairly uniform that most people
         | do not want children having access to porn. But what about
         | women having access to information about abortion? Or
         | information about suicide? The use of drugs for psychological
         | conditions? Vaccination efficacy?
         | 
         | Really sucks when someone that controls the blacklist decides
         | you're on the moral fringes of society.
        
           | brocklobsta wrote:
            | Agreed, some things are easy to classify, like nude content,
            | but what about a website about war history, whose content is
            | simultaneously factual and explicit? This is why I say it's
            | up to the website owner to register for the blacklist. That
            | in itself is an incentive to reduce the website's surface of
            | liability.
           | 
           | Social media is a hard problem. What is exactly the issue? Is
           | it creating a larger social hierarchy than children can cope
           | with? Is it meeting and interacting with strangers? Is it
           | reinforcing dopaminergic pathways from superficial digital
           | content and approvals?
        
         | 2OEH8eoCRo0 wrote:
         | I think that online anonymity is overrated (and yes, I'm aware
         | of my username). Social media platforms ought to require
         | traceability and age verification.
        
         | kayodelycaon wrote:
         | I don't even want the blacklist idea floated.
         | 
         | In a few states, you'll imminently see any information about
         | LGBTQ+ people, including mental health resources, on that
         | blacklist. (This has already been the case for decades in
         | various school districts.)
         | 
         | And I'm not even trying to exaggerate. Ohio is working to
         | eliminate any transgender medical treatment from the state.
         | They've already succeeded in making it nearly impossible for
         | minors and now they are working on preventing adults from
         | receiving hormone treatments.
        
       | carabiner wrote:
       | It's crazy how in 2017, YC was proposing a social network for
       | kids as a startup idea:
       | 
       | > Social Network for Children: This is a super tough nut to
       | crack. You have to be cool and offer a safe and private
       | environment for kids to communicate with each other while
       | enabling trusted adults to peer in/interact, etc... The company
       | that can build something that is used and useful for all parties
       | can build something of great value throughout a person's entire
       | life. - Holly Liu, Kabam/Y Combinator
       | 
       | https://www.ycombinator.com/blog/13-startup-ideas
       | 
       | There was very little notion that a social network, no matter how
       | safe, was inherently detrimental to childhood development. Like
       | cigarettes initially, it just seemed to be mostly positive with
       | some niggling annoyances. I wonder what other current YC ideas
       | will be considered horrible 7 years from now.
        
       | deadbabe wrote:
       | I have no idea how this would be enforced but I agree with the
       | spirit of the law.
       | 
       | We either live in a world where children are hopelessly pressured
       | into joining social media early in life and suffering its
       | effects, or we ban them from it altogether and allow them to
       | have something that still looks like a childhood.
        
       | einpoklum wrote:
       | I'm willing to buy an argument that certain kinds of "social
       | media" have negative impact on kids under 16; but I'm absolutely
       | not willing to buy an argument for a world in which the
       | government is able to ban your communications with other people
       | because you're under 16.
        
       | nerdjon wrote:
       | I feel like I can almost guarantee that this bill has nothing to
       | do with protecting children and has more to do with brainwashing
       | children and restricting their access to opposing viewpoints,
       | especially given this is Florida.
       | 
       | That being said, I am not strictly opposed to a bill like this.
       | But 16 is way too old. I feel like somewhere in the 10-to-13
       | range would be fine, since most platforms don't allow under-13s
       | anyway. But then, if they all block under-13s, what is the point
       | of the bill?
        
         | mrangle wrote:
         | Restricting social media use is tantamount to brainwashing? I
         | don't see the connection.
         | 
         | As for restricting access to "opposing" (opposition to what?)
         | viewpoints, what children can be exposed to has long been
         | restricted.
         | 
         | But since there isn't a syllabus for what children will be
         | presented on social media, I don't see the viewpoint
         | restriction angle either.
         | 
         | In fact, that position is illogical to the point that it raises
         | the question of whether or not people concerned with it have an
         | agenda to expose kids to "viewpoints" that their parents would
         | disapprove of. Under the radar of supervision.
         | 
         | Going only on my experience with social media, a valid and more
         | plausible reason for this restriction would be that social
         | media seeks to optimize the feed of users for engagement. In a
         | manner that "hacks" psychology in a way that makes it difficult
         | for even adults to disengage. Given that minors do not have
         | fully developed brains, the ability to disengage may be even
         | more hindered.
         | 
         | Television programming has long sought this goal as well and
         | with some success. While that use isn't restricted, there is
         | theoretically a red line. Florida may see it in social media
         | use.
        
           | netsharc wrote:
           | How TikTok users trolled the Trump campaign:
           | https://www.cbsnews.com/news/trump-rally-tiktok-crowds-
           | tulsa...
           | 
           | Well, true, TikTok probably has more negatives than
           | positives, but I have a feeling the American Talibans[1] in
           | power don't like teens organizing, and where do they
           | organize? On social media...
           | 
           | [1] Yes this is an apt comparison. Suppression of opposing
           | viewpoints, growing voter suppression and not even accepting
           | results of democratic elections, and then the whole anti-
           | Abortion movement.
        
         | holoduke wrote:
         | It should be categorized in the same way as gambling. It's
         | addictive and useless in any form. The whole world would be
         | better off without any social media. Including the most
         | antisocial people.
        
           | nerdjon wrote:
           | I disagree, what social media has turned into thanks to
           | algorithms and engagement is a problem.
           | 
            | But in its purest form, social media isn't a bad thing; it's
            | a good way to actually keep in contact with friends, and a
            | good way to keep up on events happening around the world
            | without relying on the news for everything.
        
           | RoyalHenOil wrote:
           | Do you realize that this is a social media site?
        
         | dylan604 wrote:
         | > what is the point of the bill?
         | 
         | This is a valid question for pretty much all legislation. It
         | lets the congress critters toot their horns about doing
         | something for those who only pay attention to news bites, while
         | doing no harm by doing nothing.
        
           | nerdjon wrote:
            | I really hate how right you are. And that meaningless thing
            | will be all over ads, and/or "your opponent voted against
            | it" (since it was meaningless and shouldn't be on the books)
            | turns into an attack on them.
        
           | throwup238 wrote:
           | I feel like the Australian TV show Utopia should be mandatory
           | viewing for anyone who wants to understand government, even
           | though it is ostensibly a comedy.
        
       | geogra4 wrote:
       | I remember when I was 10 years old at a computer camp during the
       | summer at a local college. They had me set up my first email
       | account with hotmail. They all asked us to lie about our age. I
       | think even then they had restrictions that you had to be 13 years
       | old.
       | 
       | But - that was over 25 years ago. The internet was a much
       | different place.
        
         | CalRobert wrote:
         | Might be thinking of this?
         | https://en.wikipedia.org/wiki/Children's_Online_Privacy_Prot...
         | 
         | I remember joining eBay (well, AuctionWeb - aw.com/ebay, IIRC)
         | and it not even being an issue that I was around 14; we mostly
         | trusted each other and just mailed money orders around. A
         | different time.
        
       | nojvek wrote:
       | What if the social network is created by kids?
       | 
       | I guess I need to write to my representatives.
       | 
       | It's funny that the Republicans tout that they are the party of
       | "freedom", yet restrict liberties.
       | 
       | Why not let parents decide how they want to raise their kids?
        
       | soared wrote:
       | Morals etc. aside, the technical implementation looks like a good
       | use case for this -
       | 
       | https://developers.google.com/privacy-sandbox/protections/pr...
       | 
       | Private State Tokens enable trust in a user's authenticity to be
       | conveyed from one context to another, to help sites combat fraud
       | and distinguish bots from real humans--without passive tracking.
       | 
       | An issuer website can issue tokens to the web browser of a user
       | who shows that they're trustworthy, for example through continued
       | account usage, by completing a transaction, or by getting an
       | acceptable reCAPTCHA score. A redeemer website can confirm that a
       | user is not fake by checking if they have tokens from an issuer
       | the redeemer trusts, and then redeeming tokens as necessary.
       | Private State Tokens are encrypted, so it isn't possible to
       | identify an individual or connect trusted and untrusted instances
       | to discover user identity.
        
         | jvanderbot wrote:
         | Yep - one of those "it's possible but do we want this"
         | situations. Something feels a bit slimy about government-
         | approved browser tokens. Like,
         | 
         | "We're sorry, your revocation appeal is taking longer than
         | expected due to ongoing unrest in your area - please refrain
         | from using internet enabled services like ordering food,
         | texting friends, uploading livestream videos of police, giving
         | legal advice, finding directions to your employer - a nonprofit
         | legal representation service, or contacting high-volume
         | providers like the ACLU. Have a nice day!"
         | 
         | But it could just be "Please execute three pledges of
         | allegiance to unlock pornhub"
        
         | solarpunk wrote:
         | you've posted this in a few threads, but i don't think i
         | understand what scenario it would be used in?
         | 
         | every user of social media in florida now has to visit a third
         | party (who?) that sets a cookie (private state token?) on their
         | browser that verifies their age?
        
           | soared wrote:
           | Correct - ISP requires you to visit Florida.gov (or
           | realistically a company the government trusted to set up
           | verification) to set your token if you're an adult. Then each
           | social media site checks whether a visitor is from Florida,
           | and then if they have a valid token. If valid, load like
           | normal. If not valid, don't load the site.
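           | 
           | Server-side, that gate is tiny; the hard part is the token
           | scheme itself. A sketch of just the gate, where the two
           | helpers are hypothetical stand-ins for the GeoIP lookup and
           | the token redemption (not the real browser API):
           | 
           |     from flask import Flask, abort, request
           | 
           |     app = Flask(__name__)
           | 
           |     def in_florida(ip):
           |         raise NotImplementedError  # stand-in: GeoIP lookup
           | 
           |     def has_valid_token(req):
           |         raise NotImplementedError  # stand-in: token check
           | 
           |     @app.before_request
           |     def florida_age_gate():
           |         if not in_florida(request.remote_addr):
           |             return          # out of scope: load like normal
           |         if not has_valid_token(request):
           |             abort(403)      # no valid token: don't load
           | 
           | Whether this is acceptable comes down entirely to how
           | has_valid_token is implemented, which is what the issuer /
           | redeemer privacy argument further down is about.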
        
             | Tadpole9181 wrote:
             | And now the state of Florida has a receipt of every website
             | you ever visit. That will surely _never_ be an issue when
              | the Governor's private law enforcement arm looks through
             | it or the inevitable data leak happens.
        
               | soared wrote:
               | The intention of the API is for that to not be possible.
               | 
               | > The privacy of this API relies on that fact: the issuer
               | is unable to correlate its issuances on one site with
               | redemptions on another site. If the issuer gives out N
               | tokens each to M users, and later receives up to N*M
               | requests for redemption on various sites, the issuer
               | can't correlate those redemption requests to any user
               | identity (unless M = 1). It learns only aggregate
               | information about which sites users visit.
               | 
               | https://github.com/WICG/trust-token-api?tab=readme-ov-
               | file#p...
        
       | tempestn wrote:
       | 16 seems ridiculously old for such a ban. Especially if sites
       | like YouTube count as social media (and I can't imagine how they
       | wouldn't, given most of the content is identical across the
       | short-video platforms).
        
         | ChrisLTD wrote:
         | Why does 16 seem ridiculously old?
        
           | tempestn wrote:
           | Because 15 year-olds are the prime audience for a lot of
           | social media. And while much of it is a waste of time at best
           | or actively harmful at worst, a lot of it is also engaging or
           | even educational content that high school students should be
           | allowed to access. And will access regardless; it'll just be
           | a hassle for them or their parents to have to jump through
           | hoops to get to it.
        
       | jay_kyburz wrote:
       | A much better move in my local state: they banned all phones in
       | school up to year 10.
       | 
       | https://www.act.gov.au/our-canberra/latest-news/2023/decembe...
        
       | TriangleEdge wrote:
       | What's an enforcement mechanism that works for this? My concern
       | is that teenagers are going to learn yet another way to be
       | dishonest.
        
       | iosystem wrote:
       | The tech industry needs an association similar to the American
       | Medical Association that doctors join, where members can
       | collectively agree on ethics and guidelines that must be
       | followed. Any person in tech is behaving unethically if they
       | assist in implementing software to restrict children in Florida
       | from accessing information on the internet that their peers in
       | other states can access. Florida shows little concern for the
       | potential harm to children resulting from information
       | restrictions. Kids in abusive environments greatly benefit from
       | the social connections online communities provide, as well as the
       | diverse information and perspectives from other people. Florida
       | has created this bill as a means to censor content it deems
       | immoral, whether it be abortion information for girls,
       | understanding sexuality, the existence of trans kids, or any
       | other topic arbitrarily designated as immoral by the ruling
       | political party. It is disconcerting that the bill targets the
       | rights of children, who are the least likely to have the
       | resources needed to challenge something like this in court on
       | First Amendment grounds, which is what should happen.
        
         | tmpz22 wrote:
         | The AMA has many problems, but I'll just mention one: they
         | artificially restrict the number of graduates each year
         | _WHILE_ reporting nationwide shortages. It's not a silver
         | bullet for ethical behavior or efficient economics.
        
       | 0xbadcafebee wrote:
       | Great! Now can we ban it for those over 16?
        
       | spdustin wrote:
       | I truly don't understand how the party of small government and
       | parental rights can justify passing this.
       | 
       | Perhaps more relevant to HN, it seems to me that any solution
       | here would be a dramatic loss of agency, privacy, and anonymity.
        
       | bentt wrote:
       | I am here to report as a parent of children that I have been able
       | to keep them off of social media by not getting them phones,
       | controlling the Internet in the house, and paying close
       | attention.
        
         | munificent wrote:
         | How has it impacted their social life?
         | 
         | My experience with my kids (middle and high school age) is that
         | online is "where" most kids socialize today and if I don't let
         | my kids go there, I am socially isolating them.
        
       | redder23 wrote:
       | I am all for it, but the question is: how will this be
       | enforced? If they use this to require government ID on every
       | other website, crack down on semi-anonymous accounts, and go
       | full surveillance, I do not like it.
       | 
       | You might say "they do know who you are when you make any
       | account," and that might be true; but if I used a VPN all the
       | time and really didn't let any info slip, maybe not. I just do
       | not like the total deletion of privacy.
        
       | jokethrowaway wrote:
       | I don't like that it's mandated by law (ID checks on the
       | internet are an identity-fraud disaster waiting to happen), but
       | I'm copying this for my kids.
        
       | kmeisthax wrote:
       | My impression of these bills was that none of them had survived
       | contact with SCOTUS. What is going to make this bill any
       | different?
       | 
       | And why aren't we just passing a proper damned privacy law
       | already? All of the addictive features of social media are
       | specifically enabled by their ability to collect unlimited
       | amounts of data. Yes, I know, the NSA really likes that data
       | thank-you-very-much, but SCOTUS isn't going to be OK with "censor
       | children" to protect that data flow.
        
       | aktuel wrote:
       | I would ban it for everyone under 21. It is certainly not safer
       | than alcohol.
        
       | swozey wrote:
       | I wonder how this would affect things culturally. There is a
       | LOT more that will happen here than just keeping children off
       | of social media.
       | 
       | The USA exports its culture/pop/etc all over the world. I don't
       | follow teenage arts/music/etc sources, but a lot of musicians
       | start in middle school, put their mixtapes online, and get
       | known around their cities by using social media. Artists find
       | other artists, learn other styles, etc.
       | 
       | I got into programming through IRC as a kid; maybe TikTok is
       | the closest comparison nowadays, I don't know. I learned so
       | much through sites/apps where I could "upload content" and that
       | "track user activity."
       | 
       | So, what happens when every kid and teenager in the nation
       | that's the world's biggest culture exporter isn't getting their
       | culture out?
       | 
       | I can't believe this passed 106 to 13 with the "regardless of
       | parental approval" provision.
        
       | financypants wrote:
       | This is such a silly issue on a governmental level. Shouldn't the
       | parents, who spend more time around their children than the
       | Florida House of Representatives does, worry about and monitor
       | their own children?
        
         | it_citizen wrote:
         | Should we apply the same reasoning for banning kids from buying
         | alcohol or firearms then?
        
         | arijun wrote:
         | Off the top of my head, I can think of two reasons why it might
         | be preferable to have the government intervene.
         | 
         | 1) We are happy to have the government intervene in other
         | cases for the sake of children; I would be pretty upset at any
         | politician who espoused removing age restrictions on
         | cigarettes. I don't know that social media is as bad, but it
         | certainly has some of the same addictive properties.
         | 
         | 2) A collective-action argument: if all kids' social lives
         | currently revolve around social media, unilaterally
         | disallowing one child from using it could result in alienation
         | from their peer group, which might be worse for the kid than
         | social media itself. A government mandate would remove this
         | issue.
        
       | paxys wrote:
       | Is this a real bill or another one of those performative ones
       | that they know will get deemed unconstitutional by a court?
        
       | int0x21 wrote:
       | Ahh yes. When you can't parent, let the government do it.
        
       ___________________________________________________________________
       (page generated 2024-01-29 23:01 UTC)