So, You've Hit an Age Gate. What Now?

(eff.org)

184 points | by hn_acker 2 hours ago

29 comments

  • ryandrake 1 hour ago
    My kid recently quit playing Roblox because of the sketchy facial age check process. She said that she and all her friends know never to upload a picture of themselves to the Internet (good job, fellow Other Parents!!), so they're either moving on to other games or just downloading stock photos of people from the internet and uploading those (which apparently works).

    What a total joke. These companies need to stop normalizing the sharing of personal private photos. It's literally the opposite direction from good Internet hygiene, especially for kids!

    • btown 1 hour ago
      One aspect of this normalization of photo uploading: if a platform allows user-generated content that can splash a modal in front of kids, a bad actor can say "you need to re-verify or you'll lose all your in-game currency, go here" and collect photo identification without even needing to compromise an identity verification provider!

      I truly fear the harm that will be done before legislators realize what they’ve created. One only hopes that this prevents the EU and US from doing something similar.

      • kspacewalk2 51 minutes ago
        The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y? Is the harm done significant enough to warrant giving parents a technical solution for controlling which sites their under-X child signs up for, and one that actually works? Obviously pinky-swear "over 13?" checkboxes don't work, so this currently does not exist.

        You can work through robustness issues like the one you bring up (photo uploading may not be a good method). We can discuss privacy trade-offs like adults, without pretending this is the first time we've legitimately had to trade privacy against functionality or a societal need. Heck, you can come up with various methods where not much privacy needs trading off: something pseudonymous and/or cryptographic, or legislated OS-level device flags checked on signup and login (a rough sketch of that last idea is below).

        But it makes no sense to jump to the minutiae without addressing the fundamental question.
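
        As a concrete illustration of the "OS-level device flag" idea above, here is a minimal sketch, assuming (hypothetically) that the operating system exposed a parent-set "restricted profile" flag and that sites were required to honor it at signup. No such standardized API exists today; everything below is illustrative.

            # Hypothetical sketch only: a parent-set, OS-level flag checked at signup,
            # instead of any photo or ID upload. The API name is invented.
            def device_reports_minor() -> bool:
                """Stand-in for an OS/browser call exposing a parental-controls flag."""
                return False  # hard-coded here; a real scheme would query the OS

            def handle_signup(username: str) -> str:
                if device_reports_minor():
                    return f"{username}: signup blocked, device profile is flagged as a minor's"
                return f"{username}: signup allowed, no minor flag on this device profile"

            print(handle_signup("example_user"))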

        • array_key_first 36 minutes ago
          The real solution, IMO, is a second internet. Domain names would be whitelisted, not blacklisted, and you would have to submit an application to some governing body or something.
          • anon84873628 23 minutes ago
            I agree. There were attempts to do something like this with porn sites via the .xxx TLD I believe, but that inverts the problem. Don't force the public to go to a dark alley for their guilty pleasures. Instead, the sites that want to target kids need to be allowlisted. That is much more practical and palatable.
            • tracker1 17 minutes ago
              Yeah... the opposition was just a bad take IMO: "but it will create a virtual red light district," which is EXACTLY what you want online. Unlike in a physical city, you aren't going to accidentally take a wrong turn, and if you're blocking *.xxx it's even easier to avoid.

              Then require all nudity to be on a .edu, .art or .xxx, problem mostly solved.

              • MarsIronPI 5 minutes ago
                > Then require all nudity to be on a .edu, .art or .xxx, problem mostly solved.

                Who's doing the requiring here? Sounds like yet another path to censorship dystopia.

          • goopypoop 26 minutes ago
            sounds like an app store
        • anal_reactor 6 minutes ago
          It's never been about porn. By marking certain parts of the internet "adult-only" you imply that the rest is "family-friendly", and parents can feel less bad about leaving their children with iPads rather than actually parenting them, which is exactly what Big Tech wants, for obvious reasons. If I had a child I'd rather have it watch porn than Cocomelon, which has been scientifically engineered to turn your child's brain into seedless raspberry jam. Yet nobody's talking about the dangers of that, because everyone's occupied with <gasp> titties.
      • jofla_net 31 minutes ago
        I call this slipstreaming; it can even occur during signup. Once bouncing around between many domains and uploading photos is psychologically normalized, havoc can ensue. This is the greater evil.
      • pfraze 1 hour ago
        I’m sorry to say that a number of US states have instituted age verification laws over the past year
        • pixl97 53 minutes ago
          Aka, morality laws mostly.
      • ryandrake 9 minutes ago
        I'm optimistic, actually. I think "Gen Alpha" is gonna be alright and sufficiently wary of Internet sharing and privacy, unlike the previous few generations, esp. Millennials and, to a somewhat lesser extent, Gen Z and Boomers, who have massively over-shared and are now reaping some of the horrible harvest that comes from that oversharing. Today's teens and tweens seem to finally be getting the message.

        I also actually think AI might be a savior here. The ability to fake realistic 18+ year old selfies might help put the nail in the coffin of these idiotic "share a photo with the Internet" verification methods.

    • turblety 1 hour ago
      There seems to be a big movement (in the UK specifically) of governments using age gating as an excuse to increase surveillance and online tracking. I don't know where Roblox is based or what its policies are, but it's likely they are just implementing what the government has forced them to do.

      We need to push back against governments that try to restrict the freedom of the internet and educate them on better regulations. Why can't sites simply declare the kind of content they provide, and let device providers offer optional parental controls?

      Governments forcing companies to make you upload your passport/ID, or pictures/videos of your face, is dangerous, and we are going to see a huge increase in fraud and privacy breaches, all while our freedoms and rights online are reduced.

      • anon84873628 19 minutes ago
        IMO it should not be hard for large services like Roblox and Instagram to get together with device makers to come up with a sensible solution.

        When you create a new profile on Netflix you mark it as "kids" and voila. Devices should have kid profiles with lots of sane defaults, and parent profiles should have thorough monitoring and governance features that are dead simple to use.

        As always it's not perfect but it will go a long way. Just getting a majority of parents on sane defaults will help unknot the broader coordination problems.

    • irusensei 58 minutes ago
      I think the way Roblox is doing it right now, separating users into age groups, just makes it easier for predators to find victims.
    • kevmo 1 hour ago
      I was getting a haircut last week and chatting about our kids with the stylist, who said (basically): "I just started letting my 7 year old on Roblox. I know it's full of pedophiles. I told him to come to me or his older brother if anyone tries to talk to him."

      If the million reports of Mark Zuckerberg enabling pedophiles and scam artists haven't made it clear, the executives of these tech companies just don't care. They will sell children into sexual slavery if it improves next quarter's numbers.

    • next_xibalba 1 hour ago
      [flagged]
      • drnick1 43 minutes ago
        Age verification on mainstream porn sites does absolutely zilch against teenagers accessing porn. There are countless other ways of obtaining porn. Even DDG with the safety off will provide plenty of it.
      • pixl97 52 minutes ago
        >it might prevent that

        On the global internet... good luck with that.

        Oh, they'll ban us from looking at other countries' nets soon enough, for our safety.

      • Barrin92 50 minutes ago
        >and this seems like it might prevent that

        Sorry, but we're on the internet. You can type the literal words 'hardcore pornography' into any search engine of your choice and find about fifteen million bootleg porn sites hosted in some micro-nation that doesn't care about your age verification.

        In fact ironically, this will almost certainly drive people to websites that host anything.

      • polski-g 1 hour ago
        What evidence led you to believe this, when controlling for heritability?
        • gjsman-1000 59 minutes ago
          How about the fact that 38% of young women in the UK have experienced asphyxiation, combined with studies showing there is no safe threshold below which brain-damage markers don't show up in the blood?

          https://www.bbc.com/news/articles/c62zwy0nex0o

          https://www.theguardian.com/society/2025/nov/18/sexually-act...

          https://wecantconsenttothis.uk/blog/2020/12/21/the-horrifyin...

          https://www.nytimes.com/2024/04/12/opinion/choking-teen-sex-...

          https://www.psychologytoday.com/us/blog/consciously-creating...

          https://www.itleftnomarks.com.au/wp-content/uploads/2024/07/...

          Before the widespread adoption of pornography, this rate was near 0%. Now a significant minority of women have permanent brain damage induced by widespread pornography, with unknown long-term harms and studies already suggesting an increased risk of stroke decades afterwards.

          • john01dav 51 minutes ago
            > combined with studies showing there is zero safe threshold without brain damage markers in the blood?

            Are you saying that there's zero safe threshold of choking, or for viewing porn?

            (To be clear, choking someone without consent is assault and unacceptable, whether a blood test shows damage or not.)

            • gjsman-1000 50 minutes ago
              A. There is zero safe threshold for choking.

              B. Choking is inherently, obviously, dangerous.

              C. Pornography has caused choking behaviors among youth to go from negligible to over 38%.

              D. Brain damage is measurable in anyone who has been choked.

              E. As such, pornography does, in fact, bear blame for encouraging this kind of experimentation.

              F. If "fighting words" and "misinformation" shouldn't be free speech, who is to say pornography does not incite risk, when other things can?

              • terminalshort 3 minutes ago
                Can you explain point A? It seems fundamentally flawed unless there is also brain damage from breath holding, hiking at high altitude, and other normal activities that involve operating at lower oxygen levels.
              • array_key_first 32 minutes ago
                The argument I commonly hear of pornography causing more extreme sexual experimentation is a very weak one. I know, for sure, pornography did not cause me to be a homosexual.

                Kinks, BDSM, and what have you, have always existed and will continue to exist. The solution is teaching safe ways to participate, and the importance of consent. A desire to just wipe them out is naive, and will not work.

              • d1sxeyes 23 minutes ago
                I have a lot of concerns about your presentation of this.

                A. It’s also true that there is no safe level of alcohol consumption and yet we sort of see experimentation with alcohol as a rite of passage.

                B. I mean, so is walking out your front door. I don’t see this as adding much to point A.

                C. This is a big jump. First, we see more openness about sexual behaviour. While I’m prepared to agree that it has likely gone up, I would not be comfortable with the degree you imply. Second, while I do think it is likely that pornography has indeed contributed to this, pornography has also likely contributed to an increase in experimentation in general, with other sexual behaviours also likely seeing an increase (for example oral/anal sex, water play, etc).

                D. I find this very hard to accept at face value. Do you have studies/evidence to support this claim?

                E. Yes, I would likely agree, although whether “encourages sexual experimentation” is a bad thing or not is a question for further debate.

                F. This conflates some very weird things. “Fighting words” are a specific type of restricted speech (i.e. you can’t go round shouting “I’ll kill you”). Sharing misinformation is broadly not illegal (except in very specific sets of circumstances-fraud, inciting violence, etc.). It’s also broadly speaking not against the law to tell the truth. “Some people like to choke each other during sex” is a true statement, even if it’s harmful.

                Do you support a ban on porn altogether? That's quite a radical view.

              • stickfigure 27 minutes ago
                > Pornography has caused choking behaviors among youth to go from negligible to over 38%.

                That which is asserted without evidence, can be dismissed without evidence.

              • gruez 12 minutes ago
                >and "misinformation" shouldn't be free speech

                That worked so well during covid, right?

          • irusensei 53 minutes ago
            I'm trying to find the contact for the correlation-does-not-imply-causation dept but I think I lost my slashdot account in 2004.
            • gjsman-1000 53 minutes ago
              Nobody studying this issue, from the UK government to independent researchers to NGOs, makes that argument anymore. PornHub never uses it in legal filings either, instead focusing on rights to expression rather than disputing the claim.

              The causation is clear, documented, proven. Increased exposure to pornography featuring dangerous behaviors causes those behaviors to be repeated, even when participants are warned of the risk.

              At this point, denial is like saying flat earth has merit.

              • idiotsecant 39 minutes ago
                So what? The problem here would be if these activities are nonconsensual. I've seen no evidence of that. If you're just trying to thought police ideas that lead to people doing risky things you better drop your clutching pearls and pick up a pencil cause that's a long list, some of which are probably things you do.
                • irishcoffee 3 minutes ago
                  The internet in a nutshell: I’m right and if you don’t agree, you’re wrong. Facts need not apply.
          • stickfigure 44 minutes ago
            > Before the widespread adoption of pornography, this rate was near 0%

            Bullshit. Men and women were dying of autoerotic asphyxiation long before the internet. And we only hear about the ones who fuck up badly enough to make the news.

            I'm puzzled by this phenomenon myself, but there is apparently a significant minority of women who enjoy getting choked in bed:

            https://link.springer.com/article/10.1007/s13178-025-01247-9

            This doesn't excuse people who choke without consent, but there's something going on here waaaay more complex than "see it in porn, do it". Humans are weird.

            • gjsman-1000 42 minutes ago
              Nobody is saying that nobody did this before. We are saying now that it is a health crisis, objectively.

              You're the guy saying that 110 MPH speed limits can't be responsible for crashes because people also died at 20 MPH.

              • stickfigure 29 minutes ago
                You did in fact just say that nobody did it before - or very strongly implied it based on how sloppy you want to be with the phrase "near 0%".

                Stop pretending you know what that number is.

          • dangus 47 minutes ago
            > Before the widespread adoption of pornography, this rate was near 0%.

            Big giant citation needed on that one. How would it ever have been near 0%?

            First, I’d like to point out that we don’t make other media illegal or age gated with privacy-compromising tactics because it depicts harmful things. There’s no age verification gate for watching movies and TV that depict murder and other serious crimes. You can watch Gaston drink beer and fall to his death and the Beast bleed in a kids movie rated G.

            Watching NFL football, boxing, and UFC fighting isn't illegal even though those sports conclusively cause brain damage.

            Pornography is singled out because it’s taboo and for no other reason. People won’t politically defend it because nobody can publicly admit that they like watching it, even though most people consume it.

            Over 90% of men and over 60% of women consumed it in the last month. [1]

            Second, what I see missing from your links is a really solid, studied link between changes in pornography trends and viewership and an increase in choking injuries. Were these kinks just underreported in the past? Heck, I read four of your linked articles and none of them actually compared the rate of choking injury over time; they just sort of pointed it out as something that exists and jumped to blaming pornography.

            I am perfectly willing to accept your hypothesis but I don’t think we’ve been anywhere near scientific enough about evaluating it, and even if that was the case, we don’t really treat pornography the same as other media just like I mentioned.

            Thirdly, from a practical standpoint, the age gates re-implement parental controls that already exist. The system doesn't actually stop anyone smart enough to defeat the current parental control mechanisms. If you can turn off a parental control setting, you can enable a free VPN. So at the end of the day, all you get is an invasion of legitimate users' privacy.

            We need a lot more information. Personally, I think there’s nothing wrong with sexual pleasure and believe it’s stigmatized way too much. I also believe that normalizing sexual pleasure helps people talk about consent and avoids issues like doing a sexual act when you don’t enjoy it.

            [1] https://pubmed.ncbi.nlm.nih.gov/30358432/

  • cons0le 2 hours ago
    >If Google can guess your age, you may never even see an age verification screen. Your Google account is typically connected to your YouTube account, so if (like mine) your YouTube account is old enough to vote, you may not need to verify your Google account at all.

    This has been proven false a bunch of times, at least if the 1000s of people complaining online about it are to be believed. My google account is definitely old enough to vote, but I get the verification popup all the time on YouTube.

    I think the truth is, they just want your face. The financial incentive is to get as much data as possible so they can hand it to 3rd parties. I don't believe for a second that these social networks aren't selling both the data and the meta data.

    • xmprt 15 minutes ago
      I think the reality is a lot less nefarious. They don't want your face, but they also don't care enough to not take your face. Why would Google spend lobbying and legal money fighting this requirement when it doesn't hurt their bottom line? On the other hand, requirements like storing ID cards do hurt their bottom line, because it means:

      1. They need additional security measures to avoid leaking government documents (leaking face photos doesn't hurt them as much).

      2. Not every person has a valid government document.

      3. They need additional customer support staff to verify the age on documents, rather than just using some fuzzy machine learning model with "good enough" accuracy.

      The bottom line is that companies are lazy and will do the easiest thing to comply with regulations that don't hurt them.

    • AshamedCaptain 2 hours ago
      My Google account is more than 18 years old and I hit an age prompt when I was trying to watch some FPGA video (of all things). So no, account age is not necessarily a factor.
      • stonemetal12 1 hour ago
        They probably need to account for parents allowing kids to use their account, so account age can be a factor but not an automatic pass.
      • dlcarrier 1 hour ago
        Field programmable gatorade is an adult-only beverage.
      • inopinatus 1 hour ago
        That makes sense. Golf has a minimum age of 35.
        • pixl97 50 minutes ago
          Did you hear they are letting kids play pickleball these days! How scandalous.
      • RobotToaster 1 hour ago
        Can't allow any underage synthesis.
      • raverbashing 1 hour ago
        Yeah, they could/*should* infer your age just by the fact you're watching an FPGA video
        • bluGill 52 minutes ago
          I would have watched those at 10 if the internet had been a thing back then. I think most people here would have. (I may or may not have understood it, but I would have tried.)
    • blacksmith_tb 1 hour ago
      I agree they want the face data, but I think it's less clear they want to "hand it" (presumably that's really "sell it"?) to third parties. My sense is Google and Apple and Meta are amassing data for their own uses, but I haven't gotten the impression they're very interested in sharing it?
      • llbbdd 1 hour ago
        Sharing it is bad for business; selling insights derived from it for ad placement is the game. Faces definitely contain some useful information for that purpose.
      • testing22321 1 hour ago
        They’ll do whatever makes money.

        Sell it and use it internally.

      • 121789 1 hour ago
        You are correct. Having that data is one of their competitive advantages; it makes no sense to sell it. They will collect as much as possible and monetize it through better ads, but they don't sell it.
    • zahlman 1 hour ago
      I haven't gotten it yet on my account from 2006. Maybe it matters whether it's a brand account? Maybe it matters whether the accounts actually are connected?
      • mythrwy 1 hour ago
        Well, as long as it's you logging in, they know you are at minimum 20 years old!
    • jama211 1 hour ago
      They definitely already have your face though…
      • ambicapter 1 hour ago
        The more examples in various situations they can get, the higher their accuracy.
      • zahlman 1 hour ago
        From where? Not everyone even puts selfies on the Internet.
        • pixl97 48 minutes ago
          Honestly, it's probably already happening, but I would not be surprised if retail stores that check your ID also have cameras snapping your face and selling that to data brokers.

          Anything you can imagine that is bad for privacy, figure that what is actually occurring is far worse.

    • SilasX 1 hour ago
      I wrote an April Fool's parody in 2021 that Google is going to get rid of authentication because they're following you around enough to know who you are anyway (modeling it after their No Captcha announcement[1]):

      http://blog.tyrannyofthemouse.com/2021/04/leaked-google-init...

      Edit:

      >I think the truth is, they just want your face.

      I just realized the parody also predicted that part (emphasis added):

      >>In cases where our tracking cookies and other behavioral metrics can't confidently predict who someone is, we will prompt the user for additional information, increasing the number of security checkpoints to confirm who the user really is. For example, you might need to turn on your webcam or upload your operating system's recent logs to give a fuller picture.

      [1] https://security.googleblog.com/2014/12/are-you-robot-introd...

    • gosub100 1 hour ago
      I just got glasses yesterday and the optician needed to take a pic of my face to "make sure my glasses fit". The first thing I thought of was they are probably selling the data.
      • rolph 53 minutes ago
        Just say: no thank you, I will manage like everyone else has for decades.

        Otherwise, you and your money go elsewhere.

    • shevy-java 1 hour ago
      > I think the truth is, they just want your face.

      Agreed. They treat people as data points and cash cows. This is also one reason why I think Google needs to be disbanded completely. And the laws need to be returned to The People; right now Trump is just the ultimate Mr. Corporation guy. Lo and behold, ICE reminds us of a certain merc-like group in a world war (and remember what Mussolini said about fascism: "Fascism should more appropriately be called Corporatism because it is a merger of state and corporate power." Of course that was in Italian; I only know the English translation.)

  • dakiol 1 hour ago
    I’ve noticed that many people struggle to simply let things go. Take a hypothetical case where HN requires ID verification. I'd just stop using HN, even if that meant giving up checking tech news. Sometimes things end, and that's fine.

    I used to watch good soccer matches on public TV. When services like DAZN appeared, only one major match was available each weekend on public TV. Later, none were free to watch unless you subscribed to a private channel. I didn't want to do that, so I stopped watching soccer. Now I only follow big tournaments like the World cup, which still air on public TV (once every 4 years).

    Sometimes you just have to let things go

    • mystifyingpoi 50 minutes ago
      > I’ve noticed that many people struggle to simply let things go

      Because it's not always about their entertainment. I know churches that post info about events only in WhatsApp groups; if you don't use it, you're screwed. I know kindergartens that use Facebook Messenger groups to send announcements to the children's parents; if you don't use it, you will miss important info.

      For most people, letting go of such things is very impractical. One can try to push for a better way of doing something, but then you become the problem.

      • array_key_first 29 minutes ago
        People need to be more comfortable being the problem more often. Even if people actually use these solutions, they're almost always suboptimal anyway. We shouldn't be relying on them the way we do.
        • xmprt 12 minutes ago
          Or to flip it on its head, be the solution. If a church or some other activity is requiring Whatsapp, then come up with a better alternative that does more than Whatsapp ever could.
    • layer8 14 minutes ago
      Many people don’t struggle to let privacy go.
    • zackmorris 57 minutes ago
      Funny, I'm the opposite. Since information wants to be free, and storage/compute get more affordable every year, then really everything ever posted on the web should be mirrored somewhere, like Neocities.

      I grew up in the 80s when office software and desktop publishing were popular. Arguably MS Access, FileMaker and HyperCard were more advanced in some ways than anything today. There was a feeling of self-reliance before the internet that seems to have been lost. To me, there appears to be very little actual logic in most websites, apps and even games. They're all about surveillance capitalism now.

      Now that AI is here, I hope that hobbyists begin openly copying websites and apps. All of them. Use them as templates and to automate building integration tests. Whatever ranking algorithm that HN uses, or at least the part(s) they haven't disclosed, should be straightforward to reverse engineer from the data.

      That plants a little seed in the back of every oligopoly's psyche that ensh@ttification is no longer an option.

  • JoshTriplett 2 hours ago
    I'm surprised that the EFF does not highlight the best option, here: use a VPN to a jurisdiction that doesn't have such ridiculous laws.
    • kristjank 2 hours ago
      It might be a bad look for an activist group to advocate simply dodging the problem by hopping to a different jurisdiction.
      • paulddraper 1 hour ago
        They could sell it as "if your IP geolocation is inaccurate, or if the statute does not apply to you."

        But FWIW VPNs can get flagged for suspicious behavior. YMMV

    • hamdingers 1 hour ago
      "Give up" is not the best option. Certainly not from the EFF's perspective.
      • JoshTriplett 26 minutes ago
        I mean, the best option is to fight this legislation, and AIUI they're doing that too. But this article is not about that, it's about how to minimize the harm if you encounter it.
    • Retr0id 1 hour ago
      In many cases, using a VPN is a great way to get your account flagged as suspicious.
    • cedws 1 hour ago
      This technique's days are numbered. After enough countries enact their own age verification laws, tech companies will just make that the global default policy, and I'm sure the opportunity to harvest user data will not be left to waste. Many sites already block and throttle VPNs.

      When that day comes I'll stop casually using the internet or search for the underground alternative.

    • omoikane 1 hour ago
      I think EFF does not recommend for or against VPN in general because it's not always a clear win, depending on the VPN and the use case.

      https://ssd.eff.org/module/choosing-vpn-thats-right-you

    • SoftTalker 1 hour ago
      Next step: the same government that is demanding the age verification will ban VPNs.
      • JoshTriplett 1 hour ago
        Not especially feasible if you want to support businesses. More likely is trying to demand that VPNs also enforce age verification, which business-targeted VPNs might do, and then ban the ones that don't.
      • pc86 1 hour ago
        Everyone seems to forget that using VPNs to violate your local laws gives lots of good ammo to the authoritarians that want to ban VPNs. The answer isn't to use a VPN to get around it (and thus give fodder to your enemies) but to change the law.
        • luke727 1 hour ago
          While I agree with this in spirit, here in the UK both major parties along with the public at large generally support these types of laws.
          • JoshTriplett 24 minutes ago
            Two of the major parties support it, but it's not entirely obvious how much public support there is; it's not most people's top issue, and it's easy to make polls say what you want depending on the question you ask.

            You'd get different answers if, for instance, you ask "do you want to have to show ID or submit a picture of your face in order to access many sites on the Internet".

      • Jigsy 54 minutes ago
        I doubt this would be workable.

        Sadly, however, they could make it a crime to bypass things like the Online Safety Bill (downloading or using Tor, for example).

        At that point, the only sane option is to become a criminal.

  • marssaxman 2 hours ago
    I have never clicked "accept" on a cookie banner, as a matter of principle; I zap them away with uBlock Origin. Should the plague of age verification reach my jurisdiction, I'm sure I will handle it in like fashion.
    • RankingMember 1 hour ago
      Zapping only works if the site lets you continue/pull content without verification.
      • marssaxman 1 hour ago
        I expect I'll need to employ some other technical means of circumvention, but the principle of refusing to engage with the thing on its own terms will remain the same.
        • kube-system 1 hour ago
          These things are integrated into the authentication systems of these services. They aren't implemented client side. Refusing to engage with them means you cannot use the service.
          • BanAntiVaxxers 56 minutes ago
            Then it wasn't meant to be. Let it go.
            • pixl97 47 minutes ago
              Fun and games until your government makes getting access to the internet at all work that way.
            • RankingMember 45 minutes ago
              The problem there is when it's inescapable, on every site.
    • antonvs 1 hour ago
      The difference is that the cookie banner is not a gate. uBlock Origin is unlikely to be able to satisfy a website about your age without submitting the info that the site expects. (Assuming the age check has any teeth at all.) You're unlikely to be able to continue as usual if these kinds of measures become ubiquitous.
    • goopypoop 1 hour ago
      ignoring the banner is the same as agreeing to all the opt-out "legitimate interest" shit
  • firefoxd 1 hour ago
    My main concern is that there isn't a reliable way to know your information is securely stored[0].

    > A few years ago, I received a letter in the mail addressed to my then-toddler. It was from a company I had never heard of. Apparently, there had been a breach and some customer information had been stolen. They offered a year of credit monitoring and other services. I had to read through every single word in that barrage of text to find out that this was a subcontractor with the hospital where my kids were born. So my kid's information was stolen before he could talk. Interestingly, they didn't send any letter about his twin brother. I'm pretty sure his name was right there next to his brother's in the database.

    > Here was a company that I had no interaction with, that I had never done business with, that somehow managed to lose our private information to criminals. That's the problem with online identity. If I upload my ID online for verification, it has to go through the wires. Once it reaches someone else's server, I can never get it back, and I have no control over what they do with it.

    All those parties are copying and transferring your information, and it's only a matter of time before it leaks.

    [0]: https://idiallo.com/blog/your-id-online-and-offline

    • pixl97 42 minutes ago
      Honestly that main concern should be two main concerns.

      You/your kid/your wife goes to hàckernews.com and is prompted for age verification again; evidently, based on the message, the earlier verification has expired. So they submit their details. Oops, that was a typosquatted domain, and now who the hell knows who has your information. Good luck.

  • cloudfudge 1 hour ago
    This makes me wonder if there's a business case for a privacy-preserving identity service which does age verification. Say you have a strong identity provider that you have proven your age to. Just as the 3rd party site could use SSO login from your identity provider, perhaps the identity provider could provide signed evidence to the 3rd party site that asserts "I have verified that this person is age X" but not divulge their identity. Sidestep the privacy issue and just give the 3rd party site what they need to shield them from liability.
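
    For what it's worth, here's a minimal sketch of that "signed evidence" idea, assuming an identity provider that has already verified the user out of band and signs nothing but an over-18 claim. It's illustrative only (field names, expiry, and the use of the `cryptography` package's Ed25519 API are my assumptions), not any existing vendor's protocol:

        # Sketch: the IdP signs {"over_18": true, nonce, expiry}; the site verifies the
        # signature against the IdP's published public key and learns nothing else.
        # Requires the third-party `cryptography` package.
        import json, os, time
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        idp_key = Ed25519PrivateKey.generate()        # held by the identity provider
        idp_public_key = idp_key.public_key()         # published for relying sites

        def issue_age_token() -> dict:                # identity provider side
            claim = {
                "over_18": True,                      # the only fact disclosed
                "nonce": os.urandom(16).hex(),        # keeps tokens replay-resistant
                "expires": int(time.time()) + 300,    # short-lived
            }
            payload = json.dumps(claim, sort_keys=True).encode()
            return {"claim": claim, "signature": idp_key.sign(payload).hex()}

        def site_accepts(token: dict) -> bool:        # relying (3rd party) site side
            payload = json.dumps(token["claim"], sort_keys=True).encode()
            try:
                idp_public_key.verify(bytes.fromhex(token["signature"]), payload)
            except InvalidSignature:
                return False
            return token["claim"]["over_18"] and token["claim"]["expires"] > time.time()

        print(site_accepts(issue_age_token()))        # True: age proven, identity never shared
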
    • triceratops 31 minutes ago
      Yes. In fact the 3rd party doesn't even need to know who you are.

      https://news.ycombinator.com/item?id=46447282

      • cloudfudge 16 minutes ago
        That's quite an elaborate system. It goes through a lot of gyrations (not the least of which is inventing a whole new type of crime and passing laws about it) and doesn't sound even as strong as the age verification "required" to buy cigarettes in the US. I'd think "welcome to pornhub. Either log in or do Privacy-enhanced Age Verification by Auth0 (TM)" would be a lot easier to get off the ground.
    • MiddleEndian 27 minutes ago
      I'm more interested in a business that reliably provides fraudulent IDs to services that unnecessarily want IDs that I cannot avoid for some reason.
    • enahs-sf 54 minutes ago
      I’ve been noodling on this idea for a while, but I think getting commercial acceptance would be hard. People have tried it with crypto, albeit with lukewarm results. I think that to have the network effects required to be successful in such an endeavor, it would have to come from a vendor like Apple or Google, unfortunately.

      You kind of want an mTLS for the masses with a chain of trust that makes sense.

    • awkward 1 hour ago
      The article does go into this and gives lip service to the idea that a secure third party could attest to age without exposing identity. Ultimately, there's still the problem that even if the point of verification can be done in a zero-trust way, you are still entrusting very sensitive information to a third party, which is subject to data breach.
    • dakiol 1 hour ago
      The question is: why would services like Google and others want to use such privacy-preserving identity solutions? They wouldn't gain anything from a non-invasive, user-friendly system, so I don't think they'd use it. They want more data, so they are going for it.
      • cloudfudge 1 hour ago
        I was thinking someone like Auth0 might want to offer it. They are not in the business of invasive user tracking but are in the business of trust.
    • izacus 54 minutes ago
      This is how Swiss e-ID was proposed to work: https://www.eid.admin.ch/en
  • drnick1 33 minutes ago
    If this is about porn or other content deemed age-sensitive, the moment it becomes difficult to source through "official," mainstream platforms, the content will move underground (P2P networks), making it even more difficult to analyze and regulate. So this is a very shortsighted move.
  • bloppe 6 minutes ago
    Estonia basically got this completely right in 2002 with their e-ID. I'm kinda shocked nobody else has figured it out yet. Age verification could be simple, secure, robust, and require only the disclosure of your age, nothing more.

    Instead, the rest of us have systems that are both far more vulnerable to privacy breaches and far easier to circumvent anyway.

  • torcete 1 hour ago
    I thought the article was about finding a job when you reach a certain age, which is my problem.
  • numpad0 39 minutes ago
    Isn't age guesstimation by appearance, even with advanced machine learning techniques, even if attempted by a real person with honest effort, just total snake oil? This ongoing age verification push, with its weird emphasis on generating name-face pairs, is beyond fishy.
  • neilv 19 minutes ago
    > At some point, you may have been faced with the decision yourself: should I continue to use this service if I have to verify my age?

    An excellent question, which I didn't see the article really get into.

    > If you’re given the option of selecting a verification method and are deciding which to use, we recommend considering the following questions for each process allowed by each vendor:

    Their criteria imply a lot of understanding on the part of the user -- regarding how modern Web systems work, widespread industry practices and motivations, how 'privacy policies' are often exceeded and assurances are often not satisfied, how much "audits" should be trusted, etc.

    I'd like to see advice that starts by communicating that the information will almost certainly be leaked and abused, in n different ways, and goes from there.

    > But unless your threat model includes being specifically targeted by a state actor or Private ID, that’s unlikely to be something you need to worry about.

    For the US, this was better advice pre-2025, before the guy who did salutes from the capitol was also an AI bro who then went around hoovering up data from all over government. Followed by a new veritable army and camps being created for domestic action. Paired with a posture from the top that's calling harmless ordinary citizens "terrorists", and taking quite a lot of liberties with power.

    We'll see how that plays out, but giving the old threat model advice, without qualification, might be doing a disservice.

  • aleksandrm 47 minutes ago
    OpenAI uses AI to scan your ChatGPT conversations to determine your age. And even though I've been using ChatGPT for mostly work-related stuff, it has identified me, a man in my 40s, as under 18 and demanded government ID to prove my age. No thank you.
  • izzydata 1 hour ago
    If my options are to upload a picture of myself for Google to monetize through ads, or to not use Google/YouTube, then I will be moving on, regardless of the inconvenience to myself.
  • tracker1 20 minutes ago
    I'm honestly a bit mixed on this... I don't think that (especially young) children should have access to explicit, graphic sexual content, especially kink. If you as a parent want your kids to have access, so be it... but then the onus should be on the parent.

    On similar lines, I think that something between an unrestricted smart phone and the classic dumb phone is a market segment that is needed.

  • Retr0id 2 hours ago
    There were some amusing headlines a while back about Discord's verification being fooled with game screenshots. Does anyone know if that's still the case?
    • everyday7732 1 hour ago
      saw a recent screenshot of someone doing it yesterday, so I think it still is a thing.
  • shevy-java 1 hour ago
    States need to stop sniffing for age really. This is age discrimination.
    • kube-system 1 hour ago
      Basically every government on the planet has laws that apply specifically to children. The term "age discrimination" typically refers to disadvantaging someone for being of old age.
  • drnick1 1 hour ago
    Switch VPN region or upload a random picture generated by AI, problem solved.
  • dlcarrier 1 hour ago
    How well does the selfie test detect AI-generated photos? That seems easy to bypass, especially if you copy the metadata over from a real photo.
    • kube-system 1 hour ago
      The ones I have used do not accept photos, they require real-time video with the front-facing camera and they prompt you to move your head to face different directions on command. Not impossible to attack, I'm certain, but it's tougher than simply uploading a photo.
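
      For illustration, a toy sketch of that challenge-response idea, with pose estimation abstracted away as already-extracted labels. This isn't any vendor's actual system, just the general shape of it:

          # Toy sketch: the server issues a random, unpredictable pose sequence so a
          # pre-recorded clip or static image can't satisfy it, then checks the response.
          import secrets

          POSES = ["left", "right", "up", "down"]

          def issue_challenge(length: int = 4) -> list:
              return [secrets.choice(POSES) for _ in range(length)]

          def verify_response(challenge: list, observed: list) -> bool:
              # "observed" would come from per-frame head-pose estimation on live video.
              return observed == challenge

          c = issue_challenge()
          print(c, verify_response(c, c))  # a matching live response passes
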
      • pzo 52 minutes ago
        On desktops you can have a virtual camera; if you can generate video fast enough with AI, you can ask it to edit the feed according to the instructions. Definitely tougher, but I'm sure someone will offer services or software for that.
  • irusensei 2 hours ago
    Face scan: download and install Garry's Mod.
  • deadbabe 21 minutes ago
    It is very easy to lie your way through age gates. I have yet to find one that is actually able to get strong proof of age; fake IDs are easy to upload.
  • miki123211 1 hour ago
    > Even though there’s no way to implement mandated age gates in a way that fully protects speech and privacy rights

    I think the EFF would have more success spreading their message if they didn't outright lie in their blog posts. While cryptographic digital ID schemes have their problems (which they address below), they do fully protect privacy rights. So do extremely simple systems like selling age-verification scratchcards in grocery stores, with the same age restrictions as cigarettes or alcohol.

    • autoexec 32 minutes ago
      > So do extremely simple systems like selling age-verification scratchcards in grocery stores

      Which stores sell age-verification scratchcards? How do you make sure they can't be traced back to the person who paid for them or where they were purchased from? How would a website know the person using the card is the same person who paid for them? It may be a simple system, but it still sounds ineffective, dangerous, and unnecessary.

      • triceratops 28 minutes ago
        > Which stores sell age-verification scratchcards?

        Stores that sell other age-restricted products.

        > How do you make sure they can't be traced back to the person who paid for them

        How would they be traced? Pay cash. I've never had my ID scanned or recorded when I buy alcohol. And now I look old enough that I don't even have to show ID.

        If someone can trace the store they're bought from and you're that paranoid, rotate between stores. Buy them from a third-party. Drive to another state and buy them there. So many options.

        > How would a website know the person using the card is the same person who paid for them?

        They don't. How does Philip Morris know the person who bought the cigarettes is the same person lighting up? It's clearly not that important when selling actual poisons so why would it matter for accessing a website? The system works well enough to keep most kids from smoking.

          Rate-limit sales in a store (one per visit) and outlaw selling or transferring them to a minor (same penalties as giving alcohol or tobacco to a child). Require websites to implement one-code-per-account policies with a code TTL of six months or a year, and identify and disallow account sharing. It's Good Enough verification with nearly perfect anonymity.
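
          A rough sketch of how the website side of that could look, assuming the scheme above (opaque codes sold at retail, one code bound to one account, a fixed TTL). The code format and TTL here are made up for illustration:

              # Sketch: the site sees only an opaque code, never an identity.
              import time

              CODE_TTL_SECONDS = 180 * 24 * 3600   # ~6 months, per the suggestion above

              # code -> {"activated_at": timestamp or None, "account": account_id or None}
              valid_codes = {"SCRATCH-9F3A-77Q2": {"activated_at": None, "account": None}}

              def redeem(code: str, account_id: str) -> str:
                  entry = valid_codes.get(code)
                  if entry is None:
                      return "rejected: unknown code"
                  now = time.time()
                  if entry["account"] is None:          # first use: bind code to this account
                      entry["account"] = account_id
                      entry["activated_at"] = now
                  elif entry["account"] != account_id:  # one code per account
                      return "rejected: code already bound to another account"
                  if now - entry["activated_at"] > CODE_TTL_SECONDS:
                      return "rejected: code expired, a fresh card is needed"
                  return "accepted: account marked as age-verified"

              print(redeem("SCRATCH-9F3A-77Q2", "acct_123"))  # accepted
              print(redeem("SCRATCH-9F3A-77Q2", "acct_456"))  # rejected: bound elsewhere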

        • autoexec 5 minutes ago
          > Stores that sell other age-restricted products.

          So far, I've never seen an age verification scratch card sold anywhere

          > How would they be traced?

            Your ID is collected at retail and its barcode scanned along with a barcode on the card; your personal data and the card ID get uploaded to a server operated by the entity that created the cards and/or the state. The ID barcode scan could be replaced by, or used alongside, facial recognition, pulling data from your cell phone, your credit card info, etc. Even just being able to link a used card back to the time/place it was purchased could be enough to ID someone and put them at risk.

          > It's clearly not important when selling actual poisons so why would it matter for social media?

          The main difference is that I can't upload 1 million cigarettes to the internet for anyone of any age to anonymously download and use, but I could upload a spreadsheet of 1 million unredeemed scratch off codes to the internet for anyone to use. It seems highly likely that codes would get sold, shared online, generated, or leaked which means cards would be ineffective at keeping children from using them.

          Why should we be okay with jumping through a bunch of hoops that don't even do what they're supposed to in the first place while costing us money and opening ourselves up to new risks in the process? I reject the premise that proving my identity to a website is necessary let alone being worth the costs/risks. Scratch cards seem likely to fail at being private or effective. Of course, "Think of the children" is really only the excuse. Surveillance and control is the real motivation and any system that doesn't meet that goal is doomed to be replaced by one that does.

  • jmclnx 1 hour ago
    >should I continue to use this service if I have to verify my age?

    Simple answer: never accept this. If everyone selected "cancel" you can be sure these sites would stop this age gating; they want $ more than anything else.

    If a site asks me even one question about me, I stop using it.

  • jimbob45 2 hours ago
    Is there a throwaway identity that people are using? A dead person unchecked in Mississippi somewhere? Like every teen in America using the same identity like everyone's extended family does with their uncle's Netflix account?

    I don't want to google it because I don't want to be put on a list but I also feel somewhat confident that this is being done. Apparently, HN feels safe to ask questions like that for me.

    • glitcher 1 hour ago
      > I don't want to google it because I don't want to be put on a list

      Of all the controversial things out there we've become afraid to even google in order to learn more about the world around us, this one strikes me as not all that controversial.

      But you're not wrong, just making a comment about how sad the world has become.

    • bee_rider 1 hour ago
      That’s an interesting question.

      Actually, a follow up. PII leaks are so common, I guess there must be millions of identities out there up for grabs. This makes me wonder: we’ve got various jurisdictions where sites are legally required to verify the age of users. And everybody (including the people running these sites) knows that tons of identities are out there on the internet waiting to be used.

      How does a site do due diligence in this context? I guess just asking for a scan of somebody’s easily fabricated ID shouldn’t be sufficient legal cover…

      • kube-system 1 hour ago
        These ID laws typically require a solution to be "commercially practical" or similar. The standard is not "impenetrable and impossible to circumvent"

        That's why some of them don't even ask for ID but just guess the age based on appearance. That's good enough per the law, usually.

    • everyday7732 1 hour ago
      It would probably flag that multiple people are using the same photo or the same person's name/ID, but I expect you could get away with using the identity of someone known to you. IIRC the reason people are using game screenshots is that they're not going to match any image the recogniser has seen before. Use Tor for the things you don't want to google and have associated with you.
    • acka 1 hour ago
      Netflix has been checking accounts against public IP addresses and local networks for ages, at least in The Netherlands. If I use my Dad's account, I get flagged as being "not on the same home network" immediately. I think using a VPN and having Netflix detect it would only make matters worse, like termination of service.
      • reincarnate0x14 1 hour ago
        I gave up on netflix years ago for unrelated reasons but never had any sort of issue both VPNing between various countries and traveling between them. My wife would pretty regularly want to watch netflix as if she was in Japan or the UK and so we'd turn a VPN on for the TV network and their own TV app never complained at all that it was suddenly on a different continent.
    • shiandow 1 hour ago
      Last time I tried I could find a photo ID just with a basic image search. It is an unavoidable consequence of teaching people that scanning an ID is not utterly insane.

      Ironically there was no way to report the image anonymously to the service hosting it.

  • AndyMcConachie 1 hour ago
    Why can't the EFF tell people to lie? If you can get away with it, lying is almost always your best option, unless there are actual real-world consequences to lying, like angering the police.

    And maybe consider using a VPN.

    • kube-system 1 hour ago
      I'd imagine it is because several of the obvious options for "lying" here may violate criminal law. And also because the EFF is a civil liberties advocacy group; they want to change the law, not circumvent it.
    • HotGarbage 1 hour ago
      For real. This should be an article about circumvention, not compliance.
      • nottorp 1 hour ago
        That's not the EFF's job; just ask your kids how they circumvent age gates :)
  • maximgeorge 1 hour ago
    [dead]
  • iLoveOncall 1 hour ago
    What a piss poor article.

    "We disagree with age gates but our recommendation is to comply". Fuck this.

  • mlinster 1 hour ago
    I think that age verification is important. While it's not perfect, it is one tool to help protect kids.
    • MiddleEndian 26 minutes ago
      I would say that normalizing giving random websites photos of yourself is harmful to children.
    • unglaublich 1 hour ago
      Against what? How much struggle and pain are we actually seeing in the world because children have unrestricted internet access?
    • t-3 40 minutes ago
      Think back to when you were a child. Did age verification ever stop you from doing anything? The automated, technologically-implemented age-verification is even less interested in properly verifying anything than the ID-checking bouncers at a bar. None of these things protect kids, they just annoy them and teach them that authority is stupid and lying is a convenient way to deal with stupid people.
    • anthk 1 hour ago
      Call your ISP and block any NSFW/NSFL access via DNS, both on your children's phones and on your home connection. Problem solved.
      • drnick1 53 minutes ago
        This does not work; browsers like Firefox don't even always use the system DNS by default.
        • pixl97 39 minutes ago
          Ah, blocking porn from your devices does not work. But age gating porn in your country somehow fixes the fucking global internet....

          Please explain that to me.

          I'm sorry for getting a little steamed here, but I have to wonder if you've put any thought into what you're asking for in the name of kids' safety. And worse, if you think it will work globally, what are you going to do when Saudi Arabia wants anything they don't like banned in the US, for example?