• 1 Post
  • 28 Comments
Joined 3 days ago
Cake day: March 18th, 2026



  • I didn’t understand your disagreement. Yes, just as a bar shouldn’t be responsible for a person who gets plastered drunk after they leave, Facebook shouldn’t be responsible for the actions of a predator who goes to a porn website to lure kids. Just as the Catholic Church shouldn’t be responsible for a public school teacher who rapes her students at school. The only time any of these organizations is responsible is when the abuse happens while using their services.

    I don’t get why this is controversial.

    I can’t speak for the military’s recruiting practices. Yes, I fully agree that the military’s recruitment practices are very predatory and should be reined in. Politically, I personally think “enlistment” shouldn’t be an option at all. It should be a random draft. Every year the military should tell Congress how many new recruits they need, and Congress should approve a draft of 18-year-olds for that many new recruits. The draft should be random, with no deferments or other ways out of service other than health reasons as determined by a military physician. (But that’s way off topic.)


  • The problem the predators would have if they are relegated to the “kid friendly” sectors is that those sectors are much better policed by users and the corporations.

    It’s not really the public content that is the problem; the problems really come when a predator can lure a child into a private chat. That’s when the predator can start the process of grooming that eventually leads to blackmailing the child (grooming is a process, and it’s damn evil and damn sinister). By relegating the users to “kid friendly” areas, the opportunity to pull kids into private spaces is greatly diminished.

    Now, will the predators stop being predators? No. But if the platforms have strong child protection policies that make it more difficult for the predators, then they will move on to a website that has weaker policies. Which is just about the best an organization or platform can do: make the predators uncomfortable enough that they go hunt someone else’s kids.


  • Correct. Right now the OS maker is not responsible. That’s exactly why Meta is pushing so hard to change the laws to make the OS makers responsible.

    Your analogy is a good one. In your car analogy, today, no one blames the car manufacturer for a drunk driver, but we do blame bars and bartenders. In many states, bars have to be licensed, and if the bartender allows someone to get drunk and drive home, the bar and the bartender can be held liable. This situation would be like bars getting together to lobby state and national governments to make car manufacturers install breathalyzers in every car, so that the bars could reduce their own liability and responsibility.


  • It reeks of a coordinated agenda,

    It is a coordinated agenda, just not a secret one like people want to think. It’s being pushed by Meta and a string of popular app and game makers to avoid having to be responsible for their own platforms.

    Therefore, some Fediverse instances may end up implementing age checking, or stopping altogether if they can’t afford the additional costs of age checking.

    That’s a strange argument to me. That’s exactly what Meta is pushing these laws to avoid having to do. If countries and states pass laws like the California law specifically, then no fediverse instance will need to worry about age verification. They just ask the user’s browser, which asks the OS. California’s version of the law would really help small businesses and small developers, because it puts all the child protection responsibility onto the OS.
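    From a website’s side, the ask-the-browser-which-asks-the-OS flow could look roughly like this. A minimal sketch in Python: the header name `Sec-Age-Signal` and its values are hypothetical placeholders invented for illustration (no such standard header exists, and the California proposal doesn’t specify one); the point is only that the site reads a signal relayed from the OS rather than verifying age itself.

    ```python
    # Hypothetical sketch: the website never verifies age itself; it just
    # reads an age signal the browser relays from the OS. The header name
    # "Sec-Age-Signal" and its values are assumptions for illustration.

    def is_minor(headers: dict) -> bool:
        """Return True if the OS-relayed signal reports a minor.

        In this sketch, a missing or unrecognized signal is treated as
        not-a-minor; a real scheme would have to pick a default too.
        """
        signal = headers.get("Sec-Age-Signal", "").lower()
        return signal in ("under-13", "13-17")

    def serve(headers: dict, content_rating: str) -> str:
        # The site only decides what it is willing to show to minors;
        # the age determination itself stays with the OS and browser.
        if content_rating == "adult" and is_minor(headers):
            return "403: age-restricted"
        return "200: content"
    ```

    The design point this illustrates is the liability shift: the instance operator writes a few lines of gating logic, while the hard problem of actually determining a user’s age lands on the OS maker.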

    Now, regarding the “kid friendly” limitation: if the Web gets limited to “non-adult content”… what’s “adult content” to begin with?

    In this case, “kid friendly content” becomes “any content that the website is willing to be responsible and liable for letting users who report being <18 access.”