He / They

  • 9 Posts
  • 862 Comments
Joined 3 years ago
Cake day: June 16th, 2023



  • I’m not sure if you’re being sarcastic, but given that this is an anti-firearms bill, they will probably do the same thing they do when you purchase a firearm magazine cross-state: they’ll open the box and check that it is ‘compliant’ with the 10-round limit (or in this case, has compliant firmware). If it is, they’ll ship it on to you. If it’s not, they’ll ship you the empty box with a notice of seizure. You may also be contacted by CADOJ later, depending on how much free time they have.

  • The real means to prevent this is unionizing, which is really the answer to most other techbro-hellscape problems too. Just like Hollywood is putting anti-AI clauses in its contracts, so too will tech workers need to. Unfortunately, given that the end goal is to remove the IT workers entirely, this is still only a delay if companies push ahead, since, just like with scabs, there will always be people willing to sell their fellow workers down the river for their own enrichment.

    But we’re not even close to that point; most tech workers think unionizing is a four-letter word. There’s always a private chat room where folks are lamenting the absolute class-ignorance of their coworkers, who are all convinced they’re going to stumble into unicorn stock options soon, despite multiple rounds of layoffs each year now being standard in tech.

    The real question is what has to happen to end this horrible capitalist nightmare in general.



  • No, this distinction prevents publishers from co-opting “indie” as a label (one people support precisely because of that artistic discretion) and hiding behind opaque, unverifiable promises of such independence. You cannot trust that a dev hasn’t been influenced by a publisher when one is present, so the only way to ensure independence is to not have a publisher present at all.

    I don’t know that movie, but I do know actual indie devs who use e.g. Patreon for funding. It’s not about not having money, it’s about who your money comes from, and whether there can be hidden stipulations on it. With publishers, there always are.


  • immoral people existing is not the problem here

    True. The profit motive is. People pushing harmful content are doing it because it makes them money, not because they’re twirling their moustaches as they relish their evil deeds. Remove the profit motive and you remove the motivation to harm people for it.

    the difference is that there isn’t an algorithm that acts as a vector for harmful bullshit

    The algorithms boost content according to 1) what people engage with, and 2) what companies assess to be appealing. Facebook took the lead in having the social media platform own the engagement algorithms, but the companies and people pushing the content can and do also have their own algorithmic targeting. Just as Joe Camel existed before social media and still got to kids (and not just on TV), harmful actors will find and join Discords. All that Facebook and Twitter did was handle the targeting for them; it’s not like the targeting doesn’t exist without the platforms’ assistance.

    Said bad actors do not exist in anywhere near the same capacity. Imo the harm of public chat rooms falls under the “parents can handle this” umbrella. Public rooms are still an issue, but from experience being a tween/teen on those platforms, it’s not even close to being as bad.

    It wasn’t as bad on those… back when we were teens. It absolutely is now. If anything, you’ll usually find that a lot of the most harmful groups (red-pill/manosphere influencers, and body-image influencers, especially those centered on inducing EDs) actually operate their own Discords that they steer/capture kids into. They make contact elsewhere, then get them into a more insular space where they can be more extreme and forceful in pushing their products, out of public view.

    If it was the case that it was just individual actors on the platform causing the harm and not the structure of the platforms incentivizing said harm, then we would see more of this type of thing in real life as well.

    I’m not saying it’s all individuals; I’m saying the opposite: it’s companies. Just not social media companies. Social media companies are the convenient access vector for the companies actually selling and pushing the harmful products, and the corollary ideas that drive kids to them.

    I struggle to think of a more complete solution to the harm caused by social media to children than just banning them.

    Given that your immediate solution was to regulate kids instead of regulating companies, I don’t think you’re going to be interested in my solutions.