• deweydecibel@lemmy.world · 1 year ago

      It’s software that also serves as a method to distribute and access it. But ultimately, it doesn’t matter; the resulting pushback will be the same.

      The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

      Wherever this shit exists, unwanted attention and scrutiny will follow, and the platform’s reputation will be harmed.

      • CookieJarObserver@sh.itjust.works · 1 year ago

        This will do literally nothing; the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work…

        And the big players will be thrown out by everyone else; we are here because we hate them.

        • Kayn@dormi.zone · 1 year ago

          the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work…

          How do you know that?

          • CookieJarObserver@sh.itjust.works · 1 year ago

            The study assumes:
            Mastodon = server host
            Mastodon = responsible for content
            Mastodon = able to moderate

            Reality:
            Mastodon ≠ server host
            Mastodon ≠ responsible for content
            Mastodon ≠ able to moderate

            • Kayn@dormi.zone · 1 year ago

              Where does the study suggest that the Mastodon software is able to moderate, or that it’s responsible for content?

                • Kayn@dormi.zone · 1 year ago

                  Did you read the actual study that this article refers to?

                  Going by your lack of further response, I’m going to assume you didn’t; otherwise you’d have noticed that you’re wrong. I recommend reading the sources of articles before commenting on them in the future.

      • recursive_recursion [they/them]@programming.dev · 1 year ago

        The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

        Not sure that would work, since users are fleeing those big players precisely because they don’t prioritize the safety and needs of their users.

        The underlying contradiction is that today’s major corporations prioritize money at all costs, even at the expense of their users, so their customer base flees to the next best service or product provider.

        People are currently abandoning Reddit and Twitter because their moderation systems either don’t work or run contrary to what users are asking for.

        Facebook launched Threads, and people initially joined only out of FOMO. Given how blatantly they harvest user data at the expense of people’s privacy, I think (and hope) people are starting to realize that this is probably not in their best interests.

        I think what we’re seeing is an evolutionary filtering of the web, similar to natural ecosystems, where the species best able to adapt are the ones that survive.

        Based on one metric, it seems that companies structured around proprietary software (zero-sum systems) are unsustainable. This is an untested observation on my part, however, so it could be true right now but prove systemically wrong once examined and tested.

        So the idea that

        biggest players should enter the fediverse in order to use their capabilities to scan and police it.

        doesn’t seem to hold up logically, as the foundations of those companies are untrustworthy and unsustainable.

  • deweydecibel@lemmy.world · 1 year ago

    And here we go.

    This will be one of the Fediverse’s biggest obstacles.

    This needs to be brought under control somehow, or else in a few years tech companies, banks, and regulators will decide that a crackdown on the fediverse as a whole is needed.

      • TheNotorious7113@lemmy.world · 1 year ago

        Would some sort of loosely organized group of instance admins help make this happen? Like a U.N. for the fediverse? It sounds like a structured communication system would fix this.

      • Kayn@dormi.zone · 1 year ago

        I’m afraid a blocklist won’t be enough. Since anyone can spin up a new instance, or move an existing one to a domain that isn’t on the blocklist yet, a centralized whitelist would be the safer solution.
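
        To illustrate the difference (a rough sketch, not any server’s actual code, and the domain names are made up): a blocklist fails open for domains nobody has seen before, while an allowlist fails closed.

        ```python
        # Hypothetical federation policy check -- not Mastodon's real implementation.
        BLOCKLIST = {"bad.example"}                          # known-bad domains
        ALLOWLIST = {"trusted.example", "friends.example"}   # explicitly vetted domains

        def federate_blocklist(domain: str) -> bool:
            # A brand-new domain is unknown, so it slips through by default.
            return domain not in BLOCKLIST

        def federate_allowlist(domain: str) -> bool:
            # A brand-new domain is rejected until an admin adds it.
            return domain in ALLOWLIST
        ```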

    • cerevant@lemmy.world · 1 year ago

      The fediverse is the name for services that use ActivityPub, a communication protocol. What you are saying is like saying “tech companies, banks, and regulators need to crack down on HTTP because there is CSAM on the web”.
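
      For a sense of what that protocol looks like on the wire, here’s a minimal sketch of the JSON an ActivityPub server sends when someone posts a note (the actor URL is made up for the example):

      ```python
      import json

      # A "Create" activity wrapping a "Note", per the ActivityStreams vocabulary.
      activity = {
          "@context": "https://www.w3.org/ns/activitystreams",
          "type": "Create",
          "actor": "https://example.social/users/alice",  # hypothetical account
          "to": ["https://www.w3.org/ns/activitystreams#Public"],
          "object": {
              "type": "Note",
              "attributedTo": "https://example.social/users/alice",
              "content": "Hello, fediverse!",
          },
      }

      print(json.dumps(activity, indent=2))
      ```

      Any server that can produce and consume messages like this can join the network, which is why “cracking down on the fediverse” makes about as much sense as cracking down on HTTP.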

    • weedazz@lemmy.world · 1 year ago

      A few years? I bet Threads is doing this right now to shut down every private instance and take the fediverse for themselves. They’ll argue they’re the only ones who can moderate the content, given their size and resources.

      • LinkOpensChest.wav@lemmy.one · 1 year ago

        I think they’re speaking from the point of view of an uneducated body of legislators and average people who will not understand this.

        It doesn’t matter what we know the nature of the fediverse to be; it matters how they perceive it, and uninformed people are perfect targets for this type of FUD.

        For example, the linked article exists.

  • mbelcher@kbin.social · 1 year ago

    Far-right instances Gab and TruthSocial are also technically Mastodon. By this metric, Mastodon also has a Nazi problem.

    Any software that allows people to communicate over the internet will be used by horrible people to do horrible things.

  • demonsword@lemmy.world · 1 year ago

    100% of all child abusers have drunk water in the last two days. Clearly water is the problem here.

  • Metal Zealot @lemmy.world · 1 year ago

    That’s like child molesters texting each other and then saying “TELUS AND BELL ARE PERPETUATING A CHILD SEX TRAFFICKING RING”.

  • whenigrowup356@lemmy.world · 1 year ago

    Shouldn’t it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?

    • ozymandias117@lemmy.world · 1 year ago

      Those databases are highly regulated, as they are themselves CSAM.

      Apple tried using fuzzy hashes downloaded to devices, and it wasn’t able to reliably identify things at all.
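
      For context, hash-based scanning generally boils down to something like the sketch below (using the third-party Pillow and imagehash libraries; the hash list is hypothetical, since the real databases aren’t public). The hard parts are exactly the ones above: getting access to the hashes at all, and picking a distance threshold that doesn’t misfire.

      ```python
      # Rough illustration of perceptual-hash matching -- not any platform's
      # actual pipeline. Requires: pip install Pillow imagehash
      from PIL import Image
      import imagehash

      # Hypothetical hex digests standing in for a regulated hash database.
      KNOWN_HASHES = [imagehash.hex_to_hash("d1d1b1a1c3c3e3f1")]

      MAX_DISTANCE = 5  # Hamming-distance threshold; tuning this is where false
                        # positives and misses come from.

      def matches_known_hash(path: str) -> bool:
          """Return True if the image's perceptual hash is near a known hash."""
          candidate = imagehash.phash(Image.open(path))
          return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
      ```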

  • woefkardoes@lemmy.world · 1 year ago

    Somewhere along the way we went from finding those who do wrong and punishing them to censoring everything because everyone is bad.

  • MyOpinion@lemm.ee · 1 year ago

    The Apache Foundation has a huge child sexual abuse problem. They must be policed by Microsoft. /s