This is my first try at creating a map of Lemmy. I based it on the overlap of commenters between communities.

I only used communities from the 35 most active instances over the past month, and limited comments to those made after August 1, 2024 (sometimes a shorter window if I got an invalid response).

I scaled it by the percentage of each commenter's comments that were made in a given community.
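
Roughly, the scaling step works like this (a simplified sketch rather than the exact repo code; PCA stands in here for the dimensionality reduction):

    # Sketch only: build a commenter x community matrix, scale rows to
    # percentages, then reduce the community columns to 2D map positions.
    from collections import Counter

    import numpy as np
    from sklearn.decomposition import PCA  # stand-in for the actual reduction

    def community_map(comments):
        """comments: iterable of (commenter, community) pairs from the crawler."""
        counts = Counter(comments)  # (commenter, community) -> comment count
        commenters = sorted({u for u, _ in counts})
        communities = sorted({c for _, c in counts})
        u_idx = {u: i for i, u in enumerate(commenters)}
        c_idx = {c: j for j, c in enumerate(communities)}

        m = np.zeros((len(commenters), len(communities)))
        for (u, c), n in counts.items():
            m[u_idx[u], c_idx[c]] = n

        # Scale each row to the percentage of that commenter's comments
        # made in each community, so prolific commenters don't dominate.
        m = m / m.sum(axis=1, keepdims=True)

        # Communities that share commenters in similar proportions end up
        # close together in the 2D layout.
        xy = PCA(n_components=2).fit_transform(m.T)
        return dict(zip(communities, xy))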

Here is the code for the crawler and data that was used to make the map:

https://codeberg.org/danterious/Lemmy_map

    • Danterious@lemmy.dbzer0.com (OP)
      Either the people in !steamdeck@lemmy.world are pretty horny, or it's an artifact of the dimensionality reduction and means nothing.

      Edit: Actually, it could also be that it just didn't collect enough data on that community and the most recent commenter was also active in NSFW communities. I was only able to get back about 14 days of data for lemmy.world. They produce way too many comments and I got kicked out early.
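
      A gentler crawl loop would look something like this (a rough sketch; the endpoint and parameters are a best guess at the Lemmy HTTP API, not the exact crawler code):

          import time

          import requests

          def fetch_comments(instance, community, max_pages=100):
              """Page through a community's comments, backing off when rate limited."""
              comments, page, delay = [], 1, 1.0
              while page <= max_pages:
                  r = requests.get(
                      f"https://{instance}/api/v3/comment/list",
                      params={"community_name": community, "sort": "New",
                              "page": page, "limit": 50},
                      timeout=30,
                  )
                  if r.status_code == 429:          # rate limited: wait and retry
                      time.sleep(delay)
                      delay = min(delay * 2, 60.0)  # exponential backoff, capped
                      continue
                  r.raise_for_status()
                  batch = r.json().get("comments", [])
                  if not batch:                     # no more comments to fetch
                      break
                  comments.extend(batch)
                  page += 1
                  delay = 1.0
                  time.sleep(1.0)                   # pause between pages to be polite
              return comments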

      Anti Commercial-AI license (CC BY-NC-SA 4.0)

    • cron@feddit.org

      This community has only two posts and a few comments, so the algorithm has very little information to go on for such tiny communities.

      It would probably be useful to only include communities with a minimum amount of interaction to avoid such outliers.
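
      Something like this, assuming comment counts per community have already been tallied (the threshold is just an example):

          def filter_small(counts, min_comments=50):
              """Drop communities with too little interaction before building
              the map; counts maps community name -> total comments collected."""
              return {c: n for c, n in counts.items() if n >= min_comments}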

  • Otter@lemmy.ca

    Would you be able to take a screenshot of the map and edit that in as the link URL? Nice thumbnails help a post be seen, and it might let people see the map when the site is getting a hug of death 😄

    Then just have the website link at the top of the post

    edit: It loaded for me, and I see why a screenshot wouldn’t make sense. There’s so much cool detail, thanks for sharing!

  • Spot@startrek.website

    This is really awesome! I saw your post the other day about it and thought it was a great idea. You work quick! I already found a new community I would not have thought to look for otherwise.

    It is hard for me to see and manipulate on mobile, but that’s totally on me. So I’ll be back in a bit. I’m sure someone smarter than me may have more helpful input than that if you are looking for feedback!

    So I thought I was gonna head to bed, but… guess I can stay up and peruse for a little while…

    Thank you!!

  • Fizz@lemmy.nz

    Pretty cool graph. It was funny to see the two lemmy.world porn communities in a sea of lemmynsfw. I was completely unaware lemmy.world hosted porn.

  • finitebanjo@lemmy.world

    So more dots means more total activity by that community's users across any community in the top 35?

    Wouldn’t a bar graph be sufficient?

  • cabbage@piefed.social

    Very cool!

    Do you have any idea how taxing scraping this data is for the servers?

    If this is something you want to keep working on, maybe it could be combined with a sort of Threadiverse fundraiser: we collectively gather funds to cover the cost of scraping (plus some for supporting the Threadiverse, ideally), and once we reach the target you release the map based on the newest data, and the money is distributed proportionally to the different instances.

    Maybe it’s a stupid idea, or maybe it would add too much pressure into the equation. But I think it could be fun! :)

    • Danterious@lemmy.dbzer0.com (OP)

      I had to try scraping the websites multiple times because of stupid bugs I put in the code beforehand, so I might have put more strain on the instances than I meant to. If I did this again it would hopefully be much less taxing on the servers.

      As for the cost of scraping, it actually isn't that bad; I just had it running in the background most of the time.

      Anti Commercial-AI license (CC BY-NC-SA 4.0)