Not only does it have problems with CSEM (the real-life stuff), but there are now bots spamming it, and Twitter has made reporting it a chore.
“Please provide more context”? WTF, it’s literally just CSEM plus a link I won’t click even if my life depends on it!
I’m quite sad since a lot of the creators I’m following are still only there, or on the boneless fediverse app BlueSky (which is worse in some ways), and I still need to keep it around just to protect my user handle there and to look things up from time to time.
Once I’m home from work, I’m locking my account and putting up a farewell message for whoever might miss me.
I’m not saying the fediverse is perfect, far from it (especially certain segments of Lemmy), but it’s a way better experience than whatever Xitter (or Reddit, for that matter) is trying to be. I even have more reach there, especially since the whole paid blue checkmark thing started.
What’s CSEM? Is that like CSAM?
Yeah, can we just agree to stop using unexplained acronyms? Even as a terminally online person, I struggle to keep up with the new ones that keep popping up daily, and it’s exhausting. Some time ago, I also had to look up what CSAM meant because suddenly everyone was saying it and it was critical to the context.
It makes some sense to me in that some media might not involve any actual abuse, e.g. images generated and shared publicly by underage teenagers without any coercion. I think most of us would still consider it exploitative for other people to share and view that media.
Might be “exploitation” instead of “abuse”.
I still don’t get why we stopped calling it CP and switched to CSAM or CSEM or whatever it is now.
Yeah same. I first remember hearing it when Apple was planning that amazingly invasive local scanning of user images. Now it seems to be everywhere.
I’m not against it though. CP could’ve described multiple things, and this one is a lot less mistakable once you know what it means. CP wasn’t particularly intuitive either; it was no easier to decipher, it’s just that after years of use many people knew it. So it’s an upgrade overall, I think.
Another benefit is that it includes “abuse” in the name. That’s important, and it ensures the people who seek that stuff out won’t borrow the term like they did with CP.
They stopped calling it CP because porn is typically consensual, and that’s what people think of when they hear the word; they’re not picturing a situation where one of the parties hasn’t consented, or legally can’t. So, to avoid any confusion or any chance of downplaying its severity, the term was changed from CP.
I’ve also seen CP used as a niche industry abbreviation for other things, which made me very uncomfortable at first, even though that usage is largely disused.
supposedly because the porn part makes it sound consensual
I would think the “child” part would rule that out.
Lol exactly
I don’t think that really makes sense because revenge porn is technically porn and it’s not consensual.
So I would disagree; calling something porn doesn’t imply consent, it merely describes sexually explicit material.
usually the porn part is consensual and only the sharing is not, but I see what you mean
CSAM is supposed to be more explicit that the images are essentially crime scene photographs, and to emphasize that it is Abuse first and foremost and not merely pornography.
CP is a morally neutral term, or at least the component words themselves are. CSAM is not; it is explicitly negative.
Hmm… I mean, I’m not challenging this explanation, but I’m just a little curious about this, I suppose? Starting from when I was like 13-14, I regularly sent and received nudes of other people my age I met on gay forums n shit. Y’know… sexting n stuff. Now I know that this could’ve gotten incredibly ugly had I been deanonymized n stuff. But I mean… I had fun at the time, and I’m still in contact (not that regularly tho) with some of these guys (and I’m an adult now).
I had fun at the time and was not coerced into anything by anyone. I was just a horny teen with an outlet, and so were they. How’s this abuse? Like, who’s the abuser? I’m sure it wasn’t us, since no one coerced anyone into doing anything.
That’s why CSEM is used, I guess: if the images are being shared around, exploitation has clearly occurred. I can see where you’re coming from, though.
What I will say is that there are some weird laws around it, and there have even been cases where kids have been convicted of producing child pornography… of themselves. It’s a bizarre situation. If anything, it seems like abuse of the court system at that point.
Luckily a lot of places have been patching the holes in their laws.
I think it’s partly because of the definition of porn: pedos were like “but what about erotica?”, and there was CSEM that used such clauses to get around bans, essentially by claiming it was artistic nudes.