- cross-posted to:
- fediverse@zerobytes.monster
An in-depth report reveals an ugly truth about isolated, unmoderated parts of the Fediverse. It’s a solvable problem, with challenges.
Thanks for the thought you put into your answer.
I’ve been thinking: CSAM is just one of many problems communities face. For example, YouTube is unable to moderate transphobia properly, which also has significant consequences.
Let’s say we had an ideal federated copy of the existing system. It would still fail to detect many other types of antisocial behavior. All I’m saying is that the existing approach by M$ feels a bit like moral tunnel vision: trying to solve complex human social issues with some kind of silver bullet. It lacks nuance. In fact, this is a community management issue.
Honestly, I feel it’s really a matter of having manageable communities with strong moderation, plus the ability to report anonymously in case someone becomes involved in something bad and wants out.
Thoughts?