There has been a report of CSAM, and unfortunately lemmy-safety doesn't go through the images quickly enough (on my hardware) to be of use in this case.
I think there is a tool to purge an image using information from a post, but I wasn't able to find it just now. Hopefully I can use that tool in the future when reports of CSAM come in.
Thanks for the transparency. Absolutely disgusting, despicable and illegal behaviour from those doing it.