cross-posted from: https://beehaw.org/post/6795142

Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across over 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and grooming of minors. One Mastodon server was even taken down for a period of time due to CSAM being posted. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • 0x1C3B00DA · 1 year ago

    I have argued for a while that the Fediverse is way behind in this area; part of this is a lack of tooling and a reliance on user reports, but part is architectural. CSAM-scanning systems work one of two ways: hosted, like PhotoDNA, or privately distributed hash databases. The former is a problem because all servers hitting PhotoDNA at once for the same images doesn’t scale. The latter is a problem because widely distributed hash databases allow for crafting evasions or collisions.

    -- https://hachyderm.io/@det/110769474386499134

    This is from the study’s author (here’s the full thread). It shows how pernicious centralization in technology is. The author claims the fediverse is “behind” instead of the tools being behind in supporting decentralized services. The tools were developed with only centralized Silicon Valley silos in mind, and now that they can’t keep up with decentralized infrastructure, the author’s solution is for decentralized services to centralize around them.
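
    To make the quoted tradeoff concrete, here is a minimal sketch of the “privately distributed hash database” approach, in Python. Everything here is hypothetical (the file format, path, and function names are not from any real tool), and real scanners use perceptual hashes like PhotoDNA or PDQ rather than SHA-256 so that re-encoded copies still match:

    ```python
    import hashlib

    def load_hash_database(path: str) -> set[str]:
        """Load one hex-encoded hash per line into a set for O(1) lookups."""
        with open(path) as f:
            return {line.strip().lower() for line in f if line.strip()}

    def matches_known_hash(image_bytes: bytes, database: set[str]) -> bool:
        """Exact-match check: hash the upload and test set membership.

        A cryptographic hash only catches byte-identical copies; perceptual
        hashes also catch resized or re-encoded copies, which is exactly why
        handing the database to every server invites crafted evasions and
        collisions, as the quote notes.
        """
        return hashlib.sha256(image_bytes).hexdigest() in database

    # Hypothetical upload hook on a fediverse server:
    # db = load_hash_database("/etc/mastodon/known_hashes.txt")
    # if matches_known_hash(upload_bytes, db):
    #     reject_and_report(upload_bytes)
    ```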