The Limitations of Relying on Volunteers for Moderation
1. The System Will Miss Falsehoods and Could Amplify Hateful Content
Under this style of moderation, there is a real risk that only posts about widely known topics will be flagged in a timely manner, or at all. Consider how a post with a picture of a death cap mushroom and the caption “Tasty” might be handled under Community Notes–style moderation. If an expert in mycology doesn’t see the post, or sees it only after it has been widely shared, it may never be flagged as “Poisonous, do not eat,” or at least not until it’s too late. More esoteric topic areas will be undermoderated, and that could have serious consequences for both individuals (who may eat a poisonous mushroom) and society (if a falsehood spreads widely).
2. It Won’t Work Without Well-Supported Volunteers
Meta’s paid content moderators review the worst of the worst: gore, sexual abuse, and violence. As a result, many have suffered severe trauma, leading to lawsuits and unionization efforts. As Meta cuts resources from its centralized moderation efforts, it will increasingly be up to unpaid volunteers to keep the platform safe.
Community moderators don’t have an easy job. On top of exposure to horrific content, they are often subject to harassment and abuse. Unlike paid moderators, however, community moderators take on only what they can handle. For example, as a moderator of a text-based community, I routinely deal with hate speech and violent language but am rarely exposed to violent imagery. Community moderators also work as a team. If I am exposed to something I find upsetting, or if someone is being abusive, my colleagues take over and provide emotional support. I also care deeply about the community I moderate. Care for community, supportive colleagues, and self-selection all help keep volunteer moderators’ morale high(ish).
Conclusion
Relying on volunteers for moderation is not a foolproof solution. Community-driven systems like Community Notes have real benefits, but they also have significant limitations: esoteric topics go undermoderated, and the people doing the work bear real costs. To make these systems effective, platforms need to draw experts into the review process and give volunteer moderators the support and resources they need to handle the challenges of moderating online content.
FAQs
Q: How do community-driven moderation systems work?
A: Instead of relying solely on paid staff, platforms ask volunteers to flag, review, and add context to content. X’s Community Notes, for example, lets contributors attach notes to potentially misleading posts; a note becomes visible once enough other contributors rate it as helpful.
Q: What are the limitations of relying on volunteers for moderation?
A: Volunteer moderation tends to undermoderate esoteric topics where few contributors have expertise, and it can leave moderators without the resources and support they need to cope with harmful content, harassment, and abuse.
Q: How can we make community-driven moderation systems more effective?
A: Bring subject-matter experts into the review process, and give volunteer moderators the training, teamwork, and emotional support they need to handle the challenges of moderating online content.