As we approach the U.S. election, misinformation and abuse are on the rise, and more people are fleeing X in search of something safer and more trustworthy. This is an incredible opportunity for the #fediverse, but only if we can maintain and expand the effectiveness of our #moderation as we scale.
In this latest episode of Dot Social I go deep with two well-known trust and safety experts who are advancing decentralized moderation for the fediverse: @samlai.bsky.social and @jaz. We discuss today's strengths and weaknesses, as well as what the future could hold.
Check out the conversation on our PeerTube instance or wherever you get your podcasts.
@mike @samlai.bsky.social @jaz
will listen but so far I'm not sold on moderating for misinformation being a good idea. absolutely for harassment.
@wjmaggos @samlai.bsky.social @jaz what are your thoughts on why moderating misinformation might not be a good idea?
1. adds a lot to admin workload
2. inherently political and divisive
3. makes choosing a server more difficult
4. impossible to do well
5. we are the open web, not an editorial page
I'm a dissident lefty. the culture here imo is very establishment left, Dem party basically. my hope here is to have the greatest public forum ever, but we just had people very upset that the NYT was calling Biden too old post-debate. if we were more diverse, bringing that up pre-debate could easily get moderated away.
1. admin & moderator workload is real. reducing harms to users is real.
2. moderating false information that may mislead or increase the likelihood of harm is not inherently divisive, and is sometimes required by law
3. only for people who want to consume misinformation
4. impossible to do perfectly, but certainly able to be done well
5. the server you administer publicly blocks 2 servers. this suggests you, along with most service providers, have applied editorial standards.
I'm not arguing for consuming misinfo. I block for harassment. And yes, misinfo can lead to real harm.
But the best way we figure out what's true is dialogue and sharing good journalism/science, not admins acting as referees. What I fear they will usually do is take the current consensus position, with a bias toward their user base. Alternative but sometimes correct views will struggle to find a place where they can be posted that is well federated enough to go viral and correct misunderstanding.
>I fear they will take the current consensus position
Some will, some won't. Some will focus not only on misinformation and harassment, but also on the numerous other issues that also warrant intervention to reduce harm.
Here are 34 of them: https://connect.iftas.org/library/iftas-documentation/shared-vocabulary-labels/
You are free to operate your service exactly as you please.
So is everyone else.
1/2