Here is a report on some of the dangers of P2P and decentralized technologies, namely that tools that achieve some of the goals of #decentralization, #privacy, and strong #encryption have been used by far-right racists.
It discusses the challenges that arise when such groups adopt the same tools we want to use to promote freedom and equality, the various ways those problems have been dealt with, and specifically covers #Mastodon and #Scuttlebutt, noting that Mastodon has been fairly successful.
https://rebelliousdata.com/wp-content/uploads/2020/10/P2P-Hate-Report.pdf
@be @jgoerzen you might find this interesting https://martin.kleppmann.com/2021/01/13/decentralised-content-moderation.html i have found https://cblgh.org/articles/trustnet.html to be especially intriguing, even outside of the moderation problem space
@anarcat @be Thank you, good reading. I've been experimenting on #Scuttlebutt a bit lately, and this is an active topic of conversation there, where each user is effectively an instance
@be @jgoerzen I participated in a workshop way back in 2005 that was targeting malicious network actors (our targets were spammers and botnets back then, of course), and was a co-author on a paper discussing a similar topic:
http://www.icsi.berkeley.edu/pubs/networking/behavioralhistory05.pdf
I haven't read it in a long time, so I'm not sure how it's stood the test of time, but I remember that when we wrote it we were thinking a lot about how to translate something like the PGP web of trust into behavioral attestations.
@jgoerzen it is interesting that pushing fringe groups into their own bubbles amplifies issues. Not being in a public space removes the self-moderation and rebuttals that might sway some back to the light.
It's the interesting, classic "freedom of speech allows hate speech" quandary. Only now self-regulation is gone.
E.g.: in real life, if someone calls someone else a derogatory name (which free speech allows them to do), society imposes consequences. But not in these closed-off spaces.
@psiie Well put. And of course, Mastodon dealt with that very issue with Gab - I guess I'd say, in that case at least, haters were going to hate and at least the rest of us haven't had to deal with it. But yeah, the downside of the Internet empowering everyone to find people like them for niche interests is that people with rather despicable interests also connect. And, at least in Facebook's case, this was profitable.
@john @jgoerzen yea that is sick. Do we know for sure if YouTube is the same way? My friend (who I worry about) seems to get videos about pyramid schemes (oils) and QAnon. I'm over here only getting comedy and science videos.
It's like all the conspiracy stuff is webbed together for him.
Back in my day, ppl only had like one conspiracy, tops. And it was fluoride, Bigfoot, or chemtrails. It *seems* far worse now.
@jgoerzen censorship is not a solution to extremism; it is extremism itself.
@pinkprius @jgoerzen the post I replied to claims a problem with tools being used by the "wrong people", not a specific community. He also claims that decentralization was the problem. The only difference from a centralized system is the possibility of censorship there.
@Jonius @jgoerzen IMO moderation (i.e. censorship) is necessary to form a healthy community in the internet age. It's much easier for people to spam harmful stuff than it is for people in the community to refute it or fight back against it.
Deleting posts or banning users sends a message about what is or isn't allowed within a community.
@jgoerzen yea, it's interesting that tools created by people who don't believe in free speech turned out to be so useful for people who do believe in free speech.
@jgoerzen I'd love to see some experimentation with decentralized moderation that uses each user's social graph to approximate trust rather than empowering people who happen to have the resources to run servers to impose their whims on users.
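The idea above (and the TrustNet article linked earlier in the thread) could be sketched roughly like this: each user rates the accounts they follow, trust propagates outward through the social graph with decay, and the client hides posts from authors below a threshold. This is a toy decay model for illustration, not TrustNet's actual Appleseed algorithm; all names and parameters here are made up.

```python
# Toy sketch of social-graph-based moderation: trust flows outward from
# the viewing user and decays at each hop. Not a real TrustNet/Appleseed
# implementation -- just an illustration of the general shape.

def propagate_trust(graph, root, decay=0.5, max_depth=3):
    """Breadth-first trust propagation from `root`.

    graph: dict mapping user -> {followed_user: direct_trust in [0, 1]}
    Returns a dict mapping user -> inferred trust score; a peer keeps
    the highest score reachable along any path.
    """
    scores = {root: 1.0}
    frontier = [root]
    for _ in range(max_depth):
        next_frontier = []
        for user in frontier:
            for peer, direct in graph.get(user, {}).items():
                inferred = scores[user] * direct * decay
                if inferred > scores.get(peer, 0.0):
                    scores[peer] = inferred
                    next_frontier.append(peer)
        frontier = next_frontier
    return scores

def visible_posts(posts, scores, threshold=0.1):
    """Hide posts whose authors fall below the viewer's trust threshold."""
    return [p for p in posts if scores.get(p["author"], 0.0) >= threshold]
```

With a graph like `{"alice": {"bob": 0.9, "carol": 0.8}, "bob": {"dave": 0.9}}`, propagating from `alice` gives `bob` a score of 0.45 and `dave` (two hops out) about 0.20, while an unknown account scores 0.0 and is filtered. The appeal is that moderation power stays with each user's own social graph rather than with whoever runs the server.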