I’ve been trying Lemmy for a little while and wasn’t sure how to feel about it.
Today, I wanted to start blocking the most heavily censored instances, until I could find a fully zero-censorship instance and simply block all the ones that censor. Filter bots, not people.
When I looked into it further, I found there are no zero-censorship instances, because Lemmy relies on a half-finished “federation” system: each instance is supposed to be able to fetch posts from other instances, but it has never been brought to a fully working state. Lemmy’s official docs say you can’t do federation over Tor at all. That means federation depends on DNS, so instances can’t actually fetch posts from each other freely; it just gets blocked instantly and easily, every time the authorities feel like blocking anything.
So you can only ever have the “average joe lemmy” and “average joe reddit” with everything approved by the authorities, and then “tor copies of lemmy” and “tor copies of reddit” where you have free speech but you can only reach other nerds.
People seem to think Lemmy is different because this weird censorship fetish is extremely popular: most of you are happy to see certain people banned, not just bots, so a small Lemmy that censors certain people feels fundamentally different from a big reddit that censors more people. But it’s the exact same thing; it’s reddit.
When reddit was smaller, you could say basically anything you wanted there; they just wouldn’t let it reach the main audience. Then it got too big, any tiny slice of the audience you could still reach became too big, and now they won’t let you talk at all.
Lemmy is now the small part of reddit where you can say whatever you want, separated from the main audience, until too much growth happens and you have to move again.
It’s not actually a solution to reddit. It isn’t designed to be different; it’s designed to be reddit’s past today and reddit’s present tomorrow, all while being part of a system that stays about the same in past, present, and future.
Last year, this year, and next year, you’re posting somewhere few people will see it, and the system that charges people for ambulance rides is getting another year of ambulance-ride revenue, facing no organized resistance. There’s no difference here.
Lemmy urgently needs federation between onion service instances and DNS addresses in order to actually do what most users seem to wish it would do: allow discussion outside what the corporate authorities allow, while outgrowing reddit & helping undo the damage social media has done to human communication.
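For what it’s worth, making an existing instance reachable over Tor is the easy half: a stock onion-service entry in torrc can publish it as a .onion address. A minimal sketch, where the directory path and target port are my assumptions rather than anything from Lemmy’s docs:

```
# torrc sketch (assumptions, not official Lemmy documentation):
# publish whatever local port serves the instance as an .onion site on port 80.
HiddenServiceDir /var/lib/tor/lemmy_onion/
# 8536 is Lemmy’s default backend port in the standard docker setup; most
# deployments would point this at their nginx reverse proxy instead.
HiddenServicePort 80 127.0.0.1:8536
```

The hard half, and the part this post is actually about, is federation: the instance would also have to send its outbound ActivityPub requests through Tor and accept .onion hostnames from peers (and vice versa on the DNS side), which is exactly what the official docs currently say isn’t supported.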


That’s not enough for you? Wtf, people are weird, I don’t understand you.
Sure, he wants to imprison or kill them, but if CSAM were posted on his desired feed, he wouldn’t want it taken down.
That’s his reason.
But again, he repeated that we should make sure the person is found and convicted; consider that an absence of censorship could help us find them, while censoring them won’t help reduce the abuse.
Yes, let’s assume I am not referring to examples like that here. I’m referring to actual shared CSAM, which does exist across the internet.
So you would also want the fediverse to allow CSAM so that the perpetrators can be found more easily? That’s your position?
If it could lead to arrests and less child sexual abuse, then yes.
One fediverse for you, with censorship, and one fediverse on Tor or somewhere else for him, without censorship.
So should every single site leave up attempts to post CSAM?
(It won’t. It’ll make the site a magnet for people spamming CSAM.)
Well then, that’s up to him to modify the software. No one on the Fediverse is interested in doing this.
Man you’re obtuse, pfff, end of the discussion for me.
Yes, you did. Not sure how you think an unmoderated community on Tor being full of CSAM would somehow make it more likely that extra pedophiles get caught.
You deserve a better answer, and I can be clearer.
In my opinion, CSAM should be banned and automatically reported to the authorities. His point was to avoid any slippery slope by keeping to simple rules such as “only banning spam”. Apart from the other justifications cited above, he also writes: « I’m not saying we need to pressure France to arrest a bunch of French film directors for scenes with naked girls as soon as Trump or his replacement finds it politically convenient to label such things as so-called “illegal content.” »
What he’s scared of is this kind of censorship abuse: once you begin to open the door, it’s hard to keep it closed.
Imo he should have kept the concept of a simple rule while extending it to include CSAM, or instructions on how to make a dirty bomb, for example.
You’re right to help him see the limits of freedom of expression, and while I share his fear of censorship excesses when it comes to censoring CSAM, I also don’t think that not censoring it is the right solution.
An obvious example of excesses based on good intentions would be censoring “hate speech” or “disinformation”.