What rules of politeness should we have when discussing political or intellectual issues? Consider the following:
- Offensiveness is irrelevant. Only truth matters.
- Allowing offensiveness systematically excludes people who are more vulnerable, thereby biasing the discussion.
- Outgroup statements are more likely to seem offensive than ingroup statements, so disallowing offensiveness also biases the discussion.
- To lower the temperature, we should disallow emotion from our rational debates.
- You can't usefully discuss matters involving human beings if no acknowledgement of emotion is allowed. Disallowing "emotion" just favours noncontroversial emotions over controversial ones, since noncontroversial emotions do not need to be vividly expressed in order to be understood and taken as meaningful.
I could go on. The truth is, there is no set of rules of debate that is unbiased, not even "no rules at all".
Historically, a set of intellectual "rules of debating" did exist in the Western Enlightenment tradition, and it was useful. Two people who had never met, and who had very different viewpoints, could nevertheless hold an intellectual discussion on a shared footing of "no personal attacks" and "use reason not emotion" and so on. But that system broke down precisely because it was flawed and, yes, biased, and its purported universality made that bias so much worse than it would otherwise have been.
Within feminist communities, there arose the idea of the "safe space" in which ideas and feelings could be expressed on vulnerable topics. For example, a woman could come to the discussion and say "my boss did this, and I felt violated" without being immediately required to voice a full-fledged defense and definition of sexual harassment as a concept. The notion was a powerful one. It allowed painful truths to be incubated, to be given time to grow definition and defensibility before being forced to face the outside world.
As a discussion tool, safe spaces are invaluable precisely because they broaden the types of discussion that can take place. But safe spaces broaden the discussion by means of a local narrowing, by disallowing certain types of criticism. Indeed, a rule that makes a space "safe" for some people may in fact sometimes make the space less safe for others.
There is no universal safe space, nor should we try to make one. To do so would be to engage in a new version of the fallacy that made the old "rules of debate" so infuriating. "If you can't make your point in this safe space, then it must be hateful and wrong" is just as false as "If your viewpoint can't survive these debate rules, then it must be irrational."
The only way out is to allow multiple sets of rules. That way, truths that are unsayable in one context can still be said in another. Other people can then respond, and the ideas can have the opportunity to be refined or critiqued from the local viewpoint. If we have multiple fora, we can have a system where pretty much anything can be said somewhere.
Ah, but doesn't this just give rise to multiple "bubbles" in which people only hear viewpoints close to their own? Well, yeah. I think that's the system we currently have, to be honest. In attempting to break free of the more universal rules that existed previously, people have built up a whole set of justifications for narrower rules. Some of those justifications are even pretty good! But it's given rise to a situation where large numbers of people don't even try to listen to differing viewpoints. Worse still, even if they did try, there are relatively few communities that treat engaging with an outsider as worthwhile in the first place.
The thing we need, and don't have enough of, is overlap. We need ideas to travel from one community to another, changing (and hopefully improving) as they go. In order for this to happen, we need at least some communities to take breadth of represented viewpoints as a local virtue that they try to encourage. Currently, this is rare outside of rationalism, and that's a problem, because a single broad tent is not enough. We need multiple broad civilities in order to ensure that many different types of people have the opportunity to engage with people who are coming at things from a radically different angle.
It is my hope that explicitly acknowledging the usefulness of a pluralist notion of civility will help with this. When we try to argue for a set of norms open to enough viewpoints to be plausibly universal, we fail over and over, giving rise to more and more insular communities. If we argue instead for breadth and overlap, we are at least arguing for something that can be achieved. We should encourage people to enter discussions in good faith even when they disagree somewhat with the local norms of engagement, knowing that norms will differ from forum to forum and that it is not wrong to let different sets of norms stand.
Here, then, is my (local) pluralist manifesto.
- Respect that discussion norms are local. Don't try to make them universal.
- Be part of the overlap. Belong to more than one community.
- Encourage other people to recognise that discussion norms can and should differ from place to place.
- Encourage other people to recognise that broad discussion norms are incredibly valuable and should be nurtured wherever they are compatible with community aims.
I posted the above on my old blog exactly six years ago. I have been thinking I would probably migrate some of my previous blog and Reddit writing over here, and there is good reason to post this one now: Substack itself is currently engaged in a heated debate over its own rules of engagement. Back in November, Jonathan Katz reported in The Atlantic that Substack hosts (literal) Nazi content that has been disallowed from other forums. This gave rise to calls for such content to be removed, or at least de-monetized, and to counter-calls for Substack not to exercise control over the content of people's writing on the site.
Hamish McKenzie has recently clarified on behalf of Substack that the site will not be making changes on the basis of this controversy. This has given rise to considerable criticism. In particular, Ken White has claimed that this cannot possibly be a principled “free speech” decision:
> Substack is engaging in transparent puffery when it brands itself as permitting offensive speech because the best way to handle offensive speech is to put it all out there to discuss. It’s simply not true. Substack has made a series of value judgments about which speech to permit and which speech not to permit. Substack would like you to believe that making judgments about content “for the sole purpose of sexual gratification,” or content promoting anorexia, is different than making judgment about Nazi content. In fact, that’s not a neutral, value-free choice. It’s a valued judgment by a platform that brands itself as not making valued judgments. Substack has decided that Nazis are okay and porn and doxxing isn’t. The fact that Substack is engaging in a common form of free-speech puffery offered by platforms doesn’t make it true. [Source.]
White has been widely referenced by others, including Radley Balko and Jonathan Katz himself, but I don’t find White’s argument to be quite so clear-cut. McKenzie is wrong to say that this is a matter of “civil liberties” and White is correct to point that out. However, there is in fact a valid underlying principle here, even if McKenzie has made errors in how he expresses that principle. We can read Hamish McKenzie’s calls for “freedom of expression” and “open discourse” as being made in the service of ideological breadth rather than absolute freedom. Substack itself allows for considerable variety in its sub-communities, and it makes perfect sense that on the top level it would — and should — use very broad norms.
Now, with that said, if it were up to me, I would almost certainly de-monetize the (actual, literal) Nazis and quite possibly remove them from the site. Margaret Atwood argues convincingly that Nazism is widely and unambiguously understood to be an ideology that stands for mass murder, and that it therefore violates Substack’s terms of use as they currently stand. Construing Nazism as a rule violation in this way does require some interpretation, but all rules require interpretation to some extent. I think this interpretation is reasonable, and if it were me I would adopt it. This would only be a small narrowing, and I think it would be the correct call.
With that said, I am not going to leave the site just because I disagree with how it interprets its own rules. Frankly, I’m quite glad I don’t have to make this judgment call, which could rightly involve all manner of principle and business and public relations rolled into one. Ideological breadth is an important goal, and I can respect a decision made in the service of that goal even when I don’t agree with it.
Given all the people in the current political debate who are very willing to apply the label "Nazi" to a wide range of people who disagree with them, I'd rather put up with there being a few "literal Nazis" on the platform (who I don't subscribe to, so they don't impinge on me) than have an ever-expanding purge of anybody who criticizes the fashionably woke ideologies of the day, or can be tarred with guilt by association with somebody who does.
It's said by anti-anti-Substackers that alternatives like Ghost have even fewer restrictions, but haven't been subjected to the same ire as Substack. I assume this is due to Substack's higher profile.
Unfortunately, Ghost's name makes searching difficult, and I'm not particularly interested in poking around for literal Nazis on their platform at this time. In theory, Ghost would ban Nazis, among many other groups, if sections 2.2.g and 2.2.h of its terms (https://ghost.org/terms/) were enforced in reasonable ways, but such rules have a tendency to be selectively enforced, as Substack is (likely) doing.
> Margaret Atwood argues convincingly that Nazism is widely and unambiguously understood to be an ideology that stands for mass murder
What a finely-crafted and offensive standard. We treat evil like a hedgehog instead of like a fox, to borrow Berlin's model, and we miss much because of it.
She's convincing on simpler terms: that Substack should enforce its rules if it's going to have them at all. She's less convincing that we should be so much more comfortable rubbing shoulders (to whatever extent sharing a blogging platform is "rubbing shoulders," far more so than using the internet at all, which highlights the real goal of such efforts) with people ambiguously supporting mass murder, or with those who hold mass-murderous ideologies yet for whatever reason remain socially tolerable.
> This would only be a small narrowing, and I think it would be the correct call.
Too small a narrowing, perhaps.
I too am glad not to be in a position where I must choose between principle and PR, or deal with hateful mobs complaining that I'm not doing enough about the hateful mob across the street, and I do not envy Hamish et al. or the Cloudflare guy for having to deal with them. Is this the price we pay for ideological breadth? Voltaire comes to mind, and I fear what comes after if we forget that ideal. GermStack would be a smaller, quieter place than Substack, but I don't think Katz, Atwood, et al. would actually like it any more despite the absence of Nazis, given who else might be caught in that net.