Substack’s Nonsense

Read to the end for a post about Email Roadblocks.

Editor's Note: Thanks to everyone who completed the TimeMachiner reader survey. I'd like to announce that "JKresh" is the winner! Please check your inbox or spam folder for an email I sent last week so I can send you your prize!


2024 is upon us and I'm excited to get a new year of TimeMachiner underway. One thing I've been quite happy about is a key past decision I made about how I set up this newsletter: from the beginning, I decided I would not use Substack. That single choice has saved me the work I'd otherwise have faced migrating away.

Also, hello to everyone who was on my experimental TimeMachiner publication that was on Substack. I've moved you over here to the proper home of this newsletter. Why? Let's dive in.

Substack is a newsletter publishing platform that's been around since 2017. The goal is to make it easy to write and publish newsletters. It promised zero setup and fuss. Simply write and publish. But over the years they've changed things around. There's a recommendation engine. There's a Twitter-style clone called "Notes". There's an entire payment system wherein you can ask people to pay to support your work (akin to Memberships here). Money is pooled in some fashion and shared among other paid newsletter writers.

Near the end of last year, though, something changed. In November, The Atlantic published an article focusing on the fact that Substack has a "Nazi problem". The TL;DR of the situation is that Substack has near-zero moderation, and people who hold... certain extremist views have been allowed not only to freely publish their thoughts but to profit from them too using Substack's payment system.

This led 100 writers of some of the biggest newsletters on Substack to pen an open letter to its founders asking what would be done. On December 21, Substack co-founder Hamish McKenzie wrote a "note" with an answer: absolutely nothing.

McKenzie takes 546 words to say what was essentially "no" to moderating Substack. Even with blatant white supremacist content displaying Nazi imagery freely being part of Substack, the point was clear. Substack has taken the position that moderating any hate speech is wrong. Here's what I took away as the key to his position:

As @Ted Gioia has noted, history shows that censorship is most potently used by the powerful to silence the powerless.

We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power. We are committed to upholding and protecting freedom of expression, even when it hurts. I think it’s important to engage with and understand a range of views even if—especially if—you disagree with them. 

McKenzie has conflated freedom of expression and civil liberties with the responsibilities and latitude of running a private platform. Yes, we can all agree that discourse is important in the US and that the freedom we have to engage in that discourse is vital to the strength of a democracy. But, and this is a big but: freedom of expression and civil liberties DO NOT APPLY to private platforms. They never have. And there is never a need for discourse or debate about Nazis, white supremacists, or the Holocaust.

That is where the argument falls apart and ends for me.

Moderation online has been around for decades, from forum and chat room admins to Facebook's content moderation teams to Twitter's long-disbanded Safety team. Moderation isn't bad. It's essential for a platform to survive as a safe place. And once you give a certain group the inkling it's allowed to hang out on your platform, it's Game Over.

Substack thinks even demonetizing those publications is a step too far. Meanwhile, publications with adult content are moderated on Substack. Consistency is not exactly a thing there.

Even with late-breaking word that Substack will remove some Nazi content, their response continues to prove they don't get it.

The company will not change the text of its content policy, it says, and its new policy interpretation will not include proactively removing content related to neo-Nazis and far-right extremism.

Platformer's 1-8-24 Issue

Now, one thing I did do at the beginning of last year was set up a "baby" version of TimeMachiner on Substack. The idea was to occasionally repost some of my work in the hopes of reaching new readers and bringing them here to TimeMachiner proper. Even before all this nonsense happened last month, I was underwhelmed with Substack. It has lax spam controls and buries any writer under a pile of tens of thousands of other small writers in order to promote the big publications. Generally, it is a place where writers talk to other writers. The idea of 'directly reaching readers' is a whole lot of nonsense. Of the many subscribers I had there, tons of the emails were obvious fakes.

I had been considering closing up shop there because it was time wasted without results. Once McKenzie's Note was published, the decision was made for me. I exported the subscriber list and deleted the publication and my account. It was the only thing to do.

And I'm not alone. Many big publications are moving, have moved, or are considering it. I don't know what there is to consider, given that the co-founder basically said hosting, supporting, and paying white supremacists is okay by them. If you're a Substack writer or reader, I'd ask that you think long and hard about where the platform is and where it's going. Because without a sea change in the coming weeks to implement some form of moderation, Substack will remain a safe haven for those people.

Thanks for reading. Now, onto the rest of today's issue.

-Aaron