From the Desk of Kevin Tracy

2023-09-11

Content Moderation Law in California is Unjust and Anti-Free Speech


Hate Speech is Protected Speech

Before discussing the content moderation law in California, it would be beneficial to first look at another series of recent events in Florida. Around the Walt Disney World entrances and near the Disney Springs shopping center, a group of neo-Nazis has been holding up signs and chanting slogans that are anti-Semitic, anti-homosexual, and extremely racist in nature. These neo-Nazis are a fringe minority who have as many brain cells as job opportunities. It is undoubtedly the most blatant, unveiled, and direct hate speech you're likely to find anywhere in public in the United States. Yet, despite this, the neo-Nazis have been doing this on and off for hours at a time for several weeks with no signs of stopping. Why haven't they been arrested?

Here's part of a statement released by the Orange County Sheriff's Office:

"The Orange County Sheriff's Office deplores hate speech in any form, but people have the First Amendment right to demonstrate. What these groups do is revolting and condemned in the strongest way by Sheriff Mina and the Sheriff’s Office. They are looking for attention, and specifically media attention.”

Not that it should matter, but Sheriff Mina is a Democrat.

Likewise, Altamonte Springs Mayor Pat Bates, a very progressive Democrat, had the following to say about the incident:

"Altamonte Springs is strong, vibrant, and diverse, and hate-filled language won’t change that. Their hate speech may be protected but it is absolutely revolting."

As we all SHOULD have learned in middle school social studies classes, as "revolting" as it may be, hate speech is still protected speech under the First Amendment. Our feelings (and common sense) don't negate a person's right to say hateful and ignorant things, provided the speech doesn't amount to a threat of violence.

Back to California's Content Moderation Law

The law in question is AB 587. It requires social media companies with $100 million or more in annual gross revenue to issue multiple reports throughout the year describing the platform's content moderation practices, to provide data on the number of objectionable posts, and to explain how those posts and their creators were addressed. Failure to comply can lead to a fine of $15,000 per day per violation.

Now, if I were in this situation, my lawyers would be getting rich off my snarkiness. I would share with the state every time someone posted anything even remotely anti-Christian, promoted homosexuality or transgenderism in a way that targeted children, slandered a conservative politician or Church official, or critiqued Catholic doctrine on anything. What is objectionable is ultimately a matter of opinion, and for all the hate speech out there directed at ethnic minorities, bigotry against Catholicism, and increasingly Christianity in general, is still widely accepted by greater society. I would love to make a big deal about that in my reports to the state.

Elon Musk, who reluctantly bought Twitter for $44 billion and changed its name to X, takes a drastically different, less snarky approach. He is a free speech "absolutist." In general, you can say anything you want as long as it's not anti-Musk or anti-Musk-owned-entity, and he lets users largely police themselves. Is someone posting something racist? You're free to challenge them or block them. Is someone posting something anti-Catholic? We're free to challenge or block them, too. Did I post something that hurt some snowflake's feelings? They're free to block or challenge me. This is largely how society works outside of the internet. If you hear someone say something racist on the street, you're free to challenge them or keep walking and ignore them. In fact, I would argue it's safer on X than on the street because the deranged street racist is more likely to physically attack you if you challenge them.

In fact, I believe it is more advantageous for society as a whole that we be empowered and incentivized to challenge racism individually instead of institutionally. When an institution blocks your account, there is a tendency to believe you're the heroic victim being oppressed for speaking truth to power. When your family and friends challenge your stupid ideas or block you, you might still think you're the victim, but it feels a whole lot less heroic. Not only does this put real pressure on racists and bigots, but it also lets the morality of society judge their actions, rather than out-of-touch California elitists who filter what is and isn't acceptable based not only on whether something amounts to a threat of violence, but on whether the view expressed is considered right wing. Lies, threats, and bigotry against Catholics are much less targeted (sometimes even celebrated) by social media censors than similar lies, threats, and bigotry targeted at homosexuals or those suffering from gender identity disorder. And while much of society does allow or participate in bigotry against Catholicism, society is still divided over the morality of homosexuality and especially gender theory.

Society needs to discuss these problems in free and open forums to resolve them. It should not be led artificially by gatekeepers determining what we can or can't say. However, many social media platforms have taken that approach. Although they let groups like ISIS recruit openly on their platforms (likely because Muslims are a protected class and they choose to extend those protections to terrorists and extremists), social media platforms run by Meta (Facebook and Instagram), YouTube, TikTok, and Twitter prior to Elon Musk would actively suppress conservative views and protect progressives from criticism. More conservatives than just the computer savvy know terms like "shadow ban" and have found themselves having to re-subscribe and re-follow their favorite conservative content creators multiple times because of this bias. As bad as this is for society, I'll be the first to admit that these social media companies have the right to do this. It's THEIR platform, and when you publish content on their platform, your content becomes THEIR content. Back when KTracy.com allowed comments, if someone posted a comment I didn't like, I would delete it or prevent it from ever appearing on the site. This is MY website.

KTracy.com is my website. Twitter is Elon Musk's website. Facebook and Instagram are Mark Zuckerberg's websites. YouTube is Larry Page and Sergey Brin's website. All of us have the right to determine what content goes on each of our websites. Our incentive for having good content is that bad content results in less traffic, and less traffic means less revenue. If the progressives at Twitter before Musk thought blocking the Hunter Biden laptop story was good for business, they had every right to do that. If I can't be bothered to write about the first Republican Presidential debate (sorry about that), it's to my own peril. People go where they want for the content they want to consume. The content available to them should not be limited artificially by the California state government.

By enacting this law, California is trying to force every company that lets users interact on its website to moderate content like Facebook does or face massive fines. Not only is that wrong, but it will only further embolden bigots and racists by convincing them they're the victims of oppression... and in this case, they won't be wrong.

Whether this view is popular or not, hate speech needs to remain protected so society can properly deal with it in its own way. While we shouldn't and can't force social media companies (or any website owners) to tolerate it, governments shouldn't try to force companies not to tolerate it, either.

Elon Musk's renamed X platform is now suing the State of California over the requirement, arguing, justly, that AB 587 violates the First Amendment right to free speech. They're not wrong, and we hope they win in court for the sake of the freedom of the Internet.

...

Now I have to write a post about why I got rid of comments on KTracy.com. I seriously thought I did that a couple of years ago, but it turns out it was only mentioned on the History of KTracy.com page.
