mirror of
https://github.com/andrewgioia/blog.git
synced 2024-12-23 10:39:55 +00:00
---
title: Boycott Facebook, but not because it supports free speech
linkTitle: Boycott Facebook, but not for its free speech
slug: "facebook"
date: 2020-07-01
publishdate: 2020-07-05
---

{{< html >}}
<p class="big">
Of all the very rational <a href="#reasons">reasons to boycott Facebook</a> or any commercial social media platform, <a href="https://news.ycombinator.com/item?id=23627905" target="_blank">the fact that "it supports free speech" has taken hold as a reason</a> is alarming.</p>
{{< /html >}}

**Two obvious disclaimers first:** Facebook isn't Congress and has no legal requirement to adhere to any concept of free speech, and The Advertisers similarly may choose not to do business with Facebook for any reason at all. When I and many others talk about "free speech," however, we pretty much always mean Free Speech, not the First Amendment: the American value, moral standard, and human right that we instinctually hold ourselves and others to.

{{< html >}}
<p class="swqm"><em>"Private companies can do whatever they want"</em> is problematic for many reasons, and here it's being used unfairly as a sword to compel Facebook to govern certain types of speech on its platform. <strong>This is a huge mistake that will do irrevocable damage,</strong> and if free speech as a moral guideline doesn't persuade you then the realities of voluntarily privatizing "truth arbitration" absolutely should.</p>
{{< /html >}}

The answer is simple on its face but difficult given the decade of operant conditioning Facebook and others have been perfecting: **correct and re-frame how we consider and use social media in the first place**.

## Problem 1: it's way too hard to determine what's "true" and what's "fake" {#arbitrate}

{{< html >}}
<p class="big">Social media by design cannot arbitrate truth, and any path with this as its goal is necessarily doomed.</p>
{{< /html >}}

At some point Facebook, Twitter, YouTube, and Reddit graduated from personal networks to global platforms that broadcast millions of messages to millions of people every day. Though we still do not admit this, it was at this point they also transformed into _communication platforms_: core communications infrastructure. Facebook became a phone carrier, and if Verizon can't and doesn't police the content on its phone lines, Facebook shouldn't either.

There are too many messages from too many people at too fast a pace for any reasonable human moderation to enforce rules consistently. The speed and nuance with which news and opinion have evolved have made this an order of magnitude more difficult as well. A headline's truthiness can turn on a _word_. 25,000 users like it and a new headline replaces it 30 minutes later. Multiply that by thousands every day and, even if determining truth were possible, the scope becomes insurmountable anyway.

It's unfortunately just impossible though. Two rational adults at opposing ends of the political spectrum can come to opposite conclusions as to truth for most political headlines. Just from today, at random, from CNN, and looking solely at the headline content:

* ["Trump commutes Roger Stone's sentence"](https://www.cnn.com/2020/07/10/politics/trump-stone-prison-clemency/index.html) is factual and easy.
* ["Debunking 12 lies and falsehoods from the White House statement on Roger Stone's commutation"](https://www.cnn.com/2020/07/10/politics/fact-check-white-house-statement-roger-stone-commutation/index.html) is decidedly less so. This response directly contradicts a previous article; who decides if they're lies? Which is "true" when 2 ostensibly rational adults differ this wildly on _12_ issues of fact?

This doesn't even approach actual content examples and nuances. If one of those 12 actually is true, is the entire article deemed fake? Does Facebook redact it? When do creative omissions in long quotes or testimony go from "helpful clarity" to "outright misrepresentation"? If a [news outlet describes a presidential speech as "divisive"](https://www.nytimes.com/2020/07/03/us/politics/trump-coronavirus-mount-rushmore.html) but a moderator or large group of users feel differently, is that article "true" or "fake?" Who decides if a message is merely satire or an actual actionable threat or call to violence?

{{< html >}}
<p class="swqm"><em>"But just because it's hard shouldn't mean we don't try. Plus, machine learning and 'algorithms' can do the work of a thousand human moderators."</em></p>
{{< /html >}}

Determining truth is a problem legal systems have dealt with for millennia, and ours has only "recently" arrived at an expensive, adversarial, intensely thorough, and months-long effort with a due process backbone and an ultimate unanimous determination by 12 other humans. This is obviously the extreme as lives can be on the line, and merely deciding to nuke certain news stories pales in comparison, but **this is how hard it is to reliably determine truth!** We aren't remotely close to handing this over to machines.

## Problem 2: commercial social media platforms have become the new public square {#public-fora}

{{< html >}}
<p class="big">Facebook, Twitter, YouTube, and Reddit are not just carriers, <strong>they're also the town square, courthouse steps, public parks, and shopping malls</strong>. They should not regulate content because of this, and hopefully one day they won't be able to.</p>
{{< /html >}}

As much as I've hated to admit it, these platforms have become the place where communication happens. People expect to find information here. Our president first tweets official state announcements. If a video isn't on YouTube, outside of certain niches it may as well not exist.

Commercial social media platforms have [usurped print](https://www.pewresearch.org/fact-tank/2018/12/10/social-media-outpaces-print-newspapers-in-the-u-s-as-a-news-source/) ([and even television](https://www.pewresearch.org/fact-tank/2018/12/10/social-media-outpaces-print-newspapers-in-the-u-s-as-a-news-source/)) as the primary source of news for most American young people, [despite most saying it's worse](https://www.niemanlab.org/2019/10/more-americans-than-ever-are-getting-news-from-social-media-even-as-they-say-social-media-makes-news-worse/)!

If these platforms are increasingly becoming the _only_ place where voices or opinions are heard—and where public policy and political issues are presented, illuminated, and debated—they've become _de facto_ public fora and should be treated as such. [Sadly they're not yet actually treated this way](https://www.bbc.com/news/technology-51658341), but probably will be if they continue down their censorship paths.

The knee-jerk response to this has always been that "these are private companies" and they can control their private businesses however they want. [Much more eloquently](https://thehill.com/opinion/technology/456900-government-regulation-of-social-media-would-kill-the-internet-and-free):

> [S]ocial media companies are private companies, not government actors, and these companies have their own First Amendment right to exclude anyone from their platforms for any reason at all. The government cannot force these companies to open up their sites and associate with viewpoints that their owners and shareholders find objectionable, any more than it can force you to display government-approved speech on your private property.

These arguments sound great at first but quickly fall flat for 3 reasons:

* **We already have legal and social precedents for extending 1A requirements to certain private entities.**

    For almost 100 years, phone companies and other types of common carriers have had constitutional requirements extended to them. In telecom's case, phone companies have to provide basic service to everyone at a fair price and without discrimination, and they're regulated by a separate federal entity. We're currently debating whether ISPs should also fall into this category (they should), and it's not inconceivable that "public utility" social networks could and should.

    Network effects and ever-easier acquisitions are making sure that these platforms will soon be _the_ communication platforms in a public utility sense. And as all "real" communication becomes centralized here, the case that we treat them as infrastructural "dumb pipes" gets stronger and stronger.

* **It begs the question and presents a false dichotomy in presuming there must be censorship.**

    There is no requirement that we must choose between Facebook censoring lawful content or the government censoring lawful content; a third option where no one censors lawful content also exists!

    Rhetoric that presumes censorship, like "Do you trust government bureaucrats to police social media and decide whether content is too politically 'biased'?", is leading and presents a bad false dichotomy. Compelling Facebook to be content-neutral and treating it as a dumb pipe does not necessitate "government bureaucrats" doing the regulation. There's a reality where we police unlawful content and harassment just like everywhere else, and if we remove that presumption the threat of scary bureaucrats goes away entirely.

* **Though they're private companies, they're abusing current content liability exemptions and are not being fair.**

    Facebook and others have been slowly increasing their abuse of [Section 230 protections](https://www.eff.org/issues/bloggers/legal/liability/2300). They have no responsibility when their users post harmful or illegal things, ostensibly claiming that moderation is too difficult, but some will moderate content when and how they please. This has been getting worse, not better.

    While it's possible that Facebook, when told to either remain neutral or face content liability, turns the censorship dial to 11, that outcome is probably correct and probably not all downside either. Platforms cannot have it both ways, and if they want to undergo content-level censorship and become arbiters of fact then they _should_ open themselves to liability. This would open them up to competition from new platforms, or even a publicly run platform, where content is not restricted—competition that's being stifled right now because commercial platforms get a huge advantage by having it both ways.

Without any action soon, commercial social media platforms will become further entrenched as utility-level services in all but name, while also continuing to push out traditional forms of news and media at the same time.

## Problem 3: private companies should not unilaterally decide acceptable public speech {#forfeit}

{{< html >}}
<p class="big">Perhaps the most confusing aspect of The Advertisers Boycott, as well as much of the defense of social media censorship over the past few years, is that <strong>we're voluntarily forfeiting centuries of speech protections</strong> into the hands of an oligopoly beholden only to the interests of their shareholders and officers.</p>
{{< /html >}}

Continued requests for Facebook to censor its users' content just hands them the keys to controlling acceptable topics and opinions. And as it becomes more normal for them to delete "offensive" content, we consolidate this power and come to expect it from them, giving up our own responsibility in discerning what's true and conditioning ourselves to rely on corporations to do this important work for us.

Corporations have vested interests that are often very much opposed to public interests. Their reach and dominance can quite literally make a market or sway an election. Giving corporations the power to make a search query return no results, bury a story from appearing on social media at all, hide video evidence of some event, or literally edit the content of the messages posted on their platform is frightening in its societal implications and potential for abuse. The scale of this type of censorship is unmatched, and we have no control over it and no due process against it. These are powers that no private entity should ever possess.

Just as bad an outcome is that we kill our instinct to question the things that we read and see, instead assuming they've been vetted or preapproved and taking them as fact. The cost for this convenience—both consolidating this power into private groups and giving up our own judgment—is also far too high. Over time as we come to expect and await our corporate censors to approve the news we read and messages we share, we'll trust that what we do read is somehow "verified" with a comfy checkmark. There's no need to read opposing viewpoints, if they're even able to be seen.

**To be fair, it's an impossible situation for private corporations offering a public utility.** They have tremendous pressure from every angle to perform often opposing actions. Flagging a story or not flagging a story creates an opinion. Censoring "hate speech" draws a line and tacitly approves hateful things not yet banned. Political parties and interest groups report each others' content as fake. Enabling true free speech fosters actual debate but impinges on requests for safe spaces; censorship gives users their bubble but creates groupthink and echo chambers.

These utility-level platforms shouldn't just be banned from censorship for power-level reasons, **they shouldn't attempt it _because it's impossible_**. It is simply an impossible balance to maintain that no private entity is equipped to handle.

## A solution: decentralize and re-frame our concept of social media {#solution}

{{< html >}}
<p class="big">Social media platforms should revert to just that: <em>social</em> communication channels and communities to share media and information, concomitant with a reevaluation of them as leisure activities, not authoritative sources.</p>
{{< /html >}}

This alone is a win, but the further pipedream would be to decentralize them so that the current monopolies are one of many different ways to access content on a federated protocol. Or, at the very least, offload centralized censorship to local groups or client devices.

### Social media is a toy and we should treat it that way

Facebook, Twitter, YouTube, and Reddit created incredibly engaging social experiences. They no doubt continue to exploit human psychology to do so but nevertheless they've succeeded many times over in creating global communities that keep people coming back _a lot_. As they became incredibly popular, however, their authority somehow also grew with them and we've completely forgotten their founding as fun social activities. **This was a huge mistake.**

These sites are fun to use but are woefully inadequate as "serious" communication tools; treating them as such and censoring content so that they can remain authoritative is a fool's errand. Re-framing them as social websites removes all of the pressure they have to censor and regulate speech. Delegating responsibility to maintain order onto the small local communities within them also relieves significant pressure. Reducing instead of maximizing the degrees of relationships from whom users see content keeps it more _social_. Seeing it as fun might also help us to not get offended over everything we see.

No one petitions Snapchat or Discord or even Instagram to censor speech, in large part because they're still seen as fun and not as authoritative sources of information or places where our _president_ feels the need to make official announcements.

### Decentralization, though a pipedream, is the true fix

When Grandpa sends an offensive email forward, Email, Inc. doesn't ban him from Email. We delete it at first and if it starts to become too annoying we filter it, tell him to stop, or block him on our personal block list.

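That reader-side moderation model fits in a few lines of code. Here's a minimal sketch in Python, where the `Message` and `LocalFilter` shapes are hypothetical illustrations (not any real mail client's API): every user keeps their own block list and mute rules on their own device, and no central operator decides for everyone.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    subject: str
    body: str

@dataclass
class LocalFilter:
    """Per-reader moderation: lives on the client, not on a central server."""
    blocked_senders: set = field(default_factory=set)
    muted_keywords: set = field(default_factory=set)

    def block(self, sender: str) -> None:
        self.blocked_senders.add(sender)

    def allows(self, msg: Message) -> bool:
        # Drop messages from blocked senders or containing muted keywords.
        if msg.sender in self.blocked_senders:
            return False
        text = (msg.subject + " " + msg.body).lower()
        return not any(kw in text for kw in self.muted_keywords)

inbox = [
    Message("grandpa@example.com", "FWD: FWD: shocking!!", "an offensive forward"),
    Message("friend@example.com", "dinner", "see you at 7"),
]

my_filter = LocalFilter()
my_filter.block("grandpa@example.com")  # my choice, affecting only my inbox
visible = [m for m in inbox if my_filter.allows(m)]
```

Grandpa still gets to send his forwards; each recipient simply decides whether to see them.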
There is no corporate entity controlling Email with its centralized Email servers, shareholders requiring 10% growth every quarter, dark patterns driving Email adoption and use, and datamining. If we want to create an email account we don't have to do it on Email.com, we can do it with any provider (or ourselves!) as long as we interoperate over a published email protocol.

Replace "Email" with Twitter in the preceding paragraph and we have the real fix to social media's censorship problem, among many other problems. When Facebook, Twitter, YouTube, and Reddit are merely content aggregators, filters, and user interfaces over their respective networks, censorship is moot. If you want a safe space on the official facebook.com or twitter.com instances then you are absolutely entitled to that, but the censorship there would not then impact what I get to read and engage with over the `FB://` or `TWTR://` protocols.

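The aggregator-over-a-protocol idea can be made concrete with a toy sketch. In this hypothetical Python illustration (the `FB://`-style shared feed and both front-ends are imagined, not real APIs), two interfaces read the same underlying feed; the "official" instance applies its house rules, but that moderation never touches what an independent client shows its own users.

```python
# A shared, protocol-level feed: the network itself, not any one company's site.
posts = [
    {"id": 1, "author": "alice", "text": "city council meets tonight"},
    {"id": 2, "author": "bob", "text": "a spicy political hot take"},
]

def official_instance(feed):
    """A moderated front-end: applies its own house rules to what it displays."""
    banned_authors = {"bob"}
    return [p for p in feed if p["author"] not in banned_authors]

def independent_client(feed):
    """A different front-end reading the same underlying feed, unfiltered."""
    return list(feed)

# The official instance hides bob's post from its users, but the underlying
# feed is untouched, so an independent client still shows everything.
official_view = official_instance(posts)
my_view = independent_client(posts)
```

Censorship becomes a per-interface display choice rather than a network-wide deletion, which is exactly why it's moot in a federated design.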
This is federated, decentralized communication over an official protocol. There are attempts to do this now [that I fully support](https://gioia.social/@andrew) and would love to see grow, but as long as the monopolies remain unfair monopolies they have too steep of a hill to climb.

## Bonus: better reasons to #BoycottFacebook {#reasons}

{{< html >}}
<p class="big">If you've made it this far or came directly here to look for some great reasons to boycott Facebook or any commercial social media platform, here's a non-exhaustive list to get you started!</p>
{{< /html >}}