After months of battling hosting companies and other service providers to keep its controversial enterprise alive, Gab, the “free speech” Twitter alternative, was finally forced offline, at least temporarily, according to the static message that has adorned its homepage for the past few days.
The final straw came after it was discovered that Robert Bowers—the alleged gunman responsible for the Squirrel Hill massacre—had boasted of his murderous intentions on his Gab profile shortly before the shooting. Before then, Bowers had provided a steady stream of antisemitic and generally hateful dispatches—a sight not uncommon across the freedom-loving platform.
Regardless of recent events, the eventual “mass deplatforming” of Gab was completely “predictable,” says Jameson Lopp, a well-known privacy advocate and self-described “professional cypherpunk.” “If they want to continue to exist,” he says, “then they will need to remove any single points of failure from their infrastructure to prevent it from continually reoccurring.”
The moral of the story: #IPFS #Bitcoin https://t.co/5fg12P7ZpD
— Ⓥin Ⓐrmani (@vinarmani) October 29, 2018
While Lopp points to The Pirate Bay and its “redundant infrastructure all over the world” as a possible template to follow, and notes that decentralization need not be achieved through a blockchain, the possibility raises the question: What would a blockchain-based, fully decentralized, truly censorship-resistant social media platform actually look like? And are we prepared for the consequences?
In theory, the libertarian dream of utterly free speech sounds great: content lives free, and all ideas, both good and bad, do battle in an open and immutable gladiatorial arena of opinions. What’s not to love? In practice, however, we may just as likely end up with something closer to an episode of “Black Mirror”: a technocratic nightmare of noise, where hatred, vitriol, and worse spread unchecked.
Indeed, these scenarios are not necessarily mutually exclusive, since a “free and open Internet” would, again in theory, be whatever its users collectively make of it: digital playgrounds no longer shaped in the image of Silicon Valley, which is, after all, a big part of Web3’s appeal.
Among those leading the charge in this direction is Minds, an “open source and decentralized crypto social network.” Bill Ottman, its co-founder and CEO, says the network is constantly pushing to decentralize further, and while it isn’t “uncensorable” yet, it wants to be. Nevertheless, Ottman says Minds takes violence and threats of violence very seriously and currently has “reporting tools” in place to help protect its roughly 1.25 million users.
“In the end, we believe that transparency and protection of non-threatening user speech is the only way forward in this new age of social media and the Internet,” he says, adding that Minds is “proud to stand up for what’s right—both by safeguarding our users’ rights to freedom of speech and continuing to disallow calls to violent action.” Ottman admits, however, that the immutability of blockchain presents an interesting paradox: content published to a distributed ledger can’t simply be deleted, which has “both privacy and content-control implications,” he says.
In other words, the same technology that can be used to liberate information and empower its creators can also ensure that undesirable and potentially harmful content is indelibly preserved. Ottman says Minds aims to “give users as much control as possible, which probably looks like a hybridized solution, ultimately,” including technology that helps users better control their individual experiences online through filtering tools and the like.
The sentiment is echoed by Lopp, who says decentralized digital town halls of this sort may not be a matter of if, but when. “It's hard to imagine online speech being even more centralized than it already is,” says Lopp, “so I think the pendulum will swing toward decentralization.” The question, then, is: how far?
“A truly censorship-resistant platform would enable you to publish anything you wanted onto it, and no one would have the power to stop your data from spreading to those who subscribe to read it,” says Lopp. The idea, he explains, is to create a network that requires “kicking in more doors than is physically attainable” to shut it down. And a “public, permissionless blockchain platform that is being run by thousands or more entities globally” would definitely do the trick.
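The resilience Lopp describes rests on ideas like content addressing, the technique behind systems such as IPFS (name-checked in the tweet above): data is identified by its cryptographic hash, so any peer holding a copy can serve it, any reader can verify it, and suppressing it means finding every copy. Below is a minimal, purely illustrative sketch of that idea; the “peers” are hypothetical in-memory stores, not any real network or platform mentioned in this article.

```python
import hashlib

# Toy illustration of content addressing: a post's address is derived from its
# bytes, so it can be fetched from *any* peer holding a copy and verified
# independently. The "peers" here are hypothetical in-memory dictionaries.
def content_id(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

post = "This post cannot be quietly edited or unpublished.".encode("utf-8")
cid = content_id(post)

# Many independent peers replicate the same content under the same address.
peers = [{}, {}, {}]
for store in peers:
    store[cid] = post

# Even if most copies disappear, one surviving peer is enough to recover
# and verify the original post.
peers[0].clear()
peers[1].clear()
retrieved = peers[2][cid]
assert content_id(retrieved) == cid
print("verified copy of", cid[:12], "still retrievable")
```

The point of the sketch is that there is no single server to seize: as long as one willing peer keeps the bytes, the address still resolves and the content still checks out.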
Suddenly the long arm of the law doesn’t seem to stretch so far, as content on these decentralized platforms could be made to survive the reach of governments and their enforcers. It is, understandably, a horrifying thought for some, given the events in Pittsburgh and the hatred they revealed thriving even on centralized, restrictive platforms.
But censorship resistance cuts both ways. In China, for example, the government made every attempt to prevent students from sharing #MeToo blogs and stories of sexual abuse at Peking University earlier this year. The students responded by taking their stories to the Ethereum blockchain, effectively circumventing the government’s attempts at suppression. And it’s these sorts of use cases that resonate with blockchain true believers and advocates for decentralization.
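The article doesn’t detail exactly how the Peking University letter was posted, but the general technique is straightforward: arbitrary text can ride along in an Ethereum transaction’s data field, where it is copied to every node that syncs the chain. The following is a hedged sketch using the web3.py library; the RPC endpoint, private key, and message are placeholders, and attribute names vary slightly between web3.py versions.

```python
# A hedged sketch (not the students' actual method) of embedding text in an
# Ethereum transaction's data field with web3.py. RPC_URL, PRIVATE_KEY, and
# MESSAGE are placeholders; the gas limit is a rough guess for a short note.
from web3 import Web3

RPC_URL = "https://example-node.invalid"   # placeholder JSON-RPC endpoint
PRIVATE_KEY = "0x..."                      # placeholder key; never hard-code a real one
MESSAGE = "An open letter..."              # the text to preserve on-chain

w3 = Web3(Web3.HTTPProvider(RPC_URL))
account = w3.eth.account.from_key(PRIVATE_KEY)

tx = {
    "to": account.address,                 # zero-value send to yourself; the payload is the point
    "value": 0,
    "gas": 100_000,
    "gasPrice": w3.eth.gas_price,
    "nonce": w3.eth.get_transaction_count(account.address),
    "data": Web3.to_hex(text=MESSAGE),     # UTF-8 text hex-encoded into the data field
    "chainId": 1,
}

signed = account.sign_transaction(tx)
tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)  # rawTransaction in older versions
print("recorded in transaction", tx_hash.hex())
```

Once mined, the message lives in the transaction history of every full node, which is precisely what makes it so hard for any single authority to scrub.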
In a world where fascism is a popular viewpoint, the belief that we should be building censorable infrastructure (whether technical or political) and handing the keys to governments/large corporations is incredibly dangerous.
— Sarah Jamie Lewis (@SarahJamieLewis) October 29, 2018
As for how users and developers can deal with the inevitable problem of unwanted, hateful, or even illegal content on decentralized platforms, Lopp says it comes down to implementing many of the same “anti-spam and anti-harassment” tools currently in place on existing platforms, or creating new ones.
“Think of it like how existing networks have mute and block functionality that users can enable manually,” he says. “Existing networks also have algorithms that try to figure out which content is most interesting to you.” Lopp explains that these systems may require some sort of “machine learning” to build individualized profiles according to preference, with the algorithm trained by each user to keep certain content out whenever unwanted material seeps through. “Making it easier for people to control the content to which they are exposed will be very important if censorship-resistant systems are to gain much adoption.”
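As a concrete illustration of the client-side approach Lopp describes, here is a minimal sketch, under the assumption of a very simple post structure: the network keeps everything, and each user’s own client decides what to render. The class names, authors, and keywords below are hypothetical, not part of any existing platform.

```python
from dataclasses import dataclass, field

# Minimal sketch of client-side filtering on a censorship-resistant network:
# nothing is deleted globally; each user's client simply declines to render
# content matching that user's own mute lists. All names here are hypothetical.
@dataclass
class Post:
    author: str
    text: str

@dataclass
class ClientFilter:
    muted_authors: set[str] = field(default_factory=set)
    muted_keywords: set[str] = field(default_factory=set)

    def mute_author(self, author: str) -> None:
        self.muted_authors.add(author)

    def mute_keyword(self, keyword: str) -> None:
        self.muted_keywords.add(keyword.lower())

    def visible(self, post: Post) -> bool:
        if post.author in self.muted_authors:
            return False
        text = post.text.lower()
        return not any(word in text for word in self.muted_keywords)

# The full feed still exists on the network; only this user's local view is trimmed.
feed = [Post("alice", "Notes on running a community relay"),
        Post("troll42", "an unwanted rant")]
prefs = ClientFilter()
prefs.mute_author("troll42")
print([p.text for p in feed if prefs.visible(p)])  # only alice's post is shown
```

The design choice is the crux of the debate: moderation moves from the platform operator, who can delete for everyone, to the individual reader, who can only hide for themselves.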
The privacy advocate adds that it shouldn’t come as a shock if the early adopters of decentralized platforms arrive espousing “controversial views.” After all, these would likely be people who got the boot elsewhere, or split preemptively in search of a place more tolerant of their points of view. “If racists and bigots can find a platform where people will expend the resources to host and propagate the data that they produce, they should be free to use that platform,” he says. And why not? Racism and bigotry are not, in themselves, crimes.
Ottman says his network intends to follow “the Daryl Davis philosophy,” named for the African American man who confronted the KKK head-on and, through open discourse, got hundreds of members to abandon it. He says research backs up this approach, and he believes strongly that “we need to evolve our strategy” for combating violent “hate speech.” In other words, the best cure for bad speech is good speech.
Besides, the First Amendment was designed specifically to protect unpopular ideas; popular opinions hardly need shielding. Hatemongers and provocateurs know this well and hide behind its protection, yet that doesn’t make the fact any less true. Those inclined to indulge dark urges have always known where to find an outlet. “The Internet just makes it easier,” says Lopp. And while the decentralized Internet may make it easier still, technology only moves in one direction; the rest of society may soon be tested to do the same. Here’s to a better, filtered tomorrow.