He zucked up.

Asked this week to explain why hoax site Infowars was still allowed to post to Facebook, CEO Mark Zuckerberg told Recode's Kara Swisher that the company had to give users leeway because it's difficult to discern intent. Things took a turn for the worse, however, when Zuckerberg used Holocaust deniers as an example, suggesting that it wasn't the platform's place to determine whether such people really intend to mislead. The blowback was fierce and immediate enough that the Facebook CEO was forced to issue a clarification only a few hours after his interview was published.

Yet, poor examples aside, this really is what the titans of tech seem to believe. And it is in some ways easy to understand. Facebook, Twitter, and other companies are simply adhering to what they see as the basic ideals of liberal democracy: fairness, equality, and free speech. As Zuckerberg stated in his clarification, "I believe that often the best way to fight offensive bad speech is with good speech."

That same commitment to liberal ideals is also precisely what has helped tech platforms become the battleground for a new sort of "culture war": a Trump-era conflict that pits progressives against those who think that concepts like racial and gender diversity represent a threat to established norms. In taking the stance that the Enlightenment values of neutrality and universalism are fit to police digital networks, tech companies are helping to encode the culture wars into their platforms' very DNA.

The most obvious problem is that the decision to leave up false, misleading, or outright incendiary posts allows the system to be gamed. As BuzzFeed's Charlie Warzel has convincingly argued, Facebook seems to misunderstand how its platform is deliberately misused by bad-faith actors who, for example, post false, inflammatory information and then remove it after the damage is already done. By exploiting shifting standards for what can and cannot be taken down, those looking to spread misinformation can do considerable harm before their content is ever removed.

Yet there's also something deeper at work. It is no coincidence that the rise of social networks has coincided with so-called "both-sidesism," in which equal time and weight are given to opposing sides of a debate no matter how abhorrent or absurd one view might be. The most obvious example here would be President Trump's "very fine people on both sides" comment after the white supremacist march and terrorist attack in Charlottesville. But the trend of men's rights groups and racists adopting the language of the oppressed points to the way in which a neutral or universal approach to content ends up fostering a climate in which patently awful ideas are discussed as if they were ordinary.

To be clear, the idea that a private company should get to determine what is true or right, especially one as immensely popular as Facebook, is deeply disturbing. While one might breezily claim that a site like Infowars should be banned, there are countless edge cases that are far less clear-cut.

But as New York's Max Read points out, relying on notions of free speech and equality, key ideals of liberal democracies, is at best hypocritical when the checks and balances of representative government are missing from private companies like Facebook and Twitter. In that regard, social media companies are more like dictatorships. In outsourcing the infrastructure for public discourse to a handful of companies on America's West Coast, we have also given up the ability to have a say through voting, policy, or other forms of democratic pressure, because we've allowed these organizations to take on a state-like function.

So what can be done? Read suggests that Facebook produce a kind of constitution to at least make the process of content removal transparent and consistent. There is also the more aggressive option of regulation, which would require new laws tailored to the particular dynamics of digital networks. More extreme still would be breaking up these companies in order to mitigate the way their scale amplifies these negative effects; given the current regulatory climate, this option seems the least likely.

But in a world of fake news, a potential return of neo-fascism, and an increasingly polarized, manipulated public sphere, old principles may no longer be as reliable as they once were. Instead, we need to find a way to insist that digital networks take responsibility for the content on their platforms.

In failing to do so, Facebook, Twitter, and the rest are simply encoding the culture wars into their DNA, and we are all worse off for it.