This morning, Facebook co-founder Chris Hughes published a landmark call to break up Facebook in The New York Times. Hughes, who left the company in 2007, argues that Facebook has fostered its users' worst impulses, prevented other companies from competing, and gained "unilateral control over speech" around the world. The piece is a blunt condemnation of Facebook's market power and its frightening control over modern society, and it's convincing.
But on one issue, his proposed solution is incomplete, confusing, and potentially even counterproductive to his goal of reducing Facebook's power. It's also a frustratingly common one.
Hughes is mainly asking US regulators to split up Facebook, WhatsApp, and Instagram. But toward the end of the piece, he also suggests a new agency that would regulate tech companies. (So far, so good.) Then, he suggests that this agency establish "guidelines for acceptable speech on social media."
Finally, the agency should create guidelines for acceptable speech on social media. This idea may seem un-American: we would never stand for a government agency censoring speech. But we already have limits on yelling "fire" in a crowded theater, child pornography, speech intended to provoke violence, and false statements to manipulate stock prices. We will need to create similar standards that tech companies can use. These standards should, of course, be subject to review by the courts, just as any other limits on speech are. But there is no constitutional right to harass others or live-stream violence.
I'm not entirely sure what this paragraph means.
The First Amendment limits the US government's power to ban or criminalize certain kinds of speech. There's a broad debate about how it applies to social media platforms, but we're dealing with a specific issue here: keeping genuinely harmful categories of speech off platforms like Facebook. The point is that we already have "guidelines for acceptable speech" on the internet, and they're the same ones Hughes lists a sentence later. (Although "you can't yell fire in a crowded theater" isn't really a concrete legal doctrine.)
The internet, like any new communications technology, can raise new questions about freedom of expression, but saying something online doesn't erase the existing laws on the books.
In light of that, I assume Hughes is saying that the agency should create guidelines based on existing limits and then penalize companies that don't remove offending content from their services. If these are just general best practices, they probably won't change how platforms operate; in fact, services like Facebook and Twitter already ban plenty of speech that is legal under the First Amendment.
But to create a real policy with any bite, the agency would have to contend with Section 230 of the Communications Decency Act. Section 230 protects operators of "interactive computer services" from liability for what other people post, so you can sue someone for writing a defamatory Facebook comment, but you can't sue Facebook for publishing that comment.
I'm intentionally avoiding the term "web platforms" here, because at this point it either confuses people about what the law means or lets them lie about it. We've covered this before, but under Section 230, it fundamentally doesn't matter whether you call a website a "platform" or a "publisher." If a piece of content is "provided by another information content provider," that is, not created by the site's operator, it's protected (with some exceptions). Social media companies can't be sued when someone leaves a vile comment on their site, but neither can newspapers. As many internet freedom advocates have argued, repealing Section 230 would pose serious problems for a huge number of websites, and it wouldn't necessarily keep content offline.
The mechanics of moderation are also a problem for Hughes' plan. Certainly, Facebook doesn't want mass shooting videos on its site right now, but content moderation at scale is intrinsically difficult, even for companies smaller than Facebook, and adding more legal liability won't change that basic problem. Government involvement wouldn't help either, even if we trust Hughes' new agency more than we trust Zuckerberg. And remember, Facebook is one of the big, wealthy companies best equipped to carry out this kind of moderation. Smaller social networks, the kind Hughes hopes will flourish if we break up Facebook, would face the same liabilities with far fewer resources.