Questions about policing online hate are much bigger than Facebook and YouTube

Following a hate-driven mass shooting in Christchurch, New Zealand, major web platforms rushed to take down video of the 17-minute attack. Sites like YouTube have deployed imperfect technical fixes, trying to draw a line between journalistic and unacceptable uses of the footage.

But Facebook, Google, and Twitter aren't the only places grappling with how to handle violent extremism. And traditional moderation doesn't touch smaller sites where people are still promoting the video or praising the shooter. In some ways, these sites pose a harder problem, and their fate is tied much more closely to fundamental questions about how to police the web. After all, for years, people have praised the internet's ability to connect people, share information, and route around censorship. With the Christchurch shooting, we're seeing that phenomenon at its darkest.

The Christchurch shooter streamed live video on Facebook and posted it to other platforms, but the hub of its distribution was apparently 8chan, the imageboard community whose members frequently promote far-right extremism. 8chan had already been pulled from Google Search listings and dropped by at least one hosting service over problems with child pornography. (8chan's owner claims the site "vigorously" removes child pornography.) After the shooting, some users posted comments speculating that the site would be taken down. Forbes later raised the prospect of shutting down 8chan, and in New Zealand, internet service providers actually blocked it and a handful of other sites.

The last two years have seen a wave of deplatforming: far-right sites have had payment processors, domain registrars, hosting companies, and other infrastructure providers withdraw support. The practice has pushed crowdfunding sites like Hatreon and MakerSupport out of business, and has temporarily knocked the social network Gab and the white supremacist blog The Daily Stormer offline.

Companies that aren't traditional social networks still have systems for scrubbing objectionable content. A user on the 8chan subreddit pointed readers to a Dropbox link with the video, but a Dropbox spokesperson told The Verge that it is deleting these videos as they're posted, using a scanning system similar to the ones it uses for other prohibited material.
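Dropbox hasn't described that system in detail, but upload scanners of this kind generally work by comparing a fingerprint of each new file against a blocklist of known banned material. A minimal sketch of the idea in Python, assuming a simple exact-match blocklist (the digest value and function names here are hypothetical):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests for known banned files.
# Production systems often use perceptual hashes instead, so re-encoded
# or slightly altered copies of a video still match.
BLOCKED_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so large videos never sit fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_reject_upload(path: str) -> bool:
    """Return True if the uploaded file matches a known banned digest."""
    return sha256_of_file(path) in BLOCKED_HASHES
```

The limitation of exact hashing is that changing a single byte of a video produces a different digest, which is why platforms tend to rely on perceptual hashes that survive re-encoding and cropping.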

However, it's difficult to remove a site permanently, thanks to the large number of companies offering these services, an element of the open web that's generally considered a good thing for the way it removes traditional gatekeepers. The Daily Stormer came back online after several bans, and Gab received public support from a Seattle-based domain registrar. There are also decentralized protocols designed specifically to keep content online. As of this afternoon, the troll haven Kiwi Farms was linking to a BitTorrent file of the video, which doesn't require hosting on any kind of central platform.
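That resilience comes from content addressing: a BitTorrent magnet link identifies a file by a hash of its contents rather than by the server hosting it, so any peer holding the data can serve it. A minimal sketch, using a made-up info-hash:

```python
# A made-up 40-character hex info-hash; real ones are derived from the
# torrent's metadata. The link names content, not a host, so there is no
# single server to take down -- any peer with the file can seed it.
info_hash = "0123456789abcdef0123456789abcdef01234567"
magnet_link = f"magnet:?xt=urn:btih:{info_hash}&dn=example-video"
print(magnet_link)
```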

Infrastructure companies may be more reluctant than Facebook or Twitter to get involved in content policy. Cloudflare, which helps protect sites against denial-of-service attacks, has explicitly adopted a hands-off approach. "We see ourselves as an internet infrastructure company. We are not a content company. We don't run a platform, we don't create content, we don't suggest it or shape it. And so, to a large extent, we consider our point of view one driven by neutrality," says Douglas Kramer, Cloudflare's general counsel.

Kramer compares Cloudflare policing content to a truck driver making editorial decisions about what a newspaper prints before delivering it. Cloudflare complies with court orders and won't do business with companies on official sanctions lists. In one high-profile incident, the company also banned The Daily Stormer, after the site suggested that Cloudflare endorsed its white supremacist ideology and harassed critics who filed abuse complaints. "They were pretty unique in their behavior," Kramer adds, and the company hasn't dealt with a similar case since.

However, in a breakdown of its policies published last month, Cloudflare urged countries to develop mechanisms for fighting online hate, arguing that despite concerns about preserving free expression and due process, governments have a kind of legitimacy that web platforms making unilateral decisions do not.

Even without new laws, we could see countrywide blocks of 8chan and similar sites. But that would be an extreme measure, and one that would give ISPs or governments a lot of power over the internet. (In the US, Verizon briefly banned 8chan's predecessor 4chan in 2010, but that was reportedly related to a network attack, not 4chan's content.) There's a big gap between getting someone kicked off Twitter or Facebook and onto their own website, and exercising control over what can be seen or published on the web at all.

And it's possible that a site like 8chan could be mostly quarantined if the larger social networks delete links to it from official sites and accounts, making it harder to get traffic boosts like a recent one from game studio THQ Nordic, which promoted an 8chan AMA on its Twitter account last month. That would be controversial, but far less so than trying to knock a site or a piece of information offline entirely, especially without having some difficult conversations about how we want the internet to work.
