Yes. And it also serves the purpose of letting other people tell them to shut the fuck up: you can't use my resources to spread that speech
Where's the line that separates censoring bad-faith ideologies (neo-Nazism, racism, whatever Fox and co. are doing) from censoring things you don't like (conservatism if you're a liberal and vice versa, or any other view you don't share)?
The First Amendment only forbids the government from punishing people for their speech, and even that has limits. The Nazis are free to kick anyone off their website for any reason, and Facebook can censor anything it wants. We don't need to have a public discussion about what's permissible on private forums
Seems to me that your view descends from the opinion that a certain amount or kind of censorship is naturally required. Correct me if I'm wrong, but it sounds like what you're saying is: "I believe measures should be taken to reduce the number of members of a particular group who can use the platform's capacity for public communication." The Internet powerhouses (Google, Reddit, Facebook) disagree with your view. That's not to say they're necessarily right or that you're necessarily wrong: the popularity of an opinion has never been a reliable measure of its veracity on its own. I do believe, however, that they may have a point you're not acknowledging in your argument.

The Internet has always been a manifestation of the idea of Freedom on which the US mentality rests. Though controlled in a number of ways, it still affords far more freedom in terms of what one can do without repercussions. This covers good-faith conduct (like arguing for human rights in a country known for revoking them), bad-faith conduct (like hate speech), and everything in between (like posting documents online that prove malevolent US activity on foreign soil left innocent civilians dead). With that kind of freedom, you're certain to get some ill conduct. A place where one can post about their feelings anonymously is also a place where a troll can run rampant. That's the price you pay for freedom from consequence: some people will choose to do bad things, not because they're evil, but because they're human. In the real world, this shit would not bode well for anyone involved, because in the real world you can feel the consequences quite well; and even then, people still kill other people. But the Internet? Sure, it has an effect on the real world, and I hope you don't read my tone as one of indifference. Things online can make or break you.
Communication here is the ultimate Rorschach test: one tends to imbue the words one encounters with one's own preferences and biases. But it's also a place where something good can be done with that freedom: expose fraud, help find a cure, inform others of injustice... Whether the good is worth the bad is another question. I don't think the question is as simple as "Shall we ban the Nazis?". Sure, they can ban you from their lot, and you can do the same to them. Maybe, by saying "We are not to judge", the powerhouses of the Internet make a good point. (Though, to be fair, Reddit and Google are both known to be selective about what they choose not to judge: the waves of hate-subreddit bans and the recent "ad-friendly" YouTube debacle are both proof. It's a known point that's been made before, here as well, by people far more educated and eloquent than myself. I just want to show that there may be some merit to the other side: something other than "They're no-gooders who can't get their screws straight".)
Yes, we are to judge Nazis. We fought a war over it. Google is different: they provide a service and should tend towards neutrality. Reddit and Facebook are not a service; they are communication platforms, more like a TV network. Google is like a phone book. So yes, if Reddit and Facebook don't want to be overrun by trolls, they have to put limits in place. This is not censorship any more than it's censorship for ABC not to pick up a pilot for a TV show. It's their platform, so you play by their rules. Unlimited free speech is not a guarantee, and the people who think it is either don't understand it or are abusing it on purpose to rally to the cause of posting pictures of dead babies and pictures of women's underpants taken on subways. These sites need to enforce decency like any other media company that wasn't founded on a misunderstanding of the libertarian ideals of the early internet, which were about making software available so IBM didn't own every damn thing.
Which returns to an earlier question of mine: where's that line? What's the measure to draw it with? You said earlier: "These sites need to enforce decency." Sure. Suppose there's a corporate coup tomorrow on, let's say, Reddit, and the people now in charge hold views similar to yours... except they go beyond what you'd consider an ideal level of decency and ban things that seem perfectly fine to you. Would you consider it a victory for decency, or would you be upset with the new admins taking things too far? What if they do it exactly to your liking, but someone else (and not a small group of users) counts it as overkill?

Sure, it's their ball, so they get to call what's good and what's not. That's what they're already doing. It may seem like law-abiding yet complacent indifference, but it's one of the choices the execs have made about the structure of the platform. Whether you consider it a righteous choice or an application of misunderstood libertarian ideals from days gone by, that's the ground state of the platform as long as the execs are fine with it. With that in mind, why are you not playing by their rules? Why do you think your vision of the platform's structure is better than that of the people in charge?

I think I can anticipate some of your response: that there's appropriate behavior and there's inappropriate behavior, and the platforms in question should keep the latter in check for the good of their users. But what if the chaotic nature of the platform is the goal? What if, again, the pictures of dead babies (as revolting as the thought is) are the price you pay for allowing creativity and discussion to thrive? It's not a coincidence that /r/wholesomememes and /r/AskHistorians both appeared on Reddit.
The same place that gives space to far-right activists who can't withstand a kernel of criticism also gave us a community of some of the nicest people on the Internet, as well as a notoriously well-moderated place to ask serious historical questions and expect a thorough, scholarly reply. Each found itself amidst the chaos because it had the conditions to grow and thrive as it is. Again, I don't think it's as simple a question as "Shall we ban Nazis?". I don't think it's even about Nazis. As appealing a target as they are (and sure as hell an easy one), this is a conversation about regulations, and regulations are a matter of opinion. One could make a case for said dead-baby pictures being an invaluable asset in one's drawer (I'm not going to, but one could). As far as I'm aware, Google, Reddit et al. already begrudgingly obey the law, and that's as much as one can legally ask. Beyond that, the ethics of allowed content become a matter of preference. Also: "These sites need to enforce decency like any other media company"? No, they don't. They could, and many feel that they should, but they work well enough as it is, and by giving moral authority to one source they'd inevitably upset everybody else. As long as their business model remains "more views equals more profit", they're doing just fine with the status quo.
We don't need to have a public discussion about what's permissible on private forums
These sites need to enforce decency
"Freedom of speech" is the philosophy that no one can silence your thoughts or ideas (so long as those thoughts or ideas do not impinge upon the freedom of speech of others: "I think everyone should buy guns and shoot my neighbor Joe he's a child molester and a blood-sucking incubus" is not protected speech under American law). "Corporate responsibility" is the philosophy that a commercial endeavor should not act in opposition to the basic beliefs of its stakeholders (above and beyond profitability and value: Union Carbide might have helped their revenues through slack maintenance schedules, but they also killed two thousand people.) If the issue were freedom of speech, the rank-and-file activists of society would be yammering for indictments. The issue is corporate responsibility: no, there's no law against Reddit hosting nazis. But yes, hosting nazis is fucking terrible and hiding behind "well, but they're not breaking the law" only serves to encourage the creation of bad law.
I'm slightly more sympathetic to Reddit. We all want to think of the Internet as it was in ye olde days, when John Gilmore was first getting cranky, no single entity was really in charge, and the trouble with shutting anyone up was that someone would have to be in charge first. The Internet we've built isn't the Internet we want, though; Reddit is absolutely in charge of Reddit, and pretending otherwise is just dodging responsibility. But I'm perfectly happy to assume they never wanted that responsibility; they just wanted their wild west and to collect the ad revenue too. I'd prefer having the actual wild-west Internet back, but railroads without sheriffs are the worst of both worlds.
That's because you didn't spend hours at a time, year after year, patiently explaining to one "community manager" after another that ease of use and ease of sharing were vulnerabilities every bit as bad as ActiveX, and that if they did not choose to promote the content they valued and discourage the content they reviled, they would end up with whatever content the most passionate and antagonistic community wanted. SomethingAwful basically built out the Archangelles to demonstrate how sensitive to subversion Reddit was, and Reddit did nothing. I've got chat logs going back to 2007 wherein I argue with one or the other of those naifs that unless they figured out a way to steer the conversation away from toxicity, they'd end up with a toxic swamp. And now they have a toxic swamp. And they have no idea what to do.

No. No we don't. A thousand times no. Ye Olde Days was populated by IT professionals and college students. Kim Kardashian has 105,000,000 Instagram followers; not a one of them is suited to survive in the Internet environment of Ye Olde Days. This isn't fucking Pangea anymore, and a lot of that is the negligence of shitheads like the guys who run Reddit.