Harvard was right to crack down on racist memes. But Facebook should’ve done it first

At least 10 would-be Harvard students had their admissions offers rescinded because of offensive memes they shared on Facebook. The memes joked about racial slurs, sexual assault, and child abuse, and when administrators discovered them, Harvard revoked the students’ acceptances.
I’ve been posting on internet forums since I was a preteen. It’s easy to see Harvard’s punishment as too harsh on a bunch of kids sharing things online, but there is more to this story than kids sharing memes.
For these kids, Harvard’s decision is the first real-world consequence of posting racist, sexist, and violent content. When this kind of behavior goes unchecked, it can lead to bad places: harassment, the normalization of offensive language, and radicalization.
It also highlights how sorely today’s internet is missing content moderation, something the message boards where memes got their start made a priority. The admins of early forums like Something Awful drew much clearer lines in the sand to shut down offensive posts than today’s social media does. On Facebook and Twitter, a techno-libertarian ethos of free speech has let hateful content flourish.

That means that, for some posters, it falls to outside institutions like Harvard to step in and stop this behavior.

Why proper moderation is key to stopping hate speech online

On the internet, the attitude of sharing hateful content and dismissing it as “just memes” is pervasive. Racist and sexist memes fill the same space that racist and sexist jokes do; after all, a meme is just a joke in image format. And just as a hateful joke makes room for hateful attitudes, these memes make offensive things seem normal.
Once you’ve normalized all kinds of vile speech, it becomes easy to make the jump to full-on hate speech and white nationalism. This is how the alt-right got a foothold on 4chan, and this is the way some people end up on the other side. When nobody’s there to tell you it’s wrong and when the entirety of your life comes through online interaction, the only people who can steer you are the ones you talk to online.
This is why it is absolutely crucial for social media companies to properly moderate what goes on between users. If you didn’t grow up seeing how differently these proto-social-media communities were run, it’s easy to be tricked into thinking this is a free speech issue. In reality, it’s about keeping hate speech from quietly taking root. Otherwise you’ll see more situations like Harvard’s, where an outside institution gets a glimpse in and is shocked by just how far things have gone.

I grew up posting on forums. I understand this kind of culture.

My personal story is instructive here: While I never participated in the darkest corners of the web, the culture where racist and violent memes are shared is familiar to me. Richard Spencer, the alt-right leader famous for being punched in the face, said, “The average alt-right-ist is probably a 28-year-old tech-savvy guy working in IT.” Alt-right affiliation aside, this describes me to a T. The difference is, I ended up a Bernie Sanders-supporting, Chapo Trap House-listening urbanite. I've always wondered what put me on that path. I think it has a lot to do with the culture of the places I posted online — and, more importantly, the places I avoided.
I started posting online in the pre-social-media days. Back in the early 2000s, when I was a preteen, I violated the 13-and-over age restriction to post on message boards. It was different back then. You would join a forum that aligned with your interests, which usually centered on pretty dorky topics like video games or Dragon Ball Z, and slowly migrate to its off-topic boards to goof off about everything else.
You would build an online personality for yourself through jokes, images, and arguments. It was early enough that nobody knew what a meme was; they were just funny pictures you’d save in a folder on your computer and post elsewhere.
There was a hierarchy to those boards: an administrator ran higher-level things, and moderators handled the day-to-day work of making sure posters didn’t violate the forum’s rules. Those moderators kept the pot from boiling over into vile things, because the internet has always been an awful place.
One of these forums, a place called Something Awful, became something of a beacon for all the other message boards. Two things set Something Awful apart: the quality of the content (I can say with certainty that you’ve laughed at memes from Something Awful, where LOLcats and Slender Man found their start) and the fact that, in order to post, you had to pay $10.
It was an excellent community, maintained by a large team of moderators from varied backgrounds who could back up the content-quality rules with that $10 price tag: get banned and you had to pay it again to re-register. This meant that if the moderation team wanted to crack down on content, they could. It also meant posters had to keep things relatively civil.
For some posters, though, that environment was too restrictive. One day a young man who posted on a subforum called Anime Death Tentacle Rape Whorehouse under the name “moot” decided to create a new forum modeled on a Japanese site called 2chan, free from these moderating forces. He named his new site 4chan.
I was always afraid of 4chan. Even before it was known for spawning Anonymous, the morally dubious group of politically active hackers, and for the rise of the alt-right, 4chan was a place where you could go and be instantly greeted with something unspeakably offensive. It got its start as an anime forum but quickly grew into a thicket of misogyny and racism, fed by a laissez-faire moderation approach where anything that wasn’t explicitly illegal was allowed. Anonymity was the norm, and so were the bullying, threats of violence, and encouragement of self-harm that came from its snarling user base. If Something Awful was a governed online city-state, 4chan users were the barbarians at the gates.
What happened with the teens at Harvard happened on Facebook, but its roots are in 4chan. I forget what it’s like to be a teenager sometimes, but thinking about this case made me go back and remember just how much of being a teenage boy is pushing the envelope to feel rebellious. You push boundaries to see when they give, and being online lets you push them very, very far.
The problem is that the internet can be an echo chamber. You curate whom you follow based on your tastes, and that can give the false impression that everything you’re doing is perfectly fine, even as you swim out into darker and more dangerous waters. You keep pushing to be edgier, and everyone else seems to be encouraging you. That’s the rabid groupthink that keeps 4chan going.
