When I was 10 years old and first introduced to the miracle of the World Wide Web, chat rooms were by far my favorite thing. Talking to random people from all over the world about anything you want — what more could a bored kid ask for?
I’d spend hours in these chat rooms, asking my new friends how old they were, what they had for breakfast, and how much pocket money their parents gave them. I shared this experience with a friend who didn’t own a computer and had never used the internet.
He asked if he could have a go. “Sure!” I said, excited for him to experience the wonder of the internet. Without hesitation, he began typing the worst insults and swear words he could think of. Horrified that I had awoken a dark and malevolent force, and fearing he had forever ruined my friendship with strawberry88, I shut down my computer and didn’t invite him to play on the internet again.
To this day, I remain baffled by this behavior. When faced with the endless possibility of the internet, my childhood friend’s first impulse was to verbally abuse strangers. This innocent 10-year-old had become a troll.
John Oliver once described the internet as a “dark carnival of humanity’s most wretched impulses.” Was it these wretched impulses that had consumed my childhood friend?
The act of trolling is best understood through one likely origin of its name — trolling, a form of fishing in which a lure is dangled off a moving boat.
The troll casts his bait (the offensive comment) into the water of the internet. An unsuspecting fish (the targeted user) sees the bait and feels compelled to go for it (the defensive comment). Soon they are hooked and reeled in without mercy. But unlike trolling for fish, which delivers a clear and edible reward, the troll’s reward isn’t entirely clear.
Trolling is a hard concept to define because there are various methods of trolling and differing degrees of depravity. Some are abhorrent, like “suicide baiting,” where trolls encourage vulnerable users to kill themselves, or “RIP trolls” who vandalize Facebook memorial sites of the recently deceased. But others, like “griefers” who play online games in a manner that purposely disrupts other players, are more of a nuisance.
Who are these trolls, and what drives them?
The dark tetrad
Psychologists have found a link between trollish behavior and a set of personality traits called “the dark tetrad.”
The dark tetrad comprises:
- Sadism — deriving pleasure from another’s pain
- Psychopathy — impairment of empathy and remorse
- Machiavellianism — manipulative and emotionally “cold” behavior
- Narcissism — self-involvement and a need for admiration
In a recent study, trolling was positively correlated with three of the four dark tetrad traits, with narcissism being the odd one out. The researchers found trolls were manipulative, lacked empathy, and enjoyed hurting others. Men exhibited these traits more commonly than women and were more likely to troll. Loneliness was also a significant predictor of trolling when combined with Machiavellianism or psychopathy.
Most studies on trolls use internet surveys to collect data, which is questionable: Can we really trust trolls to complete surveys accurately? This method may also not account for those who don’t consider their behavior to be trollish or those unaware of their trollish behavior.
In the book Troll Hunting, journalist Ginger Gorman spends years building relationships with the worst trolls she can find in an attempt to understand what drives them. To her surprise, trolls were not uneducated lost souls who lacked social skills and lived in their mother’s basement. These trolls had partners, children, and full-time jobs. They showed leadership skills as commanders of large trolling syndicates. They were socially intelligent and able to pinpoint users’ weaknesses with vicious precision. But what was driving them?
Many saw trolling as a hobby — something that entertained or amused. Some were ideologically driven, attacking anybody who opposed their belief system. But both types of troll tended to target users who threatened their beliefs or sense of self.
Some of the trolls exhibited dark tetrad traits. In these trolls, she saw a common pattern — excessive internet use with little to no parental supervision between the ages of 11 and 16. But some trolls didn’t fit the dark tetrad personality type. These trolls were pleasant, friendly, and compassionate when she engaged them directly. How could these trolls behave so antisocially online yet appear to function as typical members of society offline?
The human brain evolved primarily for face-to-face interaction. It hasn’t had time to adapt to communication over the internet.
Nonverbal communication — facial expressions, gestures, and voice qualities — provides the precise social context of an interaction. While the oft-cited claim that 93% of communication is nonverbal is inaccurate, nonverbal cues are a crucial part of how we communicate. Words alone can only go so far. Even if we used the full 170,000 words currently in use in the English language, we still couldn’t convey what an expressive face or a suggestive voice could.
Most internet discussions only allow words. Well, words and emojis and GIFs and stickers and all the other substitutes created to replace nonverbal cues.
If you say something mean to my face and make me cry, you will probably start to feel uncomfortable. Unless you’re especially mean or psychopathic, my distress will trigger an empathic response and lead you to have mercy. If you tweet something mean and make me cry, no amount of emojis can convey what the sight of a grown man weeping can. If there is no social cue to elicit an empathic response, you might continue your tirade of meanness.
The absence of nonverbal feedback leads to an “empathy deficit” — the same impairment sociopaths suffer from.
When you combine an empathy deficit with the anonymity of online interactions, you get “toxic disinhibition,” which is more than just the phenomenon of being rude to bar staff after that fifth shot of tequila.
Anonymity can lead to “deindividuation” — a temporary loss of one’s identity leading to behavior incongruent with one’s character. It explains why groups of civilized people can engage in riots. It also explains trolling. If a lack of nonverbal cues is what makes us detached from the other person’s suffering, deindividuation is what makes us detached from the awareness of our misconduct.
True anonymity offers protection from real-world social repercussions, and this has profound effects on human behavior. The image-based bulletin board 4chan, where registration isn’t possible and users remain anonymous, has been infamous as a troll incubator for this reason. When there are no real-world consequences to your actions, it liberates you from a lifetime of societally inhibited behaviors. Society discourages antisocial behavior and encourages prosocial behavior, so it is antisocial behavior that seeks liberation.
We are a delicate balance between prosocial humans and antisocial primates. When society cannot enforce prosocial human behavior, the antisocial primate may come back into power. And thus the troll is created.
Troll begets troll
Researchers at Stanford and Cornell universities performed a large-scale data analysis of over 16 million comments posted on CNN.com from December 2012 to August 2013 and found that 1 in 4 posts flagged as abusive came from users with no prior record of trollish behavior. This suggests trolling isn’t always a full-time occupation and that one may indulge sporadically.
The researchers could predict the likelihood of trolling based on the nature of other comments in the discussion and the user’s mood. If earlier comments were negative, the propensity to troll was greater. Like a bad mood, trolling is contagious. All it takes is another user’s trollish comment and a bad mood to create an environment in which our inner troll can blossom.
The inner troll
It is easier to view trolls as bad apples than see them as something inside all of us, waiting for the right environment to let loose. But when we condemn trolls as inherently malicious individuals, we limit our understanding of what may drive these behaviors.
While RIP trolls or suicide baiters are likely to be dark tetrad personality types who use the internet as an outlet to indulge their darkest impulses, lesser trolls may be part-time participants who will engage with the right combination of a bad day and a noxious environment. We have only begun to scratch the surface in our understanding of trolls, but the evidence we have suggests we may all be vulnerable.
Is anyone exempt from toxic disinhibition? Few respond to a tweet that offends them with “Excuse me, I really don’t want to be rude, but if I may could I please respectfully disagree with your opinion for these reasons …” While an offhand remark may appear harmless, the less empathic our online interactions collectively become, the greater risk we all stand of becoming trolls. The gentle ripples of impolite tweets may become crashing toxic waves of disinhibited hatred.
Trolling isn’t black and white; it sits somewhere in the grey between prosocial human and antisocial primate. Ultimately, our propensity for antisocial behavior in the physical world is likely to predict similar online behavior.
How can we manage our inner trolls?
The more accountable we are for our behavior, the less likely we are to become trolls. Reducing anonymity may help, but this raises privacy concerns for many. Anonymity can also be a good thing. Benign disinhibition — the friendly sibling of toxic disinhibition — is what lets users freely discuss their deepest insecurities and concerns with strangers. This can be very therapeutic and shouldn’t be discouraged. But by reserving anonymity for where it is genuinely needed, we reduce the likelihood of toxic disinhibition.
Empathy doesn’t come naturally to internet-based interactions, so we must consciously foster it. If we remember online comments are received without any nonverbal cues, we can better ensure our words are not misconstrued. This will translate to less chance of inciting ill will from other users.
Rather than taking a rude comment from another user at face value, we could take a moment and consider this person may not have meant to convey rudeness. If they did, perhaps they have had a bad day and their attitude is a reflection of this, rather than anything personal. Compassion can limit the harm one bad-mannered comment can inflict.
Awareness of how we respond to distasteful comments can create space between us and our behavior. In this space, we may consciously decide a cyber-altercation is not what we want. We may also recognize the times we are vulnerable. For instance, after a bad day at work, our online comments may become ill-mannered and less congruent with who we really are. Understanding what causes the antisocial primate to act up is the key to managing it — the biggest mistake we can all make is to deny its existence.
Certain platforms may encourage disinhibited behavior, so moving discussions elsewhere may help. Kialo, a novel online discussion platform that has tried to civilize internet-based debate, is an excellent option. Its very design is to educate, encourage debate, and improve users’ understanding of argumentation. It does this by separating each argument into pros and cons, then allows users to vote on each. A user might start a debate on “Should vaccines be mandatory?” and post an argument for mandatory vaccines. Other users then post arguments for or against, and soon you have a detailed visual map of the entire argument. Admins moderate all comments to ensure no one is becoming disinhibited. The result is a constructive debate and a prosocial collaboration of human ideas.
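The pro/con structure described above can be thought of as a simple argument tree: each claim collects supporting and opposing responses, which can themselves be debated. A minimal sketch in Python — purely illustrative, with hypothetical class and field names that have no connection to Kialo’s actual software:

```python
from dataclasses import dataclass, field

@dataclass
class Argument:
    """A single claim in a debate, with votes and nested pro/con responses."""
    text: str
    votes: int = 0
    pros: list["Argument"] = field(default_factory=list)
    cons: list["Argument"] = field(default_factory=list)

    def add_pro(self, text: str) -> "Argument":
        """Attach a supporting argument and return it so it can be debated further."""
        arg = Argument(text)
        self.pros.append(arg)
        return arg

    def add_con(self, text: str) -> "Argument":
        """Attach an opposing argument and return it."""
        arg = Argument(text)
        self.cons.append(arg)
        return arg

def render(arg: Argument, depth: int = 0) -> None:
    """Print the tree as an indented pro/con outline — a crude 'visual map'."""
    print("  " * depth + arg.text)
    for p in arg.pros:
        render(p, depth + 1)
    for c in arg.cons:
        render(c, depth + 1)

# Build a tiny argument map for the example debate from the text.
thesis = Argument("Vaccines should be mandatory")
pro = thesis.add_pro("Herd immunity protects those who can't be vaccinated")
pro.add_con("Mandates may erode public trust")
thesis.add_con("Bodily autonomy should not be overridden")

render(thesis)
```

Because every response hangs off a specific claim rather than off another user, the structure itself nudges the discussion toward arguments and away from personal attacks.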
The internet allows us to share ideas at scale. In doing so, it reveals all elements of what it means to be human — the good, the bad, and the ugly troll. If we can address our potential for toxic disinhibition on an individual level, we have the power to collectively transform a “dark carnival” of antisocial primates into a “brilliant banquet” of prosocial humans. I look forward to the day when we respond to a rude tweet with “let’s take this to Kialo.”