I had to do a double take when the post flashed on my feed. It was a post filled with armchair bloodlust, calling for war and justifying it through emotional bulls***. In many ways, it wasn’t shocking, because in its misdirected anger and emotional patriotism, it mimicked the charged nature of conversations that we have naturalised on the social web. It also followed the familiar paths of writing about action — from the safety and comfort of a sheltered life, where it is clear that the people sharing it would never have to participate in the war that they are baying for, and that even the destructive aftermath of the war would not interrupt their latte lifestyles.
It was clearly authored by one of those social media savants who indulge in random acts of capitalisation, which give you a brain rash. It did not even claim to be factual — the excess of exclamation points was supposed to make up for both the hate speech and the xenophobia that were being couched as nationalism.
This time, though, the post came from an unexpected source. It was shared in a group that generally has rational, fairly academic and measured discussions about the politics of everyday life. In the past, the most offensive thing anybody had done in a disagreement was to make threatening cat memes. And yet, here was a post that had the community howling for violence and fighting among themselves with a vitriol that they would have generally decried and derided.
Unable to understand this completely unexpected behaviour, I started pinging a few familiar people through private messages, asking them why they were deviating into uncharacteristic behaviour. In a dozen different conversations, one thing that everybody talked about was how they did not begin with this emotional state when they heard the first susurrations of war. They all shared that their first reaction to the portents of war was cautious concern and a thoughtful contemplation of its consequences. However, somewhere between that first reaction and now, something had obviously switched. They had gone from people wanting to think about the possibilities of war to mobs supporting rabid and radical calls for action, grounded in nothing more than emotional excess.
Their emotional state, they were saying, was not their own, but was something they learned as they were bombarded with incessant torrents of similar posts that valorised, championed and positioned war as the only option available. At some time in their information overload, facts, truths, thoughtfulness and critique all disappeared and they got sucked into a viral sharing habit where they inherited the anger, the hate, and the militarised trolling that flooded their timelines.
When we talk of information overload and the constant engagement with social media streams, we often talk about people doing strange things, which they would not do in real life — if there is a real life that can be separated from the digital domains. Especially when looking at gender-based violence, non-consensual distribution of sexual content, and cyber-bullying, the perpetrators often find themselves in a state of shock when confronted personally with their actions and their consequences. Many people, swept up in the fashions of the digital delirium, begin their confessions in a state of denial: "This is not who I am… I just lost control" is a common refrain.
Researchers have pointed out that one of the most dramatic effects of information saturation is the suspension of emotional guards and affective patterns. Information overload sometimes leaves the subject in an emotional state that resembles that of victims of mental abuse. It leads to such a state of stress and tension that many people just give in to the onslaught of information and follow its patterns rather than resist or question them. Continued sharing and circulation make our emotional judgement fickle, and we often act against our impulses.
Algorithms of manipulation, coordinated bot attacks, and commissioned troll campaigns exploit this, because this emotional state is one that can be easily controlled — towards making political choices, towards buying things we don’t want, towards attacking people, communities, countries. It is time to realise that our sharing is not just about our own impulses and ideas. We are continuously being nudged and taught to inherit an emotional state that is being engineered in the circuits of the social web. So the next time you share something, pause and think about whether this is what you want to say or what you are being trained to say, because what we say and share has consequences, often beyond that quick click.
Nishant Shah is a professor of new media and the co-founder of The Centre for Internet & Society, Bengaluru.
This article appeared in print with the headline ‘Digital Native: Monsters, Unchained’