As censorship increases, so too does terrorism. Lessons from a mind hacker

22 March 2022

While the world watches the war in Ukraine unfold in shock, analysts and experts point out that we've been in one for quite some time. In hindsight, the (mis)information war spearheaded by the Kremlin certainly attempted to lay the groundwork for this heinous invasion. As the EU bans Russian media and grapples with pro-Putin trolls, we're revisiting Justin Lane's talk from the last Reflect. Justin's company ALAN Analytics is in the business of understanding and predicting human beliefs and the behavior that follows from them. He tackles hate speech and extremism on a regular basis, and his message from the festival is perhaps even more relevant today. 

Overwhelmed brain

Do you ever open Twitter or Facebook and angrily close the app, exhausted by the hate pouring out of the comments? “We know that something is not right, that we're not always happy when we use it. We see other people getting in fights with friends and family that otherwise wouldn't,” Justin describes. The fact that algorithms like to promote sensational content certainly doesn't help, inciting anger and even violence in people sitting alone in front of their computers. 

As Justin explained at Reflect, our brains were not designed to process so much information from multiple sources. “They evolved in an environment where we spoke to only a few sources of information a month, mostly family,” he says. But today, they're swamped with information about events happening somewhere halfway across the world, shaping our beliefs. 

“We live in a world where our cell phones, smartwatches, our computers are feeding us information. They're making us an encapsulated society of one. This is why social media is so social, but also lonely,” Justin describes. That leads to the creation of a false reality: a state of mind in which we consume too much information that we're not directly experiencing. Our perception shifts and we get stuck in our own bubble. 

Violence vs. democracy

Justin pointed out an interesting paradox. We tend to exaggerate the level of violence around the world, when in fact it's been going down for decades - hand in hand with the rise of democracy and free speech. So while all the negativity you see on your feed is overwhelming, social media doesn't seem to create more extremism - it just looks that way. 

“As censorship increases, so too does terrorism,” Justin warns, adding that extremists don't disappear, they just move elsewhere. “These people still exist, they're still having the conversations, but now they're having them with themselves. We're creating our own echo chambers. We're giving people with extremist views their own environment to live in that already suffers from a false sense of reality, so we just make it worse by saying - don't share our reality anymore, don't share our stable sense of collective understanding of what's going on in the world. In here, it gets to breed and stew.”

We see that happening even now, as pro-Kremlin trolls move to Telegram to talk with like-minded people. According to Justin, people isolated in echo chambers reinforce what they already believe and come to identify completely with their ideology. “In psychology, we know this as identity fusion - you can't separate yourself from the ideas you believe in. An attack on the idea then becomes an attack on yourself.”

From censorship to better messaging

Since social media is here to stay, our mission should be to create and use better messaging instead. And that means less sensation, more personalization. “We have to stop relying on sensationalization, negativity, and emotion to drive the engagement on these platforms. Instead, you have to rely on meeting people where they are. Having a conversation, not a monologue,” Justin says. This so-called positive resonance lets companies create on-brand messaging and address specific communities effectively - something already happening on Reddit.

“If you can identify and predict extremism, you can help to fight it. We can create deradicalization platforms.”

Watch Justin's full talk on our YouTube:


Justin E. Lane is the Co-Founder and CTO at ALAN Analytics, a consultancy and AI technology firm based in Slovakia that focuses on the development of advanced forms of AI for social prediction. Its flagship project is a SaaS platform that lets you experiment with digital twins of your online social media market and predict what will “catch on”. Justin received his DPhil from the University of Oxford’s Institute for Cognitive and Evolutionary Anthropology, where he pioneered new approaches using AI to study human cultures at a large scale. He has experience in consulting on AI and simulation design for both university projects and private industry and has led or helped develop grants with an awarded value of over $85 million (USD). His work has helped pioneer computational methods for studying social groups and online social networks. His research projects in AI, simulation, and natural language processing (NLP) have been covered globally in outlets such as BBC News, Vice News, The Atlantic, New Scientist, and The Telegraph, and he has been a featured speaker at research, technology, and policy conferences in North America, Asia, and Europe. He’s also the author of the book Understanding Religion through Artificial Intelligence, published by Bloomsbury.
