KSAT Explains: How conspiracies like QAnon go from fringes to forefront of social media

Episode 24 examines how conspiracy theories make it to the mainstream

SAN ANTONIO – An increase in political polarization. Mistrust in institutions. A huge megaphone provided by the internet. All of these factors have created an environment for conspiracies to run rampant.

Far-flung and false ideas are no longer hidden in the far corners of the internet. They are now discussed in mainstream media because belief in them has become so widespread.

In this episode of KSAT Explains, we took a closer look at this phenomenon underway across the United States and the recent surge of conspiracy theories. (Watch the full episode in the video player above.)

QAnon becomes mainstream

QAnon started with a simple post on an anonymous 4chan message board in 2017. It now has supporters from across the country.

“4chan is kind of a place where everyone’s anonymous. It’s like one of these free speech zones that just tolerates all sorts of hate speech and horrible kind of dirty memes and that kind of thing,” said Drew Harwell, Washington Post tech reporter. “At the time, back in 2017, it was sort of promising. It had this secret intelligence about Hillary Clinton being arrested for a lot of the same totally bogus claims that you had seen with Pizzagate and some of these other conspiracy theories.”

Those theories gained a national following that grew into what QAnon is today.

“QAnon is a big bundle of sprawling conspiracy theories that are totally extreme and totally devoid of evidence, but what they say is that there’s this secret holy war being played out behind the scenes where there’s this cabal of Satan-worshiping, child-trafficking bad guys who also control the U.S. government, control the deep state, control the media, control the world,” said Harwell.

The sprawling conspiracy theory is now widespread. It’s being discussed on major social media platforms and in everyday life.

“A lot of people who imagine a conspiracy theorist think these are a bunch of lowlifes living in the basement, no social interaction, no job, that kind of thing. It’s a really easy stereotype to go to. It doesn’t seem to be true in this case. I’ve talked to men, women, college graduates, not college graduates, rural, urban. QAnon crosses all of those boundaries,” said Harwell.

In the video below, we look at QAnon’s impact on the presidential election, how tech companies were slow to react to its appeal and its growth out of the shadows of social media.

Rise in extremism culminates in U.S. Capitol attack

The U.S. Capitol siege in January shocked many people, but it was not a surprise to those who have been closely monitoring groups like QAnon for years.

QAnon is far from the only conspiracy theory surging. There’s been a growing trend towards extremism for years now.

“Extremism in U.S. politics and in our culture has been there for a long time,” said Peter Montgomery, senior fellow at People for the American Way. “What’s really different in the last several years has been the explosive power of social media to enable promoters of extremist ideology to find new recruits and to create communities of believers.”

In 2009, the U.S. Department of Homeland Security released a report warning that radical right-wing extremism was on the rise in the country and could lead to violence.

“There was a flourishing of new right-wing coalitions that happened right after President Obama’s election,” said Montgomery. “People were willing to partner with folks that they might otherwise have kept their distance from because they saw him as such a threat to their conservative ideals about how the government should run or about who should be running the government.”

Over the past decade, it’s become increasingly difficult for authorities to pinpoint where the real threats online are coming from.

“Conversations that occur online that include a threat of violence obviously far outweigh the actual use of violence,” said Sam Lichtenstein, a global security analyst for Stratfor, a RANE company. “It’s the proverbial needle-in-a-haystack problem where you have all of these very loud signals that are constantly going on, but to actually find a signal amidst that noise is much harder.”

In the wake of a rise in extremist activity and consequences from the past year, many people are wondering what causes someone to become radicalized.

“Whether it’s deindustrialization and people losing jobs, or it’s the increasing ethnic, racial and religious diversity in the country that some people find very threatening because they have a view that the real America is white, Christian America,” said Montgomery. “Any of those kinds of things that change people’s sense of comfort or safety can be exploited by people who want to promote ideologies of certain kinds.”

“They are searching for something in their social life and that sense of belonging, that sense of meaning,” said Lichtenstein. “And they find that in a world of other people that can help explain a very complex society in seemingly simplistic ways and basically say you are on the right side, you’re fighting against the evil side and kind of break down what are otherwise incredibly complex phenomena.”

In the video below, we examine the rise of extremist groups online, how they have recruited more followers and the fallout of these ideologies, which led to violent attacks from the El Paso shooting to the storming of the U.S. Capitol.

Social media censorship

Social media platforms have taken some action, most notably after the riot at the Capitol.

They deleted certain accounts linked to false information, but it didn’t take long for people to argue the move was a violation of freedom of speech.

Most recently, a number of social media platforms banned former President Donald Trump and some of his allies after the insurrection at the Capitol.

The Pew Research Center said those media companies cited the belief that Trump’s posts violated their terms of use and that his rhetoric could possibly incite more violence.

“As a liberal, I may have been relieved when they took away President Trump’s Twitter account at that particular moment,” said Aaron Delwiche, professor in the Dept. of Communication at Trinity University. “But Twitter could just as easily have been silencing somebody I agree with, so I really am uncomfortable with that precedent.”

Republican leaders like Florida Gov. Ron DeSantis said Silicon Valley tech companies are limiting the freedom of speech of conservative Americans. The Associated Press reported that some Florida lawmakers are now taking action against big tech.

These proposals range from forcing social media platforms to give users a month’s notice before their accounts are suspended or disabled to allowing consumers to sue if they feel they have been treated unfairly.

“On the one hand, you stop the spread of the information. That’s a good thing if it’s dangerous lies, but it also reinforces the narrative and the people who believe it. They say, ‘look, we’re onto the truth. It shows there’s a cabal out there that’s trying to silence us because they took away our Twitter accounts.’ It makes them martyrs and it kind of intensifies their feelings,” said Delwiche.

In the video below, we examine the debate over social media censorship and why purging the internet of bad information is not so straightforward.

How to battle misinformation

Bad information isn’t new, but many experts have noticed a growing trend: false claims are being made more often and spreading more widely.

It’s usually centered around a major news event. As headlines about one topic or news event spread online, it is only a matter of time before that information is twisted into disinformation.

Battling that disinformation requires people to practice media literacy: the ability to assess whether the news clips and stories we consume are trustworthy. That skill matters because it’s easier than ever for unreliable news stories to make their way into our social media feeds.

“You have to sort of feel the sense of responsibility that the information that you share could lead somebody to make a decision that could have really big consequences for their life,” said Tom Trewinnard, co-founder of FATHM, a company that KSAT partnered with in 2020 to create the Trust Index.

“Previously, you kind of had to search this stuff out,” said Mark Gifford, professor in the Writing Dept. at the University of Texas at San Antonio. “If you wanted to find some weird stuff, you had to go on weird websites. You had to go on Something Awful dot com, you had to go into kind of like the bowels of 4chan boards. Now this stuff that used to be on the periphery of the internet has made its way into the largest social sites.”

Social media algorithms have also created what’s known as an echo chamber.

“If you’ve ever been to somebody’s house and you watch Netflix, you notice their Netflix looks nothing like yours. It’s a completely different set of choices that are presented,” said Renee Hobbs, director of the Media Education Lab at the University of Rhode Island. “Being sensitive to that can really help us start to realize how our own choices are narrowing or shaping, creating those filter bubbles.”

In the video below, we examine what strategies can be used to battle disinformation, how to seek out trusted news sources and why this can be a polarizing topic among friends and family.
