Russian State-Media Aligns With GOP Messaging After Capitol Attack
E Rosalie is an interdisciplinary scholar at the Johns Hopkins Bloomberg School of Public Health with a variety of interests: science policy, health security (public health + national security), and disinformation. She cultivates a range of experiences, and in the past year contributed to the HIT-COVID study published by Nature, volunteered on the Covid Tracking Project at The Atlantic, team-wrote a policy memo published in the Journal of Science Policy and Governance, and started a disinformation database called Hoaxlines via her science communication project, NOVEL SCIENCE. To follow her work, subscribe to NovelScience.Substack.com.
Rebranding Russia Today
Following the Jan 6th attack on the Capitol, platforms purged users sharing extremist content tied to recognized domestic terror threats like QAnon. Censorship, specifically of conservatives, became a major theme in the Russia Today Telegram channel.
The content sets up Putin as a true defender of free speech even as it plays up the censorship threat. Never mind that Putin is likely responsible for the deaths of at least 22 journalists. The shift may hint at the outlet's ultimate aim.
A few recent titles from RT on “big tech” and censorship:
- Feb 3: Washington joins crusade against free speech, backs Ukrainian crackdown on opposition media as EU & Zelensky’s dad voice concerns
- Feb 3: Putin says freedom of speech online must be defended against social media companies intent on making ‘profit at any cost’
- Jan 19: Words are violence. Voting is terrorism. Free speech is a threat. Where the media establishment can’t win, they’ll redefine.
A change to Russia Today’s Twitter bio caught my eye following a shift in messaging. The new description invites people to Telegram where RT promises no censorship.
The change reflects an effort to promote the idea of conservative oppression. The messages often demonize non-conservative Americans as an enemy or evil force.
RT’s Telegram messages are rife with well-known authoritarian propaganda tactics: whataboutism, demonization, discrediting of reliable sources of information, and appeals to fear.
Comment sections provide a location for the high-level engagement needed to push audiences toward radical ideologies. Polls in the channel could help RT to create even more effective content with real-time intel on its target audience.
RT updated the Twitter bio on February 1, 2021. The Telegram channel sprang to life on or around Jan 9th. Though the channel existed before 2021, the most recent activity prior to Jan 9th dates to 2018; the channel either sat silent in the interim or deleted its content back through 2018.
For more Telegram messages from Russia Today, please see the library.
RT Telegram Library
The change in tone in the Telegram channel may suggest that RT knows readers there more readily tolerate extreme content. If not, this could work to “thin the herd” and narrow the pool of people it should aim to influence. The Digital Forensic Research Lab wrote that extremist groups “use channel- and message-forwarding features, to direct users to increasingly violent content.”
RT behaves similarly. The channel shares divisive content from RT, but it also shares extremist or hyper-partisan American voices. The Kremlin exploits what the Soviets called the “polezny durak,” or “useful idiot”: people who hold views that run contrary to democratic values, or who frequently share misleading or false information, can be amplified to serve the Russian agenda.
Authoritarians have the advantage in the short term when it comes to information warfare. Bad actors pervert free press and free speech by exploiting Western media’s “fairness doctrine.” Although that is no longer an official regulation, credible outlets often continue to present all major stances on an issue. This becomes problematic when outlets air a view regardless of whether it is incoherent madness: dignifying a stance with presentation suggests it has merit even when it does not.
Recent examples of RT using hyper-partisan or misleading American voices:
- Feb 3: TARA READE: I believe AOC when she says she is a survivor of sexual assault. Why could she not say the same about me?
- Feb 12: Gina Carano is right to compare cancel culture to a Holocaust precursor
- Jan 27: As the ‘most banned woman in the world’, I know conservatives are being censored by Big Tech. It’s blatant election interference
If an outside actor can insert ideas into a conversation, the media may unintentionally spread them and lend them credibility. Russia achieves a similar aim by elevating the voices of people in the US whose views benefit the Kremlin.
Russia Today has working relationships with American and Western authors whom it then recommends, publishes, and amplifies. RT and other Russian outlets co-opt American voices knowing they’re protected by freedom of press and speech.
Many of the recommended channels have a history of misleading audiences or publishing outright false content. Jack Posobiec, for example, was an organizer of the Stop the Steal campaign, back in Sept of 2020—for an election that had not yet taken place.
In terms of internet traffic, the US is a distant second for visits to RT.com, but that is still a considerable number of people in the US getting news from an outlet known for publishing disinformation intended to influence Americans.
The Gift of Gab
The outlet also launched a Gab page in early Feb 2021 and already has a significant following. Gab, like Telegram, provides an opportunity for foreign actors to influence Americans with content and direct engagement not permissible on Facebook or Twitter.
The Gab page for Russia Today was similar to its Telegram channel. Both differ from the Twitter feed, which shares more high-quality, regular reporting. The mostly reliable content is intentional: it lends credibility to the disinformation when the outlet sandwiches it in with sound reporting.
The Disappearing Parler
Parler, an alternative social media platform promising absolute free speech, emerged under suspicious circumstances before serving as the medium for suspected Russian influence operations. Congress is currently investigating Parler, which disappeared after Amazon Web Services cut the platform’s hosting services. Parler re-appeared shortly after on servers that routed through the Russian Federation.
Now hosted by a company called SkySilk, Parler has struggled back onto the scene with a partially operational platform. A new analysis from NewsGuard and PeakMetrics illustrates the polluted landscape in which growing numbers of Americans were choosing to spend time. The report states that 87% of news posted in the week prior to the attack on the Capitol linked to websites known for publishing mis- or disinformation.
Parler users shared the website American Conservatives Today, a site run out of Macedonia, nearly 3,000 times in that single week. The website sprang into existence just one month before the attack on the Capitol. Another website, Resist the Mainstream, had an Austin, TX address through ParcelPlus, a virtual mail-forwarding service. The site runs out of a town called Veles in North Macedonia but looks like many far-right outlets based in the United States. That may mean readers remain unaware of the source of the content. Twitter and Facebook have cracked down on foreign click-farms that monetize hate, but Parler has not.
A full 22% of Resist the Mainstream’s traffic comes from Parler. That these sites rose to such prominence illustrates the danger still lurking in Parler’s design. The Stanford Internet Observatory found apps automated content posting for a significant number of the most active user accounts. Entire networks of fake accounts led users to scams off-site.
In its statement addressing the elephant in the room, that it has chosen to re-platform the website that mass-live-streamed the Capitol insurrection, SkySilk said the company does not support hate speech but argued that “big tech” has too much power over our lives. It’s unclear how re-platforming Parler addresses the disproportionate power of “big tech.” SkySilk and Parler have both politely dismissed concerns and assert that the needed changes have taken place. The platform has dramatically increased the number of moderators, but the Stanford study showed moderators often shared and amplified dangerous misinformation and conspiracy theories surrounding Covid.
The most visible change has been the firing of CEO John Matze. Given his comments indicating that the falling-out had to do with his wanting to moderate more extreme content, a hefty dose of skepticism about the nature of the changes seems wise: “I advocated for more product stability and what I believe is a more effective approach to content moderation.” Fox News obtained a memo circulated to Parler employees in which Matze wrote, “The Parler board controlled by Rebekah Mercer decided to immediately terminate my position as CEO of Parler. I did not participate in this decision.”
Few “guardrails” exist, and the site’s vulnerabilities, which mostly endure, could permit adversaries to collect “kompromat,” the Russian term for compromising material used as leverage. Even passive observation could better enable parties wishing to aid extremists without ever making contact: real-time intel on their target lets them craft ever more effective material. A strategic communication memo from the Department of Defense stresses engaging the audience and adapting to feedback, advising communicators to “anticipate, monitor, understand and quickly counter or exploit the influence efforts.” The guide testifies to the advantage our adversaries may gain.
Although the site says it doesn’t allow “terrorists, spam, unsolicited ads, pornography, threats to harm, blackmail, and content that glorifies violence against animals,” the lightning-fast rise of Macedonian click-farms shows that any changes to policy will make little difference if they are not enforced. Discussions surrounding Parler have largely stayed tethered to the issue of free speech, a right that is not, and has never been, absolute. Given the elevated terror threat, addressing the speech we’re defending seems more relevant than ever. We can ill afford to be distracted.
How Disinformation Threatens Us
The most noteworthy shift in Russia Today’s message is the intense focus on offering an alternative to censorship and recruiting the aggrieved to its Telegram channel. There the outlet posts frequently about censorship on social media.
Dozens of studies and analyses over several years have shown that platforms do not unfairly target conservatives. Right-leaning posts routinely make up a supermajority of the daily top-performing posts on Facebook. More right-leaning accounts than left were purged following Jan 6 but, critically, no evidence suggests it was because they leaned-right.
Still, conservative leadership persists in making unfounded claims. Intentional or not, these assertions are likely to be politically advantageous because they create a shared grievance, a phenomenon where smaller differences fall from focus and people unite against a larger threat.
The power leadership has over the attitude of followers is tremendous—sufficient to make people accept stances that conflict with their values. Masks are a prime example of the influence leadership has over public reaction. The opinions of others affect us much more than we like to believe, and views, much like a virus, can infect via super-spreaders.
Political elites, specifically Republicans, have increasingly used illiberal, intolerant rhetoric (illiberal in the autocratic-versus-democratic sense, not liberal-versus-conservative), as assessed by Varieties of Democracy (V-Dem), an international organization that monitors democracies around the world.
Political elites in the United States have almost certainly played a role in propelling Americans to violence in multiple instances. Extremist activity in the US has surged in the past several years, especially on the far-right.
Former Trump administration official Elizabeth Neumann has explicitly stated that political rhetoric has contributed to domestic extremism, both generally and in specific cases. Words like “invaders” or “criminals” appeal to an audience’s fears.
Nefarious actors can leverage that fear to influence them.
The skewed growth of far-right extremism isn’t because extremism is exclusive to the right. It’s not. To be clear, far-left and Jihadist extremists exist and pose a real threat; however, neither currently poses anywhere near the danger posed by violent far-right actors. Despite knowledge of the growing far-right threat, the Trump Administration focused on far-left and Jihadist extremism. This choice permitted the unchecked spread of far-right extremism.
Unfounded claims of censorship help extremists recruit by generating fear. Fear and uncertainty degrade our critical thinking, and those who feel persecuted or silenced become more vulnerable both to extremist recruitment and to manipulation by malign actors who wish to shape our opinions and choices.
That these mass influence methods work on Americans is indisputable, and there are multiple ways foreign disinformation can pose a threat to national security. One possibility has already happened, and the threat must be addressed.
When foreign disinformation benefits domestic leaders, it may disincentivize protecting the public from those operations. In July of 2020, DHS withheld a notice about disinformation. No reason was given and the disinformation disadvantaged then-presidential candidate Joseph Biden.
Any time foreign actors are allowed to mislead the public, the public is at risk of being manipulated in unforeseeable ways. Nefarious actors won’t stop at the subjects our leaders prefer. In the right circumstances, this could pose an existential threat.
During the 2016 election cycle, Russian-linked Facebook pages planned anti-Muslim and pro-Muslim rallies at the same location and time in Houston, likely in the hope of sparking violence. Moving Americans to act in public is something the Kremlin already knows how to do.
Thus, allowing disinformation to take root in the country is to aid and abet an adversary in an attack on the US. At the same time, it further erodes the public’s grasp on reality and ability to discern what is true.
A testament to how susceptible Americans are to strategic messaging is the widespread misapprehension that Antifa or Black Lives Matter were closely linked or largely responsible for riots over the summer.
The above images are a sampling of fake Antifa social media posts detected by the Media Manipulation Lab, a project from the Harvard Kennedy Shorenstein Center. These images were a drop in the sea of unfounded claims and disinformation over the summer.
Facebook confirmed for Reuters that a white nationalist group was behind the viral spread of fake screenshots attributed to Antifa members in 2020. Over and over we see the same behavior; over and over, public opinion shows us why. It works.
Disinformation from domestic sources claimed George Soros was paying to send caravans to the US border. The Monday after Rep. Matt Gaetz (R-FL) misled via Tweet on this subject, a pipe bomb was delivered to Mr. Soros’ house.
The response to the antisemitic rhetoric did not stop there. One man opened fire on a Pittsburgh synagogue, the Tree of Life, killing eleven people where they worshiped.
Even though elected officials knew that Mr. Soros had not funded the caravans, nor had anyone else for that matter, they misled people and allowed them to believe lies.
Those lies cost lives. Words matter.
Much in the way the threat from Antifa was engineered through media manipulation, so too have the claims of censorship been skillfully engineered. An example of how this works can be seen here:
- Twitter Banned Project Veritas. Then, Influencers Manipulated Our Perception to Make Us Feel Censored.
In another instance, the largest Facebook page attributed to “Black Lives Matter” had almost 700,000 followers, more than twice as many as the official Black Lives Matter page. The fake page made at least $100,000, though the person behind it was not based in the US or affiliated with the real Black Lives Matter.
The Institute for Strategic Dialogue examined Black Lives Matter and fake activity online. The report found:
- “The dominant disinformation narrative regarding BLM seeks to portray the ongoing demonstrations around racial injustice in the US as a predominantly violent protest movement.
- A common characteristic in this disinformation conflates the movement with Antifa, the leftwing anti-fascist movement, using the terms interchangeably to describe recent protests and protesters. Further to this, language describing protesters as “rioters,” “terrorists” and “thugs” is common.
- The leading voice in pushing this narrative is Andy Ngô, Editor-at-large with the rightwing news site Post Millennial. Of the 50 most-widespread posts containing BLM disinformation, Ngô’s tweets feature 21 times.
- Frequent attempts were also made by leading right-wing voices to use the merging of BLM and Antifa to target Joe Biden by claiming he has failed to denounce violence at racial injustice protests.
- Disinformation targeting BLM spiked after 23 September, the day the grand jury announced its decision to bring no charges against Louisville police over Breonna Taylor’s death.”
In the minds of those whose reality had been shaped by disinformation, the threat of violence from anarchists like Antifa feels real. They feel silenced, as if their way of life may be taken from them. If one believes that Antifa rioted throughout the country over the summer, that conservatives have been silenced, and the election stolen, as was portrayed in these influence operations, the events on January 6th become more coherent.
For that reason, it is all the more critical that those who deceived the public be held responsible, even more than those manipulated to act that day. But for propaganda, it would never have happened.
The Threat Is Not Hypothetical
An old playbook, one authored by the United States, roughly outlines many of the Kremlin’s actions thus far. Edward Bernays’ book Propaganda formed the basis for the US influence operations “playbook” used in Guatemala. The effort in Guatemala was successful.
We helped overthrow the democratically elected leader of Guatemala. The damage done to the region, compounded by the dozens of other times the US has interfered in Latin America, still forces people to flee toward the United States.
A summarized quote from Bernays succinctly illustrates the methodology:
“If you want a man to buy a piano, don’t tell him to buy a piano. Instead, implant the idea that it is fashionable to have a music room among the general public. Then, buying a piano will occur to the man ‘as if it is his own idea.’”
—Snow & Snow, 2020
Bernays’ theory of influence did not work by directly convincing people. Instead, Bernays advised creating the circumstances that led the target to make the desired choice. The last step in the US playbook used in Guatemala was for the US to offer help to the side it favored.
That may be the final step for other parties following the playbook too.
Bernays’ book influenced the Nazi propaganda crafted to demonize the Jewish people during World War II. It’s a testament to the lethality of this psychological weapon. Disinformation has the power to make normal people capable of the unthinkable. Perhaps more troubling is what it will make us overlook.
The key insight of Bernays was his understanding that manipulating the public was not the same as convincing many individuals at the same time.
“If you can influence the leaders, either with or without their conscious cooperation, you automatically influence the group which they sway,” Bernays wrote in Propaganda (1928).
If we assume Bernays is correct, then the best approach to break down a country would be from the top down. Ensure leaders are divisive, and the public will divide in time.
Two Steps Behind A Ten Step Lead
We were two years behind the influence operation that cast doubt on our 2016 election. Congress alerted the public to the Russian efforts to sow discord and undermine US elections in 2017, nearly three years after the first known efforts.
The effect of this systematic attack has been visible for years now. Disinformation leaves people unable to tell what is true. Civil society and democracy require a shared reality. Without that, we cannot reach a compromise or understanding. Society breaks down.
These efforts ultimately constitute an assault on the US, although disinformation traditionally exists below the threshold for war. RT may simply wish to continue dividing Americans. Still, we should not discount the possibility that the Kremlin is presenting itself as sympathetic in a bid to ally itself with the aggrieved against a shared enemy: other Americans.
A report from Free Russia containing translated KGB documents reads: “Psywar is planned comprehensively, to the entire depth of the operative construction of the enemy’s opposition group.” If we understand “active measures,” the Russian term for political warfare, as intentional, then the shift in messaging may indicate a specific aim. Americans who see an adversary as preferable to other Americans constitute a grave vulnerability.
In 2018, men attended a Trump rally in shirts that read “I’d rather be Russian than a Democrat.” That should have been a wake-up call, but it wasn’t. We now have Americans who wish to overthrow the government and one who stole the Speaker of the House’s computer, intending to give it to Russia. That this is the same country that has been flooding the US information spaces seems unlikely to be coincidental.
A Kremlin-directed Facebook operation tried to instigate violence in the US years ago. The people who arrived at the intentionally scheduled opposing protests, held at the same time and location, remained unaware of the organizer’s identity until afterward.
Incentives already exist for political figures via dark money, so it’s not inconceivable that civilians could be motivated similarly. We have evidence that Russian money may have flowed through the NRA to the US, but that is hardly the lone instance. Russia has over $1 trillion in dark assets around the globe and frequently uses oligarchs to do its bidding.
A recent analysis found that the Kremlin’s dark money poses a “serious national security threat to the United States because this money can be exploited and steered by the Kremlin for espionage, terrorism, industrial espionage, bribery, political manipulation, disinformation, and many other nefarious purposes.”
Money travels from Russia to the Cayman Islands and then on to the United States, mostly through the state of Delaware, and to the United Kingdom. Both the US and the UK have suffered major assaults on the democratic process that required inside assistance.
Most worrisome of all, high-ranking elected officials have repeatedly furthered foreign disinformation, signaling either a willingness to aid it or a disregard for the threat it poses to national security. Former President Trump and members of Congress have undermined the intelligence community concerning Russia. This downplaying has limited the political will to address the danger and left the public largely unaware of its vulnerability.
Our plight did not begin in 2016 or even with Operation Infektion in the 1980s. The narratives that now course through extremist communities in the US are not new. They come from disinformation likely authored around 1905 by Mathieu Golovinski, a Russian aristocrat who worked in the original czarist influence operation.
Golovinski wrote “The Protocols of the Elders of Zion,” which featured prominently in Hitler’s demonization of the Jewish people. Today, the document’s antisemitic themes can be seen in QAnon and in American conspiratorial lore featuring George Soros.
Disinformation and the atrocities it can bring are immortal, and we would do well to remember that. The recent shifts from RT give the Kremlin more direct access to receptive Americans.
The question is: to what end?
Subscribe | Twitter | Facebook | Hoaxlines | NOVEL SCIENCE