The Pipeline

How Russian propaganda reaches and influences the U.S. 

This is how a Russian disinformation campaign starts.

A video of a whistleblower with an unbelievable story to tell.

The claims are wild. None of them are true.

Some of them never gain traction.

But others hit the mark.

The videos have reached enormous American audiences, top political influencers and even members of Congress.


By Brandy Zadrozny 
Oct. 16, 2024

The fake whistleblower videos started popping up last fall, the work of a small but prolific Russian group that researchers call Storm-1516.

Much remains unknown about Storm-1516 — one prong of Russia’s propaganda operation — but it has produced some of the country’s most far-reaching and influential disinformation. 

The Storm-1516 campaigns rely on faked primary sources — audio, video, photos, documents — presented as evidence of the claims’ veracity. They are then laundered through international news sources and influencers to reach their ultimate target: a mainstream Western audience.  

Screenshots from various disinformation stories shared on social media

At least 50 false narratives have been launched this way since last fall, according to a count NBC News assembled with researchers. The narratives aim to diminish Western support for military aid to Ukraine following Russia’s invasion, a contentious issue in Congress. The videos also back the re-election of Donald Trump, who has pledged to halt military aid to Ukraine, while painting the former president as a victim of a “deep state.” And they attack Vice President Kamala Harris.

In one fake video, a Ukrainian troll farm operative reveals the machinations behind a CIA plot to defeat Trump.

In another piece of propaganda, a woman says she was paralyzed as a child — by Harris in a hit-and-run accident.

In a third, a “whistleblower” falsely claims Ukraine’s leaders spent U.S. aid money on yachts.

The claims peddled by the actors and false primary sources in these videos fall apart upon basic inspection. Experts quickly identified Olesya, the Ukrainian troll, as an AI-generated fake. The 2011 hit-and-run never happened, according to San Francisco police, and KBSF-TV, the local news station behind the claims, does not exist. 

And the companies that own the yachts told reporters they hadn’t been sold. But the story spread anyway, and this time, far — from small-time right-wing influencers to members of Congress, including Sen. JD Vance of Ohio, now the Republican vice presidential nominee.

Vance did not respond to a request for comment.

“It is very purposeful, very systematic, and it's a process that we see over and over,” said Darren Linvill, co-director at Clemson University’s Media Forensics Hub, whose team uncovered the Storm-1516 campaign last fall. 

Russia is the most active foreign threat to the 2024 election, according to U.S. officials. A pair of indictments unsealed in September alleged an expansion of Russia’s influence operation: from crude, fake social media posts and bot networks in 2016 to more ambitious and successful recent campaigns. That includes laundering propaganda through seemingly independent U.S. actors, according to one of the indictments. It was a scheme that entangled several popular right-wing creators, including Benny Johnson, Dave Rubin and Tim Pool, who all said last month they had been unaware their employer was working with Russia.

“It’s a bigger threat than it ever was,” U.S. Attorney General Merrick Garland said of Russia's influence operation at a recent news conference, where federal prosecutors unsealed indictments regarding two alleged state-sponsored propaganda schemes. “Russia has meddled in our society and tried to sow discord for decades. Really what we're seeing is just more tools in the toolbox.”


Russia’s Ministry of Foreign Affairs did not respond to a request for comment.

Experts fear Storm-1516 and similar groups have the potential to sway public opinion by undermining the credibility of democratic institutions, influencing U.S. policy and diminishing people’s ability to distinguish fact from fiction. 

Russia is increasingly leaning on an advancing technology landscape — including artificial intelligence and more sophisticated bot networks — to spread its falsehoods. But Storm-1516’s most valuable quality is persistence, according to Clint Watts, the general manager of the Microsoft Threat Analysis Center. 

“The danger comes with just the sheer volume of claims that they make, and it only takes one person, one influencer with outsized influence, to grab onto a video and amplify it in the United States,” Watts said. “They may miss 99 times, but the 100th time they may get it just right.”

The dozens of Storm-1516 storylines examined by NBC News offer a window into the disinformation pipeline: where the most successful false claims originate and how they spread. 

Stage 1: The video

In February, a Russian man falsely confesses on YouTube that Ukraine promised him $4,000 to assassinate Tucker Carlson while he was in Moscow. The man says he was caught before he could detonate a bomb in the underground parking lot of the Four Seasons Hotel there.

Stage 2: The placement

Like a criminal gang’s ill-gotten cash, this video — and its manufactured evidence — is laundered through Kremlin-linked platforms and websites in a quest for legitimacy. First in Russian, then in English.

Stage 3: The spread

Russian propagandists pick up the story. The anonymous account MyLordBebo, known to researchers as a spreader of false stories about Ukraine, first posts the video to X, with English subtitles. Then, a QAnon influencer, KanekoaTheGreat, shares it. The videos go viral.

Stage 4: The mainstream

Simon Ateba, the Today News Africa reporter known for disrupting White House press briefings, posts the video. Right-wing creators and pundits leap on the story, using it to ask whether U.S. funding for the war in Ukraine is being weaponized to kill an American citizen. 

Turning Point USA President Charlie Kirk, podcaster Tim Pool, conservative commentator Benny Johnson and comedian Jimmy Dore share the false story with their millions of U.S. followers.

Johnson did not return requests for comment. Similar requests to those who spread the false assassination attempt story went mostly unanswered. Those who responded defended their posts and videos. 

“We reported it as potentially being a hoax,” Dore said. 

Andrew Kolvet, a representative for Kirk and a producer of his podcast, said Kirk noted at the time that it wasn’t possible to verify the claim. 

"Frankly we’re glad to know that Ukraine wasn’t orchestrating an assassination attempt against Tucker,” Kolvet said of the former Fox News host. “That’s a good thing."

Storm-1516’s video production team likely operates out of an office in St. Petersburg and appears to recruit actors from diaspora communities there, researchers at Microsoft said. Based on an analysis of methods and personnel, the researchers believe the group is in part a vestige of the Internet Research Agency, a disinformation factory founded by Yevgeny Prigozhin that meddled in the 2016 U.S. presidential election. Prigozhin, a onetime ally of Russian President Vladimir Putin, led a quickly quashed rebellion against the Russian military in June 2023 and died months later in a plane crash. 

Storm-1516 is loosely tied to the Kremlin by people, products and tactics; Microsoft researchers believe it’s directed by the Center for Geopolitical Expertise, an anti-liberal think tank that, according to Estonian intelligence, organized press tours of Ukraine for Western pro-Putin propagandists. The Foundation for Battling Injustice, a former Prigozhin propaganda operation that imitates a human rights organization, has amplified Storm-1516’s fake videos, researchers say.

Other groups have similar goals but different methods. Storm-1099 is known for its “Doppelganger” operation, which uses fake news websites — dozens of which were recently seized by the Justice Department — and a bot network to push disinformation. Storm-1679 trades in feature-length films that mimic American documentaries and political thrillers, including one about the Paris Olympics.

Storm-1516’s cheap videos echo Cold War-era propaganda techniques. The most memorable may be the KGB-designed “Operation Denver,” which concocted and spread the false conspiracy theory that the AIDS virus had been engineered by the Pentagon. 

A newspaper clipping claiming "AIDS may invade India"

A known launderer of KGB disinformation, the Indian newspaper Patriot seeds a lie about AIDS on July 17, 1983. (Archive.org)


That campaign began with a letter from an anonymous but “well-known” scientist with insider information published in 1983 in the Patriot, a pro-Soviet Indian newspaper.  

In 2024, Russia’s strategies have evolved, with the creation of more legitimate-looking fake news websites, more sophisticated bot networks and the increasing use of AI. Some of Russia’s disinformation projects are professional productions involving paid actors, while others are slick documentaries with AI-fabricated celebrity hosts. Some target Russian citizens and others the outside world. 

The Storm-1516 videos initially relied on real people, like a Cameroonian woman in St. Petersburg who, journalists revealed, had posed as a Cartier intern in an October 2023 viral TikTok video falsely smearing Olena Zelenska, the first lady of Ukraine.

In the last few months, the videos have seemingly leaned on AI, hiding the identity of the subject to further thwart fact-checking efforts. 

Hany Farid, a professor of digital forensics at the University of California, Berkeley, identified evidence of AI manipulation in a handful of recent Storm-1516 videos, including one from July of a man posing as a luxury car salesman to falsely claim to have sold Zelenska a 4.5 million euro Bugatti sports car.

Blurry faces, disappearing teeth and tongues, discrepancies between words and mouth shape, and videos in which a subject’s body remains uncomfortably still can all be giveaways, Farid said. Some of the fakes he called “shockingly bad.”

“Could they make better videos? Sure,” Farid said. But “these videos basically work. You don't have to make Hollywood-style fake videos to get people to start doubting everything.” 

One of Storm-1516’s most recent bad fakes showed a purported park ranger describing the illegal killing of an endangered black rhino. Flanked by a Zambian flag and a picture of giraffes, the man says that last year, he accompanied a diplomatic mission on a safari. To his amazement, he says, a woman shot and killed a young rhino named Casuba. “A female American politician. Her name was Kamala.”

The video lacked evidence, but it spread anyway: through Russian Telegram, into English via a Zimbabwean news website, then to “The Intel Drop” and the usual verified propagandists, who carried it to their hundreds of thousands of followers on X.

The story seemed to stop there. Some posts even acknowledged the video may have strained credulity: “If this is real, she's in a world of worry,” wrote Chay Bowes, an Irish commentator and contributor to Russian state media network RT, in a since-deleted post. 

No matter — maybe the next one would hit.