At 1:29 a.m. March 26, the Dali, a large container ship, struck the Francis Scott Key Bridge in Baltimore, causing its collapse. 

Within hours — as many Americans slept — misinformers on X and other platforms posted wild theories, unsubstantiated claims and speculation about who was to blame for the catastrophe.

Without evidence, misinformers coalesced around the idea that the bridge collapsed because of a coordinated attack. PolitiFact repeatedly saw social media users falsely assign blame to two nations: Israel and Ukraine. 

“If you’re pro-Russia and anti-Ukraine, then it was a Ukrainian attack,” said Mike Rothschild, a journalist who has written books about conspiracy theories.

As of April 1, there had been no credible reports or evidence that the ship’s collision with the bridge was linked to terrorism or an attack.

Nevertheless, the invented narratives proliferated — often customized to suit individual posters’ preexisting beliefs and brands, researchers said.

Sara Aniano, a disinformation analyst at the Anti-Defamation League’s Center on Extremism, said these claims often came from people who make it their literal business to spread conspiracy theories. 

“A content creator who does makeup tutorials is not much different than the content creator who is selling conspiracy theories,” Aniano said. “These theories and these events are the equivalent of their products.”

We found that X subscribers paying for blue check marks that guarantee greater reach from the platform’s algorithm were responsible for nearly all of the most popular posts linking Israel or Ukraine to the bridge collapse. 

X promotes subscribers’ posts even if they contain unverified or false information. The platform also shares ad revenue with “blue check” subscribers, letting them earn profit when people interact with their posts. 

On X, unverified and false claims ran rampant, even as verified information emerged

Misinformation experts said bridge collapse conspiracy theories were widespread and also successfully reached a larger, more mainstream audience on X.

PolitiFact used advanced searches on X to analyze more than 100 posts and create a timeline of the anti-Ukraine and anti-Israel narratives that emerged immediately following the incident. Misinformation experts also shared some examples with us.
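(For readers who want to reproduce this kind of review, the sketch below shows one way to sort manually collected posts into a chronological, keyword-filtered list. It is illustrative only: the file name, column names and keyword list are hypothetical and do not describe PolitiFact’s actual tooling or X’s search interface.)

```python
# Hypothetical sketch: building a keyword-filtered timeline from posts
# collected by hand. The CSV name and columns ("timestamp", "account",
# "text") are assumptions for illustration, not PolitiFact's workflow.
import csv
from datetime import datetime

KEYWORDS = ("ukraine", "israel")  # narratives tracked in this report

def build_timeline(path: str) -> list[dict]:
    """Return keyword-matching posts ordered by time of posting."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = [
            row
            for row in csv.DictReader(f)
            if any(k in row["text"].lower() for k in KEYWORDS)
        ]
    # Assumes ISO-8601 timestamps, e.g. "2024-03-26T03:02:00".
    rows.sort(key=lambda r: datetime.fromisoformat(r["timestamp"]))
    return rows

if __name__ == "__main__":
    for post in build_timeline("collected_posts.csv"):
        print(f'{post["timestamp"]}  @{post["account"]}: {post["text"][:80]}')
```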

Here’s our timeline of the day’s events and examples of anti-Ukraine and anti-Israel claims:

1:29 a.m. The Francis Scott Key Bridge collapsed.

3:02 a.m. The earliest posts mentioning Ukraine or Israel did not immediately assign blame, but brought both countries into the bridge collapse discussion.

“Collapse of a Bridge in Baltimore after being hit by a ship. US infrastructures emblematic of a declining/collapsing empire,” one paid X subscriber posted. “Money spent in endless wars, to finance Nazis in Ukraine and baby killers in Gaza rather than taking care of US citizens.”

3:22 a.m. A paid X user with 185,000 followers and a Russian flag emoji in its name asked, “Did Israel just hit the US over not using the Veto power yesterday?”

The U.S. on March 25 abstained from voting on a United Nations Security Council resolution calling for an immediate ceasefire between Israel and Hamas. The U.S. had previously vetoed three similar resolutions.

(Screenshots from X.)

4:07 a.m. A paid X subscriber with 24,000 followers posted, “Israel cancels its visit to Washington after the US allows the UN Gaza cease-fire resolution to pass and then the Francis Scott Key Bridge in Baltimore is attacked? This is not a coincidence, nor was it an accident!” 

7:21 a.m. Andrew Tate, a conservative internet personality who is facing rape, human trafficking and gang activity charges in Romania, said to his 9 million followers on X that the ship “was cyber-attacked,” before claiming that “foreign agents of the USA attack digital infrastructure.” We rated his claim False.

9:11 a.m. Alex Jones, a conservative radio host with 2.2 million X followers who is known for spreading conspiracy theories, reshared Tate’s post, adding that the incident “looked deliberate.”

(Screenshots from X.)

9:17 a.m. A paid X user whose account description includes “tweeting for Palestine” replied to Jones’s repost of Tate’s post, expressing doubt that the ship’s collision with the bridge was a coincidence.

9:53 a.m. Federal and Maryland state officials said during a press conference that no credible information suggested a terrorist attack caused the collapse.

9:57 a.m. “Our supposed ‘friends’ from Ukraine are enjoying the news of 20 Americans missing after the collapse of the Francis Scott Key Bridge in Baltimore,” one blue check X subscriber posted. “They are claiming it’s punishment for not sending them more billions of our tax dollars.” 

9:59 a.m. A blue check X account with nearly 56,000 followers devoted to opposing Arizona Gov. Katie Hobbs, a Democrat, said “it’s not plausible” that the bridge collapse was accidental “during an election season and in the middle of two theaters of combat in Ukraine and Israel.”

10 a.m. An X subscriber falsely claimed the container ship’s captain was Ukrainian. 

“Here is information circulating regarding the container ship that hit the Francis Scott Key Bridge and its alleged drivers,” read the post. “One is reportedly from Ukraine. The information does not account for any remote operating. Developing.”

Posts shared at 10:11 a.m. and 10:24 a.m. used nearly identical language. 

(Screenshots from X.)

The vessel was crewed by 22 Indian nationals, according to the ship’s management company.

10:24 a.m. A blue check X subscriber whose account features hallmarks of the discredited QAnon conspiracy theory movement posted that something seemed “very off” about the collapse because of a “vessel operator” with “ties to Ukraine.” 

11:31 a.m. “The captain of the ship that hit the bridge in Baltimore is Ukrainian,” wrote one blue check subscriber with 362,500 followers.

12:46 p.m. President Joe Biden said the incident was “a terrible accident,” adding that there was no indication it was caused by “any intentional act.” 

1:10 p.m. An X subscriber account whose name includes Nigerian and Russian flags posted that the Dali’s captain was “a citizen of Ukraine.”

We found three more posts echoing the false Ukrainian captain theory before 3 p.m.

(Screenshot from X.)

2:51 p.m. DC Draino, a blue check X subscriber who frequently shares misinformation with approximately 1.4 million followers, amplified claims that the collapse was an attack and questioned who was to blame: “Iran for our support of Israel? Russia for Biden’s support of Ukraine? China … just because?”

3:54 p.m. The news website Voice of Europe, which has 182,500 followers on X, posted that the captain “may be a citizen of Ukraine.” 

Less than 12 hours later, on March 27, the Czech Foreign Ministry announced that it had sanctioned the leader of Voice of Europe for using the site to spread anti-Ukrainian disinformation. As of April 1, Voice of Europe’s website had been taken down. The site’s X account — which has a gold X verification badge signaling that it is “an official organization on X” — temporarily stopped posting.

(Screenshot from X.)

6:45 p.m. “The Baltimore bridge terror attack stems from the United States didn’t veto a U.N. Resolution on the Gaza ceasefire,” wrote a blue check subscriber whose bio includes a Russian flag before the words “defeat NATO.” “And the U.S. didn’t send Ukraine the $60 billion.”

False claims linked to pro-Russia and paid “MAGA” and QAnon accounts

Some posts sharing anti-Ukraine or anti-Israel sentiment came from accounts that declared support for the conservative “MAGA” movement or used language linked to the QAnon conspiracy theory.

Pro-Russia accounts also promoted these narratives. 

The 3:22 a.m. X post that questioned whether Israel “hit” the U.S. came from a user NewsGuard analyst Coalter Palmer described as a “notorious purveyor of misinformation related to the Russia-Ukraine war.” Palmer pointed to two other X posts in which that user falsely claimed the Bucha massacre was a false flag operation and that Ukraine is a “Nazi state.”

Memetica, a digital investigations company that studies disinformation and violent extremism, found that the false Ukrainian ship captain claim was pushed by pro-Russia accounts and QAnon conspiracy theory promoters, said Adi Cohen, the company’s chief operating officer.

Looking at a sample of X posts from 10 a.m. to 5 p.m. March 26, Cohen said Memetica found that 7% of the accounts sharing that narrative had zero followers, suggesting what researchers call “inauthentic amplification” — accounts created solely to boost the narrative.

Cohen said that the Ukrainian captain claim was promoted by a known element of the Russian disinformation ecosystem, SouthFront, a website the State Department described in a 2020 report as “a multilingual online disinformation site registered in Russia.” 

Where were misinformers most successful and why? 

Most researchers identified Telegram, 4chan and X as the places where this misinformation flourished most, crediting those platforms’ permissive policies about what can be posted and X’s reputation as the go-to platform to discuss breaking news events.

It’s hard to definitively say where misinformation was worst, because not every platform shares the same data or is easily searchable, experts said. 

Conspiratorial content might have been more contained to fringe platforms once, but such theories are now widespread on platforms including X and TikTok, said the ADL’s Aniano. 

Memetica analysts observed conspiratorial content about the bridge collapse right away on all social media platforms, but especially X, Telegram and TikTok, Cohen said.

Misinformers can use events like the bridge collapse as “another plot point in their broader narrative that the mainstream media is not to be trusted, that our government is not to be trusted, that experts like us are not to be trusted, and that there is always an active attack against America happening,” Aniano said. 

In this image taken from video released by the National Transportation Safety Board, the cargo ship Dali is stuck under part of the structure of the Francis Scott Key Bridge after the ship hit the bridge, March 26, 2024, in Baltimore. (AP)

Once misinformers seize on an event, experts said, they often assign blame to entities — people, groups, countries — that have also been in recent news headlines.

“Given that the conflicts in the Middle East and Ukraine are ongoing and continued funding to support these various efforts remains a major wedge issue in the United States, it makes sense that they would become fodder for conspiracies and false claims,” said Valerie Wirtschafter, a Brookings Institution fellow in foreign policy and the artificial intelligence and emerging technology initiative. 

Wirtschafter said she suspects this “will likely continue to be the way that these types of narratives take shape — by leveraging prominent and polarizing political topics in times of uncertainty and incomplete information.”

PolitiFact Researcher Caryn Baird contributed to this report.

RELATED: Edited Wikipedia entry doesn’t prove Israel caused the Baltimore bridge collapse

RELATED: No, the captain of the container ship that hit the bridge in Baltimore wasn’t Ukrainian

RELATED: Baltimore bridge collapse: A cyberattack, a movie and other false claims about the ship accident

