
How Misinformation Spreads through Conflict

Three experts break down how misinformation and propaganda spread through conflict and how to debunk it yourself.


[CLIP: MSNBC NEWS: “Disinformation and Propaganda.”]

[INTRO MUSIC]

[CLIP: CNN NEWS: “Viral videos about this war that are having huge impact. But they’re completely fake. But they’re having dire consequences.”]




[CLIP: NBC NEWS: “Dozens of accounts on X, formerly known as Twitter, spreading rather what’s believed to be coordinated posts with disinformation about the war.”]

Sophie Bushwick: When any major news event happens, a lot of us have the same impulse ...

Tulika Bose: We go to social media to follow the latest updates.

Bushwick: But that can sometimes backfire.

Bose: Right now we’re seeing viral misinformation everywhere—especially during the current Israel-Hamas conflict.

[CLIP: MSNBC NEWS: “Spreading fast, influencing opinion and making it difficult for anyone who uses social media to decipher what’s really happening on the ground, in the Middle East.”]

Bose: So we asked the experts why this is happening and what you can do to avoid being taken in.

Bushwick: I’m Sophie Bushwick, tech editor at Scientific American.

Bose: I’m Tulika Bose, senior multimedia editor. And this is Science, Quickly.

[CLIP: Intro music]

Bushwick: On social media platforms, some footage and photos that have been attributed to the Israel-Hamas conflict are actually fake, mislabeled or both.

Bose: Think video game footage passed off as a real missile attack from Israel or parachute jumpers in Egypt mischaracterized as a Hamas attack.

[CLIP: Sound from R3 video]

Bushwick: Fake content like this has been viewed millions of times. 

Bose: Propaganda during wartime isn’t new. But online misinformation, spread by social media influencers, seems particularly bad around this conflict. 


Shayan Sardarizadeh: In terms of what I’m seeing, it’s definitely comparable and similar to what I was seeing in the first few weeks of the war in Ukraine, after Russia invaded Ukraine. And by that, I mean most of the misinformation ... has been [from] ordinary people, regular users on the Internet, who are trying to do what is known as engagement farming.

Bose: That’s Shayan Sardarizadeh, who works on the BBC Verify team. His job is to take viral photos and videos about the conflict and investigate them. He’s found something really disturbing about all of this.

Sardarizadeh: It’s probably one of the worst examples of misinformation that you use the pain and suffering of people, genuine people, civilians on the ground while being impacted by this conflict, to basically farm engagement and build up your influence online. In some cases, I’ve seen TikTokers who are, you know, claiming to be, to be sharing live streams on the ground from either Israel or Gaza and, you know, the live stream has got two, three million viewers.

Bushwick: That’s because people are more likely to share something that makes them emotionally engaged ...

Bose: Even if those emotions are negative.

Sardarizadeh: And also, most of the time you’re posting stuff that is a bit shocking, a bit, sort of controversial, that it will get engagement. And then you will be able to build up your influence. You, you’ll gain followers and, you know—if you’re operating on [a] platform that basically pays your money for, for high engagement like YouTube, like TikTok, right, Twitter or X, as it’s known these days. 

Bushwick: These incentives Shayan talks about are baked into social media. Platforms are designed to keep people on the site as long as possible. And as part of that, they reward individual accounts for earning engagement from other users. So that creates a motivation for unscrupulous influencers to post whatever will get them the most attention.

Bose: Sometimes those attention-getting posts make false accusations that real people are the ones making things up. In one example Shayan found, a far-right Indian influencer claimed that Palestinian refugees in a bombed refugee camp were actually so-called crisis actors.

Bushwick: He falsely stated they were staging their grief for the cameras.

Bose: Yeah. Shayan personally verified that the video in question featured a man who had lost three of his children.

Bushwick: Accusing people of being so-called crisis actors can also happen with mislabeled footage. For instance, Snopes debunked a video on TikTok that claimed to show an Israeli crisis actor pretending to be a recent victim of Hamas. The video does show an actor being positioned on the ground as if he’s injured, but it’s completely unrelated to the current conflict—it’s from an April 2022 film shoot.

Bose: Shayan pointed out that a lot of these attention-grabbing accounts are falsely passing themselves off as journalists or open-source intelligence ...

Bushwick: Aka OSINT experts ...

Bose: Which distracts from the true citizen journalists and data analysts.

[CLIP: “Hello, everyone. This is Bisan from Gaza. More than 50,000 people to 60,000 people are evacuating to Al-Shifa Hospital and still evacuating every day. People are eating, sleeping, living here.”]

Sardarizadeh: Tons and tons of videos that, that news outlets have shared ... has come from people on the ground, either in Israel or Gaza, using their smartphones to record and document what’s going on.

[CLIP: Plestia: “They bombed really close to my house, that’s my window right now, that’s the view... (gasps)”]

Sardarizadeh: And then somebody or the other major newsroom, you know, somebody like me or my colleagues in my team ... has sat down, looked at that content, verified it and decided, okay, that’s good. We can use it as genuine content from Israel-Gaza.... And, you know, my work would have been, would have been impossible without that.

Bose: But here’s the thing: the combination of a lack of moderation on platforms, the rapid spread of misinformation during a conflict and the incentive of people to get more likes and views is creating this perfect storm. So, Sophie, you cover so much of this, especially in the tech space. What are your thoughts?

Bushwick: There’s this oft-misquoted phrase about truth and lies. I’ll give you the Terry Pratchett version: “Lies could run around the world before the truth could get its boots on.”

Bose: Oh wow. That’s prescient.

Bushwick: It’s very easy for anyone to post an unverified photo or story and have it go viral. But if you care about the truth and want to update that information with a fact-check, it takes much longer: you have to actually verify the facts and then chase down all the runaway versions of the lie, which might have spread beyond social media by the time you show up to debunk it.

Bose: And social platforms aren’t policing this, right?

Bushwick: Right. A lot of platforms did establish really strong moderation policies after the 2016 election, but more recently, a lot of them have laid off or reduced the trust and safety teams in charge of this, and that makes them much slower to respond to misinformation. This enables the most extreme information out there to take off—even when it’s not true.

Bose: Yes. And I think one of the biggest issues we’ve seen is that viral misinformation, whether it starts on social media or somewhere else, also makes its way up to news outlets.

Bushwick: Absolutely, and that sometimes happens when outlets report on a narrative without mentioning that the source of their information could be biased. For instance, publications like the New York Times, CNN and even the BBC have reported stories that were based on claims from sources that had their own biases and agendas. And then the outlets have had to walk back their reporting when a fuller picture emerged from additional sources of information.

Bose: Right, because of that need to be first, right?

Bushwick: Right, yes, instead of trying to get a fuller picture before they publish.

Bose: Exactly. One journalist who’s been really active on X, formerly Twitter, in asking outlets and reporters to stop sharing viral misinfo is Los Angeles Times investigative reporter Adam Elmahrek.

Adam Elmahrek: I don’t think in the history of, at least in my career—I have not seen misinformation spread so wildly about this war.... Unfortunately, the real problem, from my vantage point as a reporter of the mainstream media, is that stuff percolates up to the mainstream media.

Bushwick: For what it’s worth, I’m incredibly indebted to our own in-house fact-checkers — 

Bose: — Thank you, thank you!

Bushwick: — who help catch mistakes before we embarrass ourselves.

Bose: We love them. Copy editors are heroes, especially in times like this. But what are some examples of media sharing misinformation?

Bushwick: One case is an explosion at al-Ahli Arab Hospital in Gaza City. Shayan and other investigators — they’re actually still examining the visual evidence to figure out who was responsible.

Sardarizadeh: Every day my work has been since that, that night, going through all the social media for the CCTV footage, images of the blast site, the crater, the damage to, to the hospital building, the car park, the surroundings, you know, the people who were injured, you know, going through all of those videos and talking to experts and talking to, talking to people, our reporters on the ground, you know. We were lucky enough to have a couple of reporters. Not many news organizations currently have reporters on the ground in Gaza. We do and that is a huge help to our work.

Bose: Multiple outlets, such as the New York Times, reported that this strike came from Israel before they had evidence of that, and they later had to issue retractions. Reporters are still trying to figure out the true story of what happened.

Bose: In addition to the contested strike at al-Ahli Arab Hospital, Israel’s government has targeted Al-Shifa Hospital and has claimed that it was a command center for Hamas. Under the laws of war, hospitals lose special protection if they are used for “harmful” acts.

Bushwick: This raid, which killed a number of civilians, was roundly criticized by Doctors Without Borders, the United Nations and the World Health Organization.

Bose: Earlier this month the Israel Defense Forces, or IDF, released a video in which an IDF spokesperson claimed to present evidence that Hamas had held hostages in the basement of the al-Rantisi pediatric hospital. In it, the spokesperson described a document written in Arabic as, quote — 

[CLIP: “A guarding list where every terrorist writes his name and every terrorist has his own shift guarding the people that were here.”]

Bose: Soon after, CNN broadcast a report in which he made the same claim.

[CLIP: Daniel Hagari speaking in CNN video: “This is a guarding list. Every terrorist has his own shift.” CNN narration: “In this room, he says, a guard list that begins October 7th and ends November 3rd, not long before the hospital was evacuated.”]

Bushwick: It turned out to be a calendar with the days of the week since October 7 under the title “Al-Aqsa Flood Battle,” Hamas’s name for its surprise attack on Israel on that date. 

Bose: But there weren’t any names at all.

Bushwick: Contrary to what both the IDF video and the CNN broadcast had claimed.

Bose: And critics online were quick to point that out — especially because the calendar in Arabic was clearly visible.

Bushwick: Israel’s government has since called this a “translation error,” and the claims have been walked back. 

Bose: And — HuffPost reported that CNN quietly took out the clip about the calendar in subsequent broadcasts and material posted online. But — this whole situation had another effect.

Bushwick: It has undermined public trust and fed into people’s existing confirmation biases. 

Bose: Even if Hamas was holding hostages at this hospital, the misinformation, intentional or not, became a really big part of the story instead.

Bushwick: That’s what people focused on. 

Bushwick: Another example of a story that was widely reported before there was confirmation was pretty disturbing—so I’m going to recommend that sensitive listeners skip ahead by about five minutes. 

Bose: Here’s a clip from the White House on October 11, 2023.

[CLIP: Biden remarks from October 11: “I never really thought that I would see, and have confirmed, pictures of terrorists beheading children. I never thought I’d ever… anyway.”]

Bose: It turns out that this wasn’t true—and Adam had actually been trying to warn people about this.

Elmahrek: When I tweeted that, that skepticism thread, or maybe the day before, when I tweeted about the beheaded babies claim and said, “Wait, let’s figure out what’s really going on here. Let’s vet this a little bit more, more before we spread it.” President Biden went live on air and said that he had, he had seen and confirmed photos of beheaded—that Hamas terrorists were beheading babies. He had to walk that back; he had to retract that. And all, as all of this is happening in real time, I kept trying to warn my colleagues: vet this stuff, don’t take it at face value—question them on it, pressing for the evidence, you know, don’t just rebroadcast this claim without any skepticism, without, without saying that this has been vetted because this is going to have real dire consequences.... Unfortunately, a lot of media did not heed that warning and had to walk back claims, walk back the beheaded babies claim.

[CLIP: WHITE HOUSE PRESS CONFERENCE: “Just wondering if you could explain to us how the President came to say yesterday that he’d seen pictures of militants beheading children. Obviously it’s important to make sure that disinformation doesn’t get out there. How did he end up saying that?” “He was referring to images that many of you may have seen, many of your colleagues have reported on – and obviously Israeli officials have spoken to.” [media clamor] “But has the President actually seen the photos?”]

Bose: NBC reported that, quote, “beheaded babies,” unquote, had 44 million impressions on X alone within a day of the claim being made. 

Bushwick: And it’s a big problem when misinformation is being shared by trusted news outlets or government officials.

Bose: By the way, Biden drew criticism just recently for repeating this claim about beheaded babies.

Bushwick: Yes, and having official sources like Biden share misinformation like this helps lies run around the world even faster. Plus, it increases distrust among readers and listeners—people can end up discounting real news. That’s why it’s so important for journalists to fact-check their sources, especially if those sources have been known to share misleading information in the past. 

Bose: It’s better to get things right the first time, rather than making a mistake and then issuing a correction later.

Bushwick: Because even when there’s a correction, people never pay as much attention to it as they do to the original post. One of the experts we interviewed, Emily Bell, talked about this.

Emily Bell: My name is Emily Bell. I’m the director of the Tow Center for Digital Journalism at Columbia Journalism School in New York [City]. And I have been writing about and studying the Internet since the early 1990s, so that’s a very long time. Different stories and narratives get created and spread.... It’s really hard to do anything about them ... once they’re out there because, you know, you’re just sort of left as, like, one or two voices going, “But that’s not true.” And people [are] like, “Doesn’t matter whether it’s true—it’s, like, broadly true.”

Bose: Adam also talked about something called atrocity propaganda. It’s misinfo that’s more likely to be spread during the first days of a war or conflict—generally to inflame people’s emotions—and it turns out it has a long history.

Elmahrek: During the Gulf War, a girl testified in Congress and said that Iraqi soldiers had been taking babies from their incubators… scores of babies, removing them and leaving them out to die. This was as the U.S. was trying to drum up support for the war that—the first one, or the Gulf War in Iraq—that claim, that testimony later turned out to be a fabrication, and the girl was the daughter of the Kuwaiti ambassador to the U.S. So, you know, this stuff has a long and documented history.

Bushwick: But the history of atrocity propaganda is way older than the Gulf War. Adam also brought us an example from the Middle Ages—one that specifically targeted Jewish people.

Elmahrek: In the context of Jewish people, we call it a blood libel. Because, you know, one of the old—one of the oldest claims that goes back to the Middle Ages is that Jews are secretly drinking the blood of Gentile children. So this is another form of atrocity propaganda in order to demonize a certain community and motivate people to take violent action against them.

Bose: When we hear something terrible, like atrocity propaganda, we often share it without waiting to check whether it’s true.

Bushwick: The thing is, terrible things have happened! Israelis experienced horrible terror and violence on October 7. Palestinians have been dying in airstrikes and being displaced from their homes. It’s understandable that all of our emotions are high.

Bose: Absolutely. And misinformation is being used to further manipulate those emotions, in order to gain attention and profit, to serve political ends, or even to stoke more violence. 

Bushwick: Since the initial Hamas attack on October 7, the American-Arab Anti-Discrimination Committee says it has received hundreds of hate incident reports against Palestinian, Arab and Muslim Americans. Hearing about some of these incidents is heartbreaking.

[CLIP: A 71-year-old man who had been accused of fatally stabbing a six-year-old boy — and seriously injuring his mother — because of their Islamic faith and the Israel-Hamas war, has been charged with a hate crime.]

Bose: Antisemitic attacks are also on the rise. The Anti-Defamation League, or ADL, says there has been a steep increase in antisemitic incidents. Between October 7 and 23, the ADL says, there were nearly 200 incidents that were specifically linked to the current Israel-Hamas conflict. And these incidents aren’t limited to the U.S. On October 18 there was also a firebomb attack on a synagogue in Berlin.

Bushwick: That’s really awful.

Bose: Absolutely. And historically, misinformation can even be an inspiration for war. Think back to when the U.S. went to war in Iraq in 2003, saying we had to protect ourselves from weapons of mass destruction that we never actually found.

Bushwick: Hundreds of thousands of people died as a result of that conflict.

Bose: So yeah. The stakes are high.

Bushwick: But there are a few steps you can take to make sure you minimize the spread of misinformation.

Bell: There are two ways to approach receiving information, one of which the researchers would [call] “prebunking,” which is informing yourself about the situation whether or not there might be a disinformation campaign around it.

Bushwick: So, for instance, when you’re consuming news about this conflict, you should start from the perspective that there is a strong motivation for folks to spread misinformation, whether that’s propaganda from one group or another or people who only care about engagement farming. That way, you’ll be mentally prepared for any misinformation you encounter...

Bell: And in studies that they did at the Institute for the Study of Propaganda [Institute for Propaganda Analysis] at Columbia University back in, like, 1942—proved that actually, the most effective one was just ... telling people, like explaining why people are seeing things and why they have been ... described in a certain way.... There’s somebody who did a great study on teenagers and smoking and, like, saying to them, “It’s gonna kill you.” No effect, you know, [of] saying, “Don’t do that.” No effect. Saying, “So, you know, how tobacco companies advertise to you...,” that has a much bigger effect in discouraging teenagers from smoking; [it] is actually explaining to them what [the] marketing apparatus of Big Tobacco is.

Bose: And while you’re reading news, you can remember an acronym called SIFT, which was developed by Mike Caulfield, an expert on digital literacy at the University of Washington. It’s short for ...

Bushwick: S: Stop—wait for your initial emotional reaction to calm down before you do anything.

Bose: I: Investigate the source—try to figure out if the person or outlet reporting this news is legit.

Bushwick: F: Find better coverage—research who else is covering the same event and if they have a different take on the situation.

Bose: And T: Trace claims, quotes, and media back to their original context. Who provided the quotes or images included in the story, and do they have biases that might skew their perspective?

Bushwick: You can also investigate some content yourself.

Bose: Specifically for something like photos. Shayan had great advice for debunking false images.

Sardarizadeh: So for images, I encourage everybody and—I recommend people start using reverse image search and get themselves accustomed to it because it’s quick, it’s easy, it’s simple. You will see how much misinformation you can check for yourself. There’s a, there’s a new tool that’s been developed by Google called Google Lens. It’s a really good tool by the way. At no cost to you, completely free, will check that image for you ... and will give you an idea of other examples of that image being posted online and what the context behind it is.... It, it takes minutes. Or you can just do it and go to images.google.com and either copy the URL of that image on, on whichever social media platform you’re on or take a screen grab of it yourself. You could just do it, and go to images.google.com and just put the screen grab in there, and run reverse image search for yourself. 

Bose: I’m going to note that the average citizen can check images. But for video, it’s quite a bit harder.

Sardarizadeh: When it comes to video, I have to say [it] can get a bit more complex. You know, I have been involved in projects with my colleagues at BBC Verify where, you know, we’ve spent days, sometimes weeks, investigating videos.... But sometimes, particularly in the war zone, there are no quick and easy answers. And there’s quite a lot of nuance. It takes, you know, a team of professional journalists, experienced journalists, and not just on their own—also other experts, people who know about blasts, explosions, weapons.... You will need the opinions of a lot of people, need to consult quite a lot of people to find out what a very, very complex piece of video necessarily shows, so — 

Bushwick: It’s really important to make a habit of verifying information before you reshare it. But there’s one really faint silver lining to the current misinformation environment.

Bose: Are you talking about generative AI, by chance?

Bushwick: You got it. When the conflict started, I worried that people were gonna use AI to make up fake images or write fake posts, but it doesn’t seem to be playing a huge role at the moment. On the one hand that’s good, on the other hand that’s because there are plenty of other sources of misleading content.

Sardarizadeh: I am starting to see some AI-generated images. I think that's still not, at any level, close to like misleading old videos, unrelated videos from past conflicts, from video games..., from events that have nothing to do with the war. Those are still the main sources of misinformation that I'm seeing.... But I’ve started to see AI-generated images as well. Thankfully the ones I’m seeing are not that realistic. But we had two deepfakes in the middle of the Ukraine war. In this case I have yet to find one. And we'll continue to monitor it obviously. The sort of nature of misinformation might actually change as we go through the coming days and weeks.

Bushwick: That’s actually lucky for us, because existing tech tools for identifying whether a photo is a deepfake are not very accurate.

Bose: That’s not great. But AI technology is actually causing a related problem: people have dismissed real photos and videos because they claim that they’re actually deepfakes.

Bushwick: Yes, exactly. In some cases, it’s like a new high-tech version of crisis actor accusations. Instead of claiming that people in photos or videos are actors, you claim they don’t exist at all.

[CLIP: Closing music]

Bose: Science, Quickly is produced by Jeff DelViscio, Tulika Bose, Kelso Harper and Carin Leong. Our show is edited by Elah Feder and Alexa Lim. Our theme music was composed by Dominic Smith.

Bushwick: Don’t forget to subscribe to Science, Quickly wherever you get your podcasts. For more in-depth science news and features, go to ScientificAmerican.com. And if you like the show, give us a rating or review!

Bose: For Scientific American’s Science, Quickly, I’m Tulika Bose.

Bushwick: And I’m Sophie Bushwick.
