TikTok and the Israel-Hamas War

The popular app is influencing the opinions of its young user base.

For those looking to understand why young Americans are so anti-Israel, a WSJ report (“How TikTok Brings War Home to Your Child”) may provide one explanation.

Imagine your 13-year-old signs up for TikTok and, while scrolling through videos, lingers on footage of explosions, rockets, and terrified families from the war in Israel and Gaza. 

Your child doesn’t search or follow any accounts. But just pausing on videos about the conflict leads the app to start serving up more war-related content.

That’s what happened to a handful of automated accounts, or bots, that The Wall Street Journal created to understand what TikTok shows young users about the conflict. Those bots, registered as 13-year-old users, browsed TikTok’s For You feed, the highly personalized, never-ending stream of content curated by the algorithm.

Within hours after signing up, TikTok began serving some accounts highly polarized content, reflecting often extreme pro-Palestinian or pro-Israel positions about the conflict. Many stoked fear.

Dozens of these end-of-the-world or alarmist videos were shown more than 150 times across eight accounts registered by the Journal as 13-year-old users. Some urged viewers to prepare for an attack. “If you don’t own a gun, buy one,” one warns.

While a sizable number of the war-related videos served to the Journal’s accounts supported one side or the other in the conflict, a majority supported the Palestinian view.

Research shows that many young people increasingly get their news from TikTok. In the Journal’s experiment, the app served up some videos posted by Western and Arab news organizations. But they were a minority—about one in seven of the conflict videos. The rest were posted largely by influencers, activists, anonymous accounts, newshounds and the Israeli government.

Some of the accounts quickly fell into so-called rabbit holes, where half or more of the videos served to them were related to the war. On one account, the first conflict-related video came up as the 58th video TikTok served. After lingering on several videos like it, the account was soon inundated with videos of protests, suffering children and descriptions of death.

TikTok determines what content to serve users with a sophisticated algorithm keyed to what they watch, rather than primarily to which accounts they follow or subscribe to, as some other social-media platforms do. This makes it challenging for researchers and parents to understand the experiences young people have on the popular app.

The bots the Journal used, which paused only on conflict-related videos, provide a glimpse of the content young users may encounter on the app, as well as a test of the guardrails TikTok sets for what videos it shows them—and in what volume.

While I’ve long been active on social media, I’ve largely ignored TikTok. Not only is it clearly aimed at people way younger than me, I tend to prefer text-based content or videos longer than a few seconds. (That I’m a Federal employee and successive administrations have been warning about the Chinese-owned company for years further dissuades me from spending time on the app.)

But it’s not shocking that, to the extent TikTok serves videos about the war in Gaza, they’re going to be predominantly anti-war and, thus, anti-Israel. Not only is that point of view likely to be more popular with young users, but it’s also a lot easier to convey in short videos than the pro-Israel side. And, to the extent the Chinese government is using the app as part of its larger information campaign, inflaming American public opinion against official U.S. foreign policy is in its interest.

Regardless, TikTok is pushing back against the study:

A spokeswoman for TikTok said the Journal’s experiment “in no way reflects the behaviors or experiences of real teens on TikTok.” 

“Real people like, share and search for content, favorite videos, post comments, follow others, and enjoy a wide-range of content on TikTok,” the spokeswoman said in a written statement.

I expect that’s right. My guess is that the vast majority of very young viewers are far more interested in dance crazes, memes, fashion, and cute things on sale at Target* than they are in the Gaza conflict. And TikTok is likely to serve them mostly what they’ve demonstrated an interest in to keep them scrolling.

I’m also a bit perplexed by the study’s focus on 13-year-olds. If I were worried about, say, sexually explicit or violent content, then very young users would indeed be my concern. But given that the subject of this inquiry is the app’s influence on foreign policy views, wouldn’t voting-age users be more interesting?

While I’m having trouble finding up-to-date statistics, a 2021 study found that “42% of TikTok’s users were between the ages of 18 to 24, and 17% of its audience were between 13 and 17 years old.” An otherwise paywalled report tells me that,

As of October 2023, it was found that 18.2 percent of TikTok’s global audience were women between the ages of 18 and 24 years, while male users of the same age made up approximately 18 percent of the platform’s audience. The online audience of the popular social video platform was further composed of 16.3 percent female users aged between 25 and 34 years, and 16.6 percent of male users in the same age group.

So, the 18-34 cohort is not only more interesting in terms of political influence, it also happens to be the app’s core demographic, comprising 69.1 percent of its users (18.2 + 18.0 + 16.3 + 16.6 percent, per the figures above). Potentially more, if those figures included gender categories beyond male and female.


The Journal set one of the accounts to a restricted mode, which TikTok says limits content that may not be suitable for all audiences. That didn’t stop the app from inundating the user with war. Soon after signing up, the account’s feed was almost entirely dominated by vivid images and descriptions of the conflict, which began on Oct. 7, when Hamas militants crossed the border and killed about 1,200 people in Israel.

TikTok’s algorithms are uniquely powerful at picking up which videos get users’ attention and then feeding them the most engaging content on the topic. “You don’t have to doomscroll—you can just sit and watch and let the platform do the rest,” said Emerson Brooking, who studies digital platforms as a resident senior fellow at the Digital Forensic Research Lab at the Atlantic Council, a think tank based in Washington, D.C.

Most of the videos TikTok served the Journal’s accounts don’t appear to violate the app’s community policies banning promotion of terrorist groups or content that is gory or extremely violent. But hundreds of the videos described death or showed terrified children. Many of them are difficult to verify.

War-related videos served at least 14 times to the bots are now marked as sensitive and sometimes blocked from being shown to younger users. Hundreds of others were later unavailable on TikTok, either because TikTok removed them or the poster took them down or made them private—but not before they were served to the Journal’s accounts.

“It’s not normal for any adult to see this amount of content, but for kids, it’s like driving 100 miles an hour without any speed bumps, being constantly inundated with demoralizing, emotional content,” said Larissa May, founder of #HalfTheStory, an education nonprofit focused on adolescent digital well-being and the influence of technology on mental health.

TikTok said its family-control features let parents filter out keywords and restrict searches for a child’s account. It also says that since Oct. 7 it has prevented teen accounts from viewing over 1 million videos containing violence or graphic content. 

Of the Journal’s eight test accounts, five fell into rabbit holes within 100 videos after the first conflict-related video appeared. (A bot is considered to be in a rabbit hole if more than half of videos—a rolling average that includes the previous 25 and next 25 videos—are conflict-related.) Two others hit that point within the first 250 videos.
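The Journal’s rabbit-hole criterion—more than half of a rolling window spanning the previous 25 and next 25 videos being conflict-related—is easy to make concrete. The sketch below is my own illustration of that definition; the function name and data shape are assumptions, not anything from the Journal’s methodology:

```python
def first_rabbit_hole_index(is_conflict, window=25, threshold=0.5):
    """Return the index of the first video at which the rolling share
    of conflict-related videos exceeds the threshold, or None.

    is_conflict: list of booleans, one per video served, in feed order.
    The window covers the previous `window` videos, the current video,
    and the next `window` videos, clipped at the ends of the feed.
    """
    n = len(is_conflict)
    for i in range(n):
        lo = max(0, i - window)
        hi = min(n, i + window + 1)
        share = sum(is_conflict[lo:hi]) / (hi - lo)
        if share > threshold:
            return i
    return None  # the account never entered a rabbit hole

# Example: a feed where videos 31 onward are all conflict-related.
feed = [False] * 30 + [True] * 70
print(first_rabbit_hole_index(feed))  # → 30
```

By this definition, an account counts as being “in a rabbit hole” at the first video where the surrounding 51-video window tips past 50 percent conflict content.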

Again: the study intentionally goaded the app to display war-related videos, which it likely would not have done organically. Still, to the extent that a given user (regardless of age) sought out war-related content, the results are interesting.

Similarly to other social-media platforms, much of the war content TikTok served the accounts was pro-Palestinian—accounting for 59% of the more than 4,800 videos served to the bots that the Journal reviewed and deemed relevant to the conflict or war. Some 15% of those shown were pro-Israel.

The spokeswoman for TikTok said the platform doesn’t promote one side of an issue over another. 

TikTok served the Journal’s accounts videos from one pro-Palestinian account more than 90 times. The account, @free.palestine1160, has no bio but includes a link to a Qatar-based charity’s fundraising page for emergency relief and shelter in Gaza. 

The charity didn’t respond to a request for comment. 

While pro-Palestinian content was more common, the feeds were also interspersed with pro-Israel videos, including dozens from the Israeli military and government.

“I saw little kids who were beheaded. We didn’t know which head belonged to which kid,” an aid worker said in one video TikTok served the accounts. After being seen by a Journal bot, the video was later removed.

Hundreds of the videos TikTok showed the Journal accounts evoke death without showing it directly. One video described mass graves in Gaza and watching “the bodies of children being stored in ice-cream trucks because the morgues are so full of the dead.” 

Content can still be traumatizing to children even if it isn’t visually graphic.

“No child should be watching video after video of kids in war for hours a day,” May said. “Being hit with this content day after day starts to prune kids’ ability to emote and process what they are seeing, and they just start being apathetic.”

Again, the very nature of the algorithm is that it’s going to serve up more attention-getting content. And it’s simply much easier to convey the Palestinian side—the war is awful, killing children in mass numbers—than the Israeli side—the killing is targeted at Hamas militants who are violating the laws of war by hiding among innocents.

Then again, as noted in a somewhat-related Guardian report (“TikTok moderators struggling to assess Israel-Gaza content”),

In a press release issued last month, TikTok said that more young people supported Palestine. “Attitudes among young people skewed toward Palestine long before TikTok existed,” the platform said. “Support for Israel (as compared to sympathy for Palestine) has been lower among younger Americans for some time. This is evidenced by looking at Gallup polling data of millennials dating as far back as 2010, long before TikTok even existed.”

Indeed, to a large extent, world opinion has slowly turned against Israel going all the way back to the first Intifada. The extent to which TikTok, or social media more broadly, is exacerbating that trend is really hard to disaggregate.


*I may well be extrapolating too heavily based on what my 20-year-old views on the app, or at least talks about with us.

FILED UNDER: Media, Middle East, World Politics
About James Joyner
James Joyner is Professor and Department Head of Security Studies at Marine Corps University's Command and Staff College. He's a former Army officer and Desert Storm veteran. Views expressed here are his own. Follow James on Twitter @DrJJoyner.


  1. Sleeping Dog says:

    However the current Gaza war and the larger question of the future of the Palestinian people play out, two things are evident: Israel will not destroy Hamas and Israel has lost the public relations battle.

  2. gVOR10 says:

    20,000 dead Gazans may also be having some influence.

  3. MarkedMan says:

    Still, to the extent that a given user (regardless of age) sought out war-related content, the results are interesting.

    I don’t really have an opinion about the research, but I think the statement above mischaracterizes the way the bots worked. According to your excerpts, the bot did not seek anything out but merely paused on certain articles while scrolling through a feed.

  4. James Joyner says:

    @MarkedMan: Right. They paused on the content, causing the algorithm to feed more of that content.

  5. MarkedMan says:

    @James Joyner: Right, but I wouldn’t consider that “seeking out”. If a teen was scrolling through a feed comprised of kittens and silly dances and suddenly saw a picture of dead bodies, causing them to hesitate, the algorithm gives them more like it. The kid doesn’t have to seek anything out. You’re aware of how the algorithm works, but I think the researchers were doing a service in explaining that to others, since I suspect only a small percentage understands this.

  6. Stormy Dragon says:

    I think the study has the causation backwards: young people do not support the Israeli occupation and, because they’re on tiktok more, they made tiktok skew that way, rather than tiktok shifting young people that way

  7. Stormy Dragon says:

    @Stormy Dragon:

    The parts of Discord and Mastodon I hang out in are pretty negative on this war too, and both of those systems have no algorithmic content recommendations, so it’s not just people being manipulated into those views, they’re arising organically.

  8. Gustopher says:

    There’s been a right wing panic about TikTok for a while, and we should remember to view the Wall Street Journal’s reporting through that lens. That panic gets mixed in with the fact that the app is Chinese-owned spyware (as opposed to the Good, American Spyware that is Facebook or Twitter).

    For my entire life, younger people have been (by and large) further to the left, and more supportive of social justice than the olds, and I expect that to have been the case before I was alive as well. And, there have always been old people blaming that on anything other than the fact that their ideas and beliefs suck.

    If it wasn’t for that Rock and Roll “music” kids would realize that a woman’s place is in the kitchen, a black man’s place is on a plantation, and a corporation’s tax rate should be lower. We need to ban dancing!

    This “kids are too liberal/progressive” predates TikTok. Before TikTok there was Twitter (now a Nazi hellscape), Facebook (now Boomer central), MySpace (no one knows if it is still there), AOL (please be gone), and eventually you get back to stoned kids in dorm rooms discussing Noam Chomsky.

    Maybe the problem isn’t that kids are watching TikTok but that the olds made their minds up a few decades ago and haven’t looked at things with fresh eyes.

    “No child should be watching video after video of kids in war for hours a day,” May said. “Being hit with this content day after day starts to prune kids’ ability to emote and process what they are seeing, and they just start being apathetic.”

    That doesn’t seem to be what is happening though. They process, they emote, and they are rather upset about the whole thing. Not apathetic at all, just not listening to the Wall Street Journal.

  9. DK says:

    And it’s simply much easier to convey the Palestinian side—the war is awful, killing children in mass numbers—than the Israeli side—the killing is targeted at Hamas militants who are violating the laws of war by hiding among innocents.

    It’s not “Palestinian” to be horrified by images of dead, maimed, suffering children. It’s just human. The WSJ wants to talk about the vector and not the content, because the WSJ has a problem seeing Palestinians as human. Just like Hamas wants to talk about grievances and not the horrific, barbaric acts committed on 7 Oct because Palestinian extremists cannot acknowledge Israeli humanity.

    The “Israeli side”/propaganda isn’t working because it’s obvious window dressing.

    The IDF has repeatedly targeted journalists who are clearly Western and clearly marked as journalists.

    The IDF mowed down three Israeli hostages who were by themselves, wearing no armor, holding no weapons, shirtless and in underpants, speaking in Hebrew, and waving a white flag.

    There’s no reason to have mistaken those people for Hamas. They got killed because Netanyahu’s violent, terroristic coalition has encouraged the army to kill and commit war crimes indiscriminately. This is not counterterrorism, it’s just bloody revenge intended to cover up Netanyahu’s myriad failures.

  10. Gustopher says:


    This is not counterterrorism, it’s just bloody revenge intended to cover up Netanyahu’s myriad failures.

    I think you assign too much responsibility to Netanyahu, and not enough to everyone else’s individual interests. Netanyahu leads a coalition government, and that coalition expanded after Oct 7th, to become a National Unity Government.

    Apartheid cannot exist without thinking of the oppressed as less-than-us, and that makes exterminating the oppressed easier to swallow.

    To quote popular-with-the-TikTok-crowd musician Will Wood, “The devil made me do it, but I also kinda wanted to.”

    https://youtu.be/LTshg_CQLL4 (warning: ugly visuals, not graphic or anything, just kind of bad. And flashing)

  11. Just Another Ex-Republican says:

    @Gustopher: @DK: War sucks. Stressed/scared/angry young individuals amped up and seeing danger everywhere, react instinctively and *fast*, not logically or carefully. Friendly fire incidents, unintentional civilian killings…these are always part of the price for going to war. One of the (many) reasons governments should be more hesitant to do so, yet virtually every group of political leaders (and many military higher-ups) seem to forget this and are constantly surprised when something goes wrong. You simply NEVER have control of how all the violence will play out.

  12. Ken_L says:

    Imagine your 13-year-old signs up for TikTok and, while scrolling through videos, lingers on footage of explosions, rockets, and terrified families from the war in Israel and Gaza.

    Is this fundamentally different to the experience of a 13 year-old who watched cable news 20 years ago, or network news 20 years before that, or read his parents’ newspaper in the 1950s? The images might be better quality, but the news media has always given prominence to stories about horrible events; perhaps rightly so. I’m not sure a youngster watching videos from Gaza on TikTok is any more likely to be either traumatised or radicalised than one who saw the famous pictures of naked children burned by napalm running down a road in Vietnam, or the Viet Cong prisoner shot in the head by a South Vietnamese general.