TikTok and the Israel-Hamas War
The popular app is influencing the opinions of its young user base.
For those looking to understand why young Americans are so anti-Israel, a WSJ report (“How TikTok Brings War Home to Your Child”) may provide one explanation.
Imagine your 13-year-old signs up for TikTok and, while scrolling through videos, lingers on footage of explosions, rockets, and terrified families from the war in Israel and Gaza.
Your child doesn’t search or follow any accounts. But just pausing on videos about the conflict leads the app to start serving up more war-related content.
That’s what happened to a handful of automated accounts, or bots, that The Wall Street Journal created to understand what TikTok shows young users about the conflict. Those bots, registered as 13-year-old users, browsed TikTok’s For You feed, the highly personalized, never-ending stream of content curated by the algorithm.
Within hours after signing up, TikTok began serving some accounts highly polarized content, reflecting often extreme pro-Palestinian or pro-Israel positions about the conflict. Many stoked fear.
Dozens of these end-of-the-world or alarmist videos were shown more than 150 times across eight accounts registered by the Journal as 13-year-old users. Some urged viewers to prepare for an attack. “If you don’t own a gun, buy one,” one warns.
While a sizable number of the war-related videos served to the Journal’s accounts supported one side or the other in the conflict, a majority supported the Palestinian view.
Research shows that many young people increasingly get their news from TikTok. In the Journal’s experiment, the app served up some videos posted by Western and Arab news organizations. But they were a minority—about one in seven of the conflict videos. The rest were posted largely by influencers, activists, anonymous accounts, newshounds and the Israeli government.
Some of the accounts quickly fell into so-called rabbit holes, where half or more of the videos served to them were related to the war. On one account, the first conflict-related video came up as the 58th video TikTok served. After lingering on several videos like it, the account was soon inundated with videos of protests, suffering children and descriptions of death.
TikTok determines what content to serve to users with a sophisticated algorithm keyed on what its users watch, rather than basing it mostly on what accounts users follow or the content they subscribe to, like some other social media. This makes it challenging for researchers and parents to understand the experiences young people have on the popular app.
The bots the Journal used, which paused only on conflict-related videos, provide a glimpse of the content young users may encounter on the app, as well as a test of the guardrails TikTok sets for what videos it shows them—and in what volume.
While I’ve long been active on social media, I’ve largely ignored TikTok. Not only is it clearly aimed at people way younger than me, but I also tend to prefer text-based content or videos longer than a few seconds. (That I’m a Federal employee and successive administrations have been warning about the Chinese-owned company for years further dissuades me from spending time on the app.)
But it’s not shocking that, to the extent TikTok serves videos about the war in Gaza, they’re going to be predominantly anti-war and, thus, anti-Israel. Not only is that point of view likelier to be more popular with young users but it’s a lot easier to convey in short videos than the pro-Israel side. And, to the extent the Chinese government is using the app as part of its larger information campaign, inflaming American public opinion against official U.S. foreign policy is in their interest.
Regardless, TikTok is pushing back against the study:
A spokeswoman for TikTok said the Journal’s experiment “in no way reflects the behaviors or experiences of real teens on TikTok.”
“Real people like, share and search for content, favorite videos, post comments, follow others, and enjoy a wide-range of content on TikTok,” the spokeswoman said in a written statement.
I expect that’s right. My guess is that the vast majority of very young viewers are far more interested in dance crazes, memes, fashion, and cute things on sale at Target* than they are in the Gaza conflict. And TikTok is likely to serve them mostly what they’ve demonstrated an interest in to keep them scrolling.
I’m also a bit perplexed by the study’s focus on 13-year-olds. If I were worried about, say, sexually explicit or violent content, then very young users would indeed be my concern. But given that the subject of this inquiry is the app’s influence on foreign policy views, wouldn’t voting-age users be more interesting?
While I’m having trouble finding up-to-date statistics, a 2021 study found that “42% of TikTok’s users were between the ages of 18 to 24, and 17% of its audience were between 13 and 17 years old.” An otherwise paywalled report tells me that,
As of October 2023, it was found that 18.2 percent of TikTok’s global audience were women between the ages of 18 and 24 years, while male users of the same age made up approximately 18 percent of the platform’s audience. The online audience of the popular social video platform was further composed of 16.3 percent female users aged between 25 and 34 years, and 16.6 percent of male users in the same age group.
So, the 18-34 cohort is not only more interesting in terms of political influence, it also happens to be the app’s core demo, comprising 69.1 percent of its users. (Potentially more, if the report included gender categories beyond male and female.)
The Journal set one of the accounts to a restricted mode, which TikTok says limits content that may not be suitable for all audiences. That didn’t stop the app from inundating the user with war. Soon after signing up, the account’s feed was almost entirely dominated by vivid images and descriptions of the conflict, which began on Oct. 7, when Hamas militants crossed the border and killed about 1,200 people in Israel.
TikTok’s algorithms are uniquely powerful at picking up which videos get users’ attention and then feeding them the most engaging content on the topic. “You don’t have to doomscroll—you can just sit and watch and let the platform do the rest,” said Emerson Brooking, who studies digital platforms as a resident senior fellow at the Digital Forensic Research Lab at the Atlantic Council, a think tank based in Washington, D.C.
Most of the videos TikTok served the Journal’s accounts don’t appear to violate the app’s community policies banning promotion of terrorist groups or content that is gory or extremely violent. But hundreds of the videos described death or showed terrified children. Many of them are difficult to verify.
War-related videos served at least 14 times to the bots are now marked as sensitive and sometimes blocked from being shown to younger users. Hundreds of others were later unavailable on TikTok, either because TikTok removed them or the poster took them down or made them private—but not before they were served to the Journal’s accounts.
“It’s not normal for any adult to see this amount of content, but for kids, it’s like driving 100 miles an hour without any speed bumps, being constantly inundated with demoralizing, emotional content,” said Larissa May, founder of #HalfTheStory, an education nonprofit focused on adolescent digital well-being and the influence of technology on mental health.
TikTok said its family-control features let parents filter out keywords and restrict searches for a child’s account. It also says that since Oct. 7 it has prevented teen accounts from viewing over 1 million videos containing violence or graphic content.
Of the Journal’s eight test accounts, five fell into rabbit holes within 100 videos after the first conflict-related video appeared. (A bot is considered to be in a rabbit hole if more than half of videos—a rolling average that includes the previous 25 and next 25 videos—are conflict-related.) Two others hit that point within the first 250 videos.
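The Journal’s rabbit-hole threshold, as described in the parenthetical above, can be sketched as a rolling-window check. This is a minimal illustration, not the Journal’s actual code; how the window is truncated at the start and end of the feed is my assumption.

```python
def rabbit_hole_indices(is_conflict, window=25):
    """Return feed positions where more than half of a rolling window
    (the video itself plus up to `window` videos before and after,
    truncated at the feed's edges) is conflict-related.
    `is_conflict` is a list of booleans, one per video served."""
    n = len(is_conflict)
    hits = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        span = is_conflict[lo:hi]
        if sum(span) > len(span) / 2:
            hits.append(i)
    return hits

# Hypothetical feed: 60 unrelated videos, then 40 conflict videos in a row.
feed = [False] * 60 + [True] * 40
```

On that toy feed, the account crosses the threshold only once the window is dominated by the late cluster of conflict videos, which matches the intuition of the Journal’s definition: isolated war videos don’t count, sustained saturation does.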
Again: the study intentionally goaded the app to display war-related videos, which it likely would not have done organically. Still, to the extent that a given user (regardless of age) sought out war-related content, the results are interesting.
Similarly to other social-media platforms, much of the war content TikTok served the accounts was pro-Palestinian—accounting for 59% of the more than 4,800 videos served to the bots that the Journal reviewed and deemed relevant to the conflict or war. Some 15% of those shown were pro-Israel.
The spokeswoman for TikTok said the platform doesn’t promote one side of an issue over another.
TikTok served the Journal’s accounts videos from one pro-Palestinian account more than 90 times. The account, @free.palestine1160, has no bio but includes a link to a Qatar-based charity’s fundraising page for emergency relief and shelter in Gaza.
The charity didn’t respond to a request for comment.
While pro-Palestinian content was more common, the feeds were also interspersed with pro-Israel videos, including dozens from the Israeli military and government.
“I saw little kids who were beheaded. We didn’t know which head belonged to which kid,” an aid worker said in one video TikTok served the accounts. After being seen by a Journal bot, the video was later removed.
Hundreds of the videos TikTok showed the Journal accounts evoke death without showing it directly. One video described mass graves in Gaza and watching “the bodies of children being stored in ice-cream trucks because the morgues are so full of the dead.”
Content can still be traumatizing to children even if it isn’t visually graphic.
“No child should be watching video after video of kids in war for hours a day,” May said. “Being hit with this content day after day starts to prune kids’ ability to emote and process what they are seeing, and they just start being apathetic.”
Again, the very nature of the algorithm is that it’s going to serve up more attention-getting content. And it’s simply much easier to convey the Palestinian side—the war is awful, killing children in massive numbers—than the Israeli side—the killing is targeted at Hamas militants, who are violating the laws of war by hiding among innocents.
Then again, as noted in a somewhat-related Guardian report (“TikTok moderators struggling to assess Israel-Gaza content”),
In a press release issued last month, TikTok said that more young people supported Palestine. “Attitudes among young people skewed toward Palestine long before TikTok existed,” the platform said. “Support for Israel (as compared to sympathy for Palestine) has been lower among younger Americans for some time. This is evidenced by looking at Gallup polling data of millennials dating as far back as 2010, long before TikTok even existed.”
Indeed, to a large extent, world opinion has slowly turned against Israel going all the way back to the first Intifada. The extent to which TikTok, or social media more broadly, is exacerbating that trend is really hard to disaggregate.
*I may well be extrapolating too heavily based on what my 20-year-old views on the app, or at least talks about with us.