Russian Trolls Using Social Media To Spread Anti-Vaccine Propaganda
A new report indicates that Russian social media trolls are involved in spreading anti-vaccination propaganda in the United States and elsewhere.
According to the report, Russian trolls have used social media to spread anti-vaccination propaganda in the United States and elsewhere:
Russian Twitter trolls have attempted to fuel the anti-vaccination debate in the U.S., posting about the issue far more than the average Twitter user last year, a study out of George Washington University has found. The “sophisticated” bots shared opinions from both sides of the anti-vaxxer debate, which took the U.S. by storm and prompted tech companies to crack down on the spread of misinformation surrounding vaccinations.
In the study, professor David Broniatowski and his colleagues say the Russian trolls’ efforts mimic those used in the past. Such trolls ramp up controversial issues in the U.S. by inflating different viewpoints, the study says.
The U.S. is in the midst of its worst measles outbreak in 25 years. Health officials say misinformation and anti-vax messages have led more people to avoid vaccination, allowing the disease to spread.
“These outbreaks are due to the anti-vaccine movement,” Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, told CBS in January, when the outbreaks were beginning to gain steam.
He stressed that the vaccine has been scientifically proven over many years to be safe and effective in preventing measles. However, some parents still refuse to vaccinate their kids.
One of the main reasons anti-vaxxers refuse vaccinations is that they incorrectly believe vaccines cause autism. As part of an effort by several large tech companies to curb the spread of vaccine misinformation, Amazon began removing books that promote supposed “cures” for autism.
Facebook also said it would crack down on vaccine misinformation by de-prioritizing medical myths across the platform and taking action against verifiable vaccine hoaxes. Misinformation will now appear less frequently in News Feeds, public and private pages and groups, search predictions, and recommendations, according to the company.
The vaccination “debate” isn’t the only subject that Russian-based trolls appear to be exploiting:
Russia has not only fueled the anti-vaccination debate; it has also spread unverified information about 5G wireless technology. RT America, a U.S.-based, Russia-backed TV network, reported that new 5G technology was linked to cancer, autism, Alzheimer’s, and other health issues, The New York Times reports. This had a real-world effect, with smaller blogs and websites picking up RT’s false stories and sharing them as fact, the Times said.
In February 2018, special counsel Robert Mueller charged 13 Russian nationals and three Russian entities with crimes related to a campaign to sow disinformation and division in the U.S. in the run-up to the 2016 election. A so-called “troll factory” in St. Petersburg set up to influence U.S. voters was to blame, according to the indictment.
Given what we already know about how Russia has approached its efforts to interfere in the internal politics of the United States, this news doesn’t really come as a surprise. Back in February of last year, for example, Special Counsel Robert Mueller issued more than a dozen indictments against Russian individuals and companies involved in efforts to spread propaganda via social media. This was followed several months later by indictments against 12 members of Russia’s GRU, the military’s intelligence arm, related to efforts to hack into the email systems of the Democratic National Committee, the Clinton campaign, and other people associated with the Clinton campaign.
From these indictments, we learned that the campaign had started well before 2016 and that it often involved spreading fake news and other propaganda regarding high-profile political issues as well as issues related to the campaign itself. Quite often, these Russian trolls would be involved on both sides of a divisive issue. Subsequently, it has been reported that these trolls have popped up on social media using the same techniques they did during the 2016 election. Last February, for example, it was reported that Russian trolls were spreading propaganda regarding the gun control debate in the days and hours after the February 2018 shooting at Marjory Stoneman Douglas High School in Parkland, Florida. It has also been reported that many of the anti-Semitic attacks that have appeared on social media have been driven by bots, many of them originating in Russia.
In engaging in these campaigns, the Russian trolls, who clearly continue to operate notwithstanding Mueller’s indictments and Russian denials that they even exist, are utilizing a strategy rooted in two factors.
The first, of course, is the ease with which one can establish a social media account and use it to broadcast anything one wants to potentially millions of people at a time. While Facebook and Twitter have both recently undertaken steps to rid their networks of these types of accounts, such efforts often end up happening after the fact rather than being pre-emptive. Partly that seems to be because it’s not easy for either company to determine beforehand that an account may be illegitimate until it is utilized for such a purpose. The other issue making any effort to police social media difficult is the sheer size of the networks themselves.
Presently, there are roughly 326,000,000 active Twitter users around the world, for example, and they are responsible for some 500,000,000 messages on the network per day. (Source) For Facebook, the numbers are even more daunting: there are roughly 2,380,000,000 active users worldwide, and they are responsible for an astounding number of posts. Specifically, statistics indicate that every 60 seconds there are 510,000 comments, 293,000 statuses posted, and 136,000 photos uploaded. That amounts to billions of comments, statuses, and photos every single day. (Source) Keeping track of all that is a task that simply can’t be handled by human beings working alone, which is why both companies rely on algorithms and reporting from users, as well as analysis of traffic patterns and the origin of posts, to police their services. These methods aren’t perfect, though, and they obviously allow people using the services for nefarious purposes to go largely undetected for long periods of time. The Russian trolls know this and use it to their advantage. Even when they are discovered and their accounts banned, it is relatively easy for them to set up other accounts and to appear as if they are posting from someplace other than Russia itself.
The second factor that these operations rely upon, of course, is the hyperpartisanship that has become such a prominent part of American politics. The memes and other messages that the trolls broadcast are often meant to directly exploit this hyperpartisanship and to widen the divide over these issues by pandering to extremes on both the right and the left, who are only too happy to share and repost memes, posts, and “news items” that appeal to and stoke that divide.
This, I would submit, is further evidence for the theory I have advanced in the past that the real goal of the campaign that Russia engaged in during the 2016 election, and which appears to be continuing to this day, isn’t so much to help one candidate or another or to advance some policy goal inside the targeted country as it is to create chaos and doubts about the durability of democratic institutions by exploiting our own divisions. In that respect, they have succeeded masterfully, and they’ve done so largely because we let them get away with it.