There’s a new tool in the fight against online disinformation called BotSlayer.
The software, which is free and open to the public, scans social media in real time to detect evidence of automated Twitter accounts, or “bots,” pushing messages in a coordinated manner. Bots are increasingly used to manipulate public opinion by creating the false impression that many people are talking about a particular subject. The tactic is also known as “astroturfing” because it mimics the appearance of legitimate grassroots political activity.
By leveraging the observatory’s expertise and technological infrastructure, BotSlayer gives groups and individuals of any political affiliation the power to detect coordinated disinformation campaigns in real time—without any prior knowledge of these campaigns.
“We developed BotSlayer to make it easier for journalists and political campaigns to monitor potential new disinformation campaigns that attempt to manipulate public opinion using bots,” says Filippo Menczer, a professor in the School of Informatics, Computing and Engineering at Indiana University and director of the Observatory on Social Media.
“If there is a suspicious spike in traffic around some specific topic, BotSlayer allows you to spot it very quickly so you can investigate the content and its promoters and, if there appears to be abuse of the platform, report it or communicate to your followers about it,” Menczer says.
Hunting for disinformation
The use of deceptive bots to sway public opinion is a growing issue in politics in the US and internationally, adds Menczer, who is also a part of a group of researchers who found prevalent use of bots in the run-up to the 2016 US presidential election. Other bot campaigns have sought to influence votes related to the UK Brexit movement and elections in France, Germany, and Italy.
During the run-up to the midterm elections in 2018, for example, the Democratic Congressional Campaign Committee used publicly available tools created by the observatory to report over 10,000 bots spreading voter suppression messages to Twitter, which shut down the accounts. The tools used to inform the report were Botometer, which uses an algorithm to assign a score to Twitter accounts based upon the likelihood they’re automated, and Hoaxy, which lets individuals search and visualize the spread of specific topics on Twitter in real time. Botometer is one of the observatory’s most popular tools, currently receiving over 100,000 queries per day.
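The article does not describe Botometer's internals beyond saying it assigns a likelihood-of-automation score. As a rough illustration only, a toy heuristic might weigh a few account features; this is not Botometer's actual model (which uses a machine-learning classifier over many features), and every threshold below is invented for the example.

```python
def naive_bot_score(tweets_per_day, followers, following, has_default_profile):
    """Toy heuristic, NOT Botometer's actual algorithm.
    Returns a score in [0, 1]; higher suggests automation."""
    score = 0.0
    if tweets_per_day > 100:                       # unusually high posting rate
        score += 0.4
    if following > 0 and followers / following < 0.1:
        score += 0.3                               # follows many, followed by few
    if has_default_profile:
        score += 0.3                               # no profile customization
    return min(score, 1.0)

# A hypothetical account posting 250 times a day with few followers:
print(naive_bot_score(250, 20, 1500, True))  # prints 1.0
```

The real system scores tens of thousands of accounts against far richer signals, but the principle is the same: combine behavioral features into a single number that flags likely automation.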
Researchers created BotSlayer, which combines technology from Hoaxy and Botometer, in part based on feedback from political and news organizations asking to make faster, more powerful, and more user-friendly tools. These organizations include The Associated Press, The New York Times, and CNN.
The system uses an “anomaly detection algorithm” to quickly report trending activity whose sudden surge is likely driven by bots, Menczer says.
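The article does not disclose BotSlayer's exact detection method, but a minimal spike detector of the kind described could compare an entity's current mention count against its historical baseline and flag counts more than a few standard deviations above the mean; the function and data below are illustrative assumptions, not the tool's implementation.

```python
from statistics import mean, stdev

def is_anomalous(history, current, threshold=3.0):
    """Flag a surge: `current` exceeds the historical mean by more
    than `threshold` standard deviations. `history` is a list of
    per-interval mention counts for one hashtag or keyword."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return current > mu  # any rise over a perfectly flat baseline
    return (current - mu) / sigma > threshold

# Hypothetical hourly mention counts, then a sudden jump:
baseline = [12, 15, 11, 14, 13, 12, 16, 14]
print(is_anomalous(baseline, 120))  # prints True
```

A production system would also need to handle brand-new entities with no history and seasonal patterns (debates, sporting events) that spike organically, which is presumably why the tool pairs the spike detector with bot scoring.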
For example, BotSlayer could be useful during a presidential debate to not only instantly detect when a candidate’s username or related hashtags are trending, but also automatically assign a “bot score” to indicate whether the surge appears related to bot activity. In business, BotSlayer could help organizations protect against reputational threats that rely on automated accounts to amplify messages. In journalism, the tool could help monitor against manipulation of reporting on trending topics, or warn the public against disinformation attacks.
Real or fake? BotSlayer can tell
In addition to detecting trends, BotSlayer can instantly generate a “network map” that illustrates how a particular topic is spreading over time. A bot score is also assigned to each user in the network, providing an easy way to see the most influential accounts—real or fake—in the conversation. Each trending “entity”—a hashtag; a user handle; an image, video, gif, or meme; or a keyword or phrase—is also assigned a percentage to indicate how quickly it’s surging. A percentage of 5,000 indicates a 50-fold increase in mentions in the past four hours, for example.
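The surge percentage described above is simple arithmetic: the current count expressed as a percentage of the earlier count. The sketch below (variable names assumed) reproduces the article's 50-fold example.

```python
def surge_percentage(past_count, current_count):
    """Express a surge as a percentage of the earlier mention count.
    5,000% corresponds to a 50-fold increase."""
    return 100.0 * current_count / past_count

# 40 mentions four hours ago, 2,000 now: a 50-fold increase.
print(surge_percentage(40, 2000))  # prints 5000.0
```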
To use BotSlayer, users download the software through an online form and follow simple instructions to install it in the cloud or on their own server. Installation takes a few clicks and costs nothing, and running the tool on their own infrastructure lets users personalize it while protecting their privacy. The system is accessed through a web dashboard integrated with the observatory’s other tools.
“BotSlayer has the potential to be a very powerful new tool in our global battle against misinformation and disinformation. During the 2018 elections, we used an earlier version of BotSlayer to identify and ultimately take down malicious accounts throughout the general election. It was very effective and easy to use,” says Simon Rosenberg, a Democratic strategist and senior advisor to the DCCC during the 2017-18 election cycle.
“My hope is that actors in democracies throughout the world take advantage of it as we did for the last election here in the US,” Rosenberg says.