U.S. authorities and their allies identified and took down a Russian artificial intelligence-powered bot farm consisting of roughly 1,000 accounts that were spreading disinformation and pro-Russian sentiment on X. The Justice Department revealed that the software-enabled scheme was created by a digital media division within Russian state-run media outlet RT. Its development was led by RT’s deputy editor-in-chief in 2022 and appears to have been approved and funded by officers in Russia’s Federal Security Service, the main successor to the KGB.
Cybersecurity advisories issued by the FBI, Dutch intelligence and Canadian cybersecurity authorities specifically mention a tool called “Meliorator,” which can mass-create “lifelike social media personas” that generate text messages and images and spread disinformation from other bot personas. Authorities seized two domains that were used to create email addresses needed to register accounts on X (formerly Twitter), where the bots were based.
The Department of Justice is now in the process of searching all 968 accounts used by the Russian actors to spread disinformation. X has shared information about every account identified so far with authorities and has already suspended them. As the Washington Post points out, the bots were able to bypass X’s safeguards by copying and pasting one-time passcodes from their email accounts to log in. The Justice Department said the operation’s use of U.S.-based domain names violates the International Emergency Economic Powers Act, while paying for those domains violates U.S. federal money laundering laws.
Many of the profiles created by the tool posed as Americans, using American-sounding names and listing U.S. locations on their X profiles. In one example provided by the Department of Justice, the profile photo featured a headshot against a plain gray background, a strong indication that it was AI-generated. An account named Ricardo Abbott, claiming to be from Minneapolis, posted a video of Russian President Vladimir Putin justifying Russia’s actions in Ukraine. Another account named Sue Williamson posted a video of Putin saying the war in Ukraine was not a territorial dispute but a matter of “principles on which the New World Order is founded.” These posts were then liked and reposted by other bots in the network.
It’s worth noting that while this bot farm was limited to X, the people behind it had plans to expand to other platforms, according to authorities’ analysis of the Meliorator software. Foreign actors have been using social media to spread political disinformation for years, but now they’ve added AI to their arsenal. In May, OpenAI reported that it had dismantled five covert influence operations originating from Russia, China, Israel and Iran that were using its models to influence political outcomes.
“Russia used this bot farm to spread AI-generated foreign disinformation and expand its AI-assisted operations in an effort to undermine our partners in Ukraine and influence geopolitical opinion in the Kremlin’s favor,” FBI Director Christopher Wray said in a statement. “The FBI is committed to working with our partners to strategically disrupt our most dangerous adversaries and their exploitation of cutting-edge technologies by conducting joint, coordinated operations.”
Meanwhile, RT told Bloomberg that “farming is a beloved pastime for millions of Russians.”