THE ‘MOST EXTENSIVE COVERT PRO-WESTERN INFO OP’ ON SOCIAL MEDIA

Declassified Australia exposes and analyses a massive secret propaganda operation run out of the US that has been buried by Western media.
Targeting Russia, China, Iran, Central Asia, and the Middle East, the US military's Information Operation to spread propaganda is the most extensive program of covert pro-Western Information Operations on social media ever revealed. (Image: Stanford Internet Observatory)

A covert online propaganda operation said to be the world’s largest promoting ‘pro-Western narratives’ has been found to be operating primarily out of the United States, targeting Russia, China, and Iran.

‘We believe this activity represents the most extensive case of covert pro-Western IO [Information Operation] on social media to be reviewed and analysed by open-source researchers to date,’ say the researchers from Stanford University and internet research firm, Graphika.

The researchers found most of the Information Operation ‘likely originated in the United States’. From there it ran a massive, interconnected web of automated ‘bot’ accounts on Twitter, Facebook, and other social media platforms.

The covert operation to influence online audiences has been using ‘deceptive tactics to promote pro-Western narratives’, while ‘opposing countries including Russia, China, and Iran’.

‘The accounts heavily criticized Russia in particular for the deaths of innocent civilians and other atrocities its soldiers committed in pursuit of the Kremlin’s “imperial ambitions” following its invasion of Ukraine in February this year,’ the report says.

Declassified Australia is publishing here a detailed analysis of the remarkable report by Stanford University’s Internet Observatory (SIO) and network analysis firm Graphika, released on August 24. 

This report is all the more surprising because SIO and Graphika have deep connections to the US national security state and to information campaigns against US-designated enemies Russia, China and Iran.

SIO’s director, Alex Stamos, for example, is a member of the Council on Foreign Relations, a visiting scholar at the Hoover Institution, and on the advisory board of NATO’s Cooperative Cyber Defence Centre of Excellence. He was chief security officer at Facebook, where he led the company’s investigation into alleged Russian manipulation of the 2016 US election.

Graphika’s director of investigations, Ben Nimmo, is a senior fellow with the Atlantic Council, was a consultant to the UK’s Integrity Initiative propaganda unit, previously worked as press officer for NATO, and is now intelligence chief at Meta (which owns Facebook and Instagram). There he produced a report that attempted to link Labour leader Jeremy Corbyn to a Russian influence operation prior to the 2019 UK general election.

Ironically titled ‘Unheard Voice: Evaluating five years of pro-Western covert influence operations’, the report has been studiously ignored by almost all in the Western establishment media since its release last month. Despite the scale and targeting of the propaganda operation, the spectacular revelations have received only scant attention like this slight mention in the Sydney Morning Herald.

A light-hearted column in the Washington Post referred to it as a ‘splashy report’, saying some of the covert US accounts had posted ‘cat pictures’ in order to appear authentic. The column references Russian and Chinese cyber-espionage operations in framing its remarks about the report.

Part of the reason the report has been effectively buried may be because the report was conveniently overshadowed on the very same day of its release by another Stanford Internet Observatory release titled, ‘A Front for Influence: An Analysis of a Pro-Kremlin Network Promoting Narratives on COVID-19 and Ukraine’. 


The massive Twitter-Meta dataset

The data analysed by Stanford-Graphika came after Twitter and Meta/Facebook in July and August 2022 removed two overlapping sets of fake accounts for violating their terms of service. The datasets appear to cover a series of covert campaigns over a period of almost five years rather than one homogeneous operation. 

The Twitter dataset covered 299,566 tweets by 146 accounts, while the Meta dataset focussed on 39 Facebook profiles and 26 Instagram accounts. Twitter said the accounts breached its policies on ‘platform manipulation and spam,’ while Meta said the assets on its platforms engaged in ‘coordinated inauthentic behavior’.

Community network map of followers of covert clusters’ fake Twitter accounts in Iran, Afghanistan, Iraq, Saudi Arabia region. Colours represent major community groupings. Distance reflects network proximity, with accounts appearing close to those they follow and that follow them. (Image: Stanford Internet Observatory-Graphika)

The Stanford-Graphika report at one point tended to hose down the reach and influence of the fake accounts: ‘The vast majority of [the hundreds of thousands of] posts and tweets we reviewed, received no more than a handful of likes or retweets, and only 19 percent of the covert assets we identified had more than 1,000 followers.’

While this may give critics of the report a thread to cling to, hundreds of fake accounts with thousands of followers is certainly substantial. Elsewhere in the report, the operation is described as ‘the most extensive case of covert pro-Western IO [Information Operation] on social media to be reviewed and analysed by open-source researchers to date’.

The researchers did not identify which US entities were running the program, but did note that: ‘The accounts sometimes shared news articles from US government-funded media outlets, such as Voice of America and Radio Free Europe, and links to websites sponsored by the US military.’

Amongst the data analysed, two distinct disinformation campaigns were identified. One is a previously exposed disinformation campaign run by the Pentagon, while the second comprises a previously unknown series of covert operations of unspecified origin. 

The Stanford-Graphika researchers found one campaign in the datasets to be ‘linked to an overt US government messaging campaign called the Trans-Regional Web Initiative’. The first evidence of this program came from the Washington-based think tank, the Stimson Center, in 2012 – their report is now offline but is archived here.

This ‘Web Initiative’ influence program was run by the US military’s elite Special Operations Command (SOCOM) through the 2010s, deploying dozens of Military Information Support Operations (MISO) teams on psychological operations around the world at the request of military commanders in the field and ambassadors in a range of US embassies.

SOCOM had contracted some of its development work on the multimillion dollar disinformation operation to the Rendon Group, a CIA-linked contractor notorious for influencing public opinion and Western media before the start of the Iraq War in 2003. 

SOCOM’s influence operation included preparing websites that offer news, cultural reports, sports and other programming to ‘target audiences’, such as Southeast Europe Times and Central Asia Online. The websites ‘have the strong appearance of civilian journalism’ and seek ‘to express the United States and its operations in a positive light’.


Covert clusters exposed

The newly revealed covert clusters of the Information Operation (IO) received closer examination in the Stanford-Graphika report, which identified them as ‘the most extensive case of covert pro-Western IO on social media’ so far examined by open-source researchers.

The covert pro-Western fake accounts identified by Twitter and Meta had ‘created fake personas with GAN (Generative Adversarial Network-computer-generated) faces, posed as independent media outlets, leveraged memes and short-form videos, attempted to start hashtag campaigns, and launched online petitions’.

Social media network mapping showed the covert Twitter accounts were targeting Middle East audiences primarily in Iran (45%), Afghanistan, and Iraq, as well as Central Asia. Analysis also found ‘smaller community clusters in the network containing mixed international accounts focused loosely on a variety of international figures and organizations’.

One account in the Central Asia cluster used a doctored photo (left) of actor Valeria Menendez (right) as its profile picture. The asset that used this image was listed as the contact for Intergazeta’s VK page. (Image: Stanford Internet Observatory-Graphika)

Some of the covert accounts targeted regions of Russia and China. ‘The operation targeted Russian-speaking Central Asian audiences and focused on praising American aid to Central Asia and criticizing Russia, particularly its foreign policy. Two assets concentrated on China and the treatment of Chinese Muslim minorities, particularly the Uighurs in Xinjiang province.’

The researchers found the disinformation clusters focused on several pro-Western topics, being ‘US diplomatic and humanitarian efforts in the region, Russia’s alleged malign influence, Russian military interventions in the Middle East and Africa, and Chinese “imperialism” and treatment of Muslim minorities.’


The Russia cluster

Ukraine became the focus of much of the covert Twitter operation from February. The researchers found ‘assets that previously posted about Russian military activities in the Middle East and Africa pivoted towards the war in Ukraine, presenting the conflict as a threat to people in Central Asia’. 

‘Shortly after the invasion began in February, accounts promoted pro-Ukrainian protests in Central Asian countries. Later posts reported on evidence of atrocities committed by Russian troops and Russia’s block on Ukrainian grain exports.’

The covert campaign, citing Russia’s ‘imperial’ ambitions, presented the US as ‘the main guarantor of Central Asia’s sovereignty against Russia’.

‘Other posts criticized Russia’s use of propaganda to spread anti-West and pro-Russia narratives in Central Asia, depicting Russia as a nefarious actor working to undermine independent democracies.’ 

Posts suggesting Russia will use ethnic minorities to fight in Ukraine (left), and the deadly result of conscription of Central Asian migrants into the Russian military (right). (Image: Stanford Internet Observatory-Graphika)

The covert operation established ‘fake personas’ linked to ‘sham media outlets’, purporting to report news from events in Central Asia. Several of these sites and pages attracted as many as 6,000 followers. 

Facebook transparency data showed the administrator locations of four of the fake pages as being in France, but Meta’s analysis found they actually ‘originated in the US’. Several pages posted pictures of Paris and its monuments in an attempt to obfuscate their true US origins.

Several of the fake ‘news’ sites, such as Intergazeta and Vostochnaya Pravda, translated content into Russian from the websites of the BBC Russian Service, US embassies in Central Asia, and US-funded Radio Free Europe. They also often obtained content from media outlets directly sponsored by the US Central Command, particularly Caravanserai.

At least four of the sham media outlets ‘made apparent attempts to launch hashtag campaigns related to the war in Ukraine’. One site posting about the Russian invasion of Ukraine used the not-so-subtle hashtag translated as #TodayUkraineTomorrowCentralAsia. The report’s audience analysis found these attempts did not gain significant traction.


The China cluster

The researchers found a small cluster of assets of the Central Asia group was focused almost exclusively on China. ‘These accounts – a fake persona and sham media outlet – mainly focused on the genocide of Uighurs and Muslim minorities in “re-education” camps in Xinjiang.’ 

Posts described ‘alleged organ trafficking, forced labor, sexual crimes against Muslim women, and suspicious disappearances of ethnic Muslims in Xinjiang’. Other assets in the group also posted about China, asserting that ‘Chinese authoritarianism and financial imperialism threatened Central Asia and other regions of the world’. 

Posts about alleged organ harvesting of Muslims in Xinjiang (left), and China being blamed for being the main sponsor of Russia’s war against Ukraine (right). (Image: Stanford Internet Observatory-Graphika)

The covert social media assets ‘frequently referred to China’s cooperation with Russia, especially on military issues, and said Beijing should be held responsible for Russia’s invasion of Ukraine because the CCP had secretly supplied the Kremlin with weapons.’ 

This fake narrative of China supplying Russia with weapons for the Ukraine war was also spread in the West, but it was quickly debunked, and even the Ukrainian military now admits it was a fake story.


The Iran cluster

Fake accounts in the Iran cluster ‘frequently claimed to be Iranian and often Iranian women [with] listed professions such as “teacher” and “political activist”.’

Some of the fake Persian-language media outlets showed a certain flair. The tagline of YouTube channel Fahim News is ‘Accurate news and information’. The Dariche News outlet claims to be providing ‘uncensored and unbiased news’, and declares it is ‘an independent website… unaffiliated with any group or organization’.

Material for the fake Iran outlets is sourced from US-funded Persian language sites, but also from the UK-based TV station, Iran International, reported to be funded by a businessman with ties to Saudi crown prince Mohammed bin Salman.

On 18 August 2022, a post from the fake Fahim News outlet said that social media is the only way Iranians can access the free world and is the main enemy of the Iranian regime’s propaganda. ‘Therefore, the regime uses all its efforts to censor and filter the Internet.’

Disinformation peddlers certainly like to play a double-game. 

Accounts in the Iran cluster exhibited some spammy characteristics likely geared towards building a large online audience. Many accounts ‘posted non-political filler content’ including Iranian poetry, photos of Persian food, and even cute photos of cats.

‘We observed multiple instances of accounts in the Iran group sharing content from sources linked to the US military.’ Perhaps clumsily, one Twitter account that presented itself as ‘an Iranian individual living in Cambridge’ posted links to Almashareq and Diyaruna, two Persian-language news websites sponsored by the US Central Command.

This image was tweeted by a fake asset on Feb. 24, 2022, and says, “Freedom of speech in Iran”. Text accompanying the tweet used two Persian hashtags, one protesting an internet control bill, and the other saying “No to the Islamic Republic”. (Image: Stanford Internet Observatory-Graphika)

The Iran cluster also focused on an irritation point for the Iranian government – women’s rights. ‘Posts also noted that little has changed for women in Iran over time. Many posts highlighted domestic protests against hijab dress requirements.’


The Afghanistan cluster

A smaller number of Afghanistan assets were found to be using techniques similar to the other clusters, such as AI-created profile images, fake news sites, and information from US sources.

The sites ‘consistently advanced narratives critical of Iran and its actions’. ‘Sometimes these narratives included inflammatory claims accompanied by articles from the US military-linked website afghanistan.asia-news.com.’ 

The report cites a provocative example: ‘A tweet from March 11 2022 which claimed that relatives of deceased Afghanistan refugees had reported bodies being returned from Iran with missing organs.’ The article includes interviews with a purported Afghan official and Afghan nurse making the same unverified claims.

Since the fall of Afghanistan to the Taliban in August 2021, the fake sites have ‘highlighted women’s protests against Taliban authorities and criticised Afghanistan’s new government for its treatment of women and journalists’.


The Middle East cluster

The Middle East cluster used its covert assets to focus on issues primarily related to Iraq, Syria, Lebanon, and Yemen. This group ‘chiefly promoted narratives seeking to undermine Iran’s influence in the region’. It did this through a spread of inflammatory allegations and stories designed to influence audiences.

Several fake Twitter accounts ‘posed as Iraqi activists in order to accuse Iran of threatening Iraq’s water security, and flooding the country with crystal meth’. ‘Other assets highlighted Houthi-planted landmines killing civilians, and promoted allegations that Russia’s invasion of Ukraine will lead to global food crisis.’

Computer-generated profile pictures used by fake Twitter accounts of the Middle East cluster. (Image: Stanford Internet Observatory-Graphika)

Multiple assets posted similar content, similarly timed, that was clearly being shared and coordinated. Some sloppy operational security by the operators of the Middle East cluster is also on display. 

One Twitter page pretending to be of an Iraqi man named ‘Discoverer’ and using a fake AI-generated profile photo, posted predominantly about misdeeds of the Iranian government. However archived versions of the Twitter account show that prior to May 2021, it used a different profile photo, identified as an ‘account belonging to the US Central Command’, and listed its location as ‘Florida, USA’.

Incidentally, Florida is home to the headquarters of the US Central Command, or CENTCOM, located at MacDill Air Force Base in Tampa. Coincidentally, CENTCOM’s Area of Responsibility (AOR) extends across the Middle East, Central Asia, and parts of South Asia – the same patch of the globe covered by the Information Operations reported here.

Notably, CENTCOM states it uses Information Operations (IO) campaigns that ‘include counter-propaganda messaging… in internet and social media’. These IO campaigns serve ‘as a force multiplier in the information space… to counter state-sponsored destabilizing activities across the CENTCOM AOR’.


Attributing the ‘Information Operation’ 

According to the report’s authors, it is not possible to determine with absolute certainty the origin of the unprecedented influence operations described here.

However, measured against the standard of proof that one of this report’s authors, Graphika, used to inculpate Russia in a previous influence campaign, it is surprising they could not reach a stronger conclusion.

In describing a leak of trade documents that stood to benefit the UK Labour Party in the 2019 UK general election, Ben Nimmo of Graphika studied the leaks and later noted he ‘cannot provide attribution of the operation’; however, the report boldly stated that the leaks were:

  • ‘disseminated in a similar fashion to Russian operation Secondary Infektion.’
  • ‘amplified online in a way that closely resembles a known Russian information operation.’
  • ‘the similarities … are too close to be simply coincidence.’
  • ‘the account … made specific errors that were characteristic of Secondary Infektion.’
  • ‘tweeting … resembled earlier amplification efforts by Secondary Infektion.’

The Graphika report was unashamedly titled: ‘UK Trade Leaks: Operators Keen To Hide Their Identities Disseminated Leaked UK/US Trade Documents In A Similar Fashion To Russian Operation Secondary Infektion Exposed In June 2019.’ It was obvious what would happen next.

Not unexpectedly, the mainstream media picked up the report, consistently referring to it as a Russian disinformation operation, with headlines such as The Guardian’s ‘Russia involved in leak of papers’, Sky News’s ‘Leaked documents cited by Corbyn “tied to Russia group”’, and The Telegraph’s ‘Russians tried to interfere in election by promoting leaked trade documents touted by Jeremy Corbyn’.

The implications of such a condemnation helped sink the election campaign of Labour leader Jeremy Corbyn.

Some conclusions about the Stanford-Graphika ‘Unheard Voice’ report can be reached that hold a high level of certainty. In fact the degree of certainty looks to exceed that of Graphika’s ‘UK Trade Leaks’ report.

A tweet by the ‘Discoverer’ Twitter persona, that in a previous incarnation had identified themselves as living in Florida, USA, criticised the actions of Iranian proxies in Iraq and promoted humanitarian efforts promoted by the US government. (Image: Stanford Internet Observatory-Graphika)

It is apparent from its purpose, targeting, narratives, techniques, sources, and even some metadata left in their trail just who the creators of the covert Information Operations identified in the Stanford-Graphika ‘Unheard Voice’ report may be. The providers of the datasets the research is based upon have stated their views – Twitter states the ‘presumptive countries of origin’ for its data are the US and UK, while for Meta ‘the country of origin’ is the US.

The Information Operations reported in the Stanford-Graphika report and described here by Declassified Australia can confidently be said to be operated by groups or individuals affiliated with US military entities, promoting US military and US imperial aims in the targeted countries. Those targeted countries are all designated enemies of the US, and the methods and techniques exposed by the report are proven to have been used by US military propaganda units – indeed, in some cases the links are direct. Much of the source information originates from US-funded media sites, US embassies, and US military units, and, finally, metadata crumbs point to the US military.

Two sensitive sources have since spoken anonymously to the Washington Post about this ‘most extensive covert pro-Western Information Operation on social media’, saying that ‘US Central Command is among those whose activities are facing scrutiny’.

At this juncture, there does not seem to be any available evidence against the conclusion that this unprecedented Information Operation is a massive covert US military propaganda operation.

Peter Cronau

PETER CRONAU is an award-winning investigative journalist, writer, and film-maker. His documentaries have appeared on ABC TV’s Four Corners and Radio National’s Background Briefing. He is an editor and co-founder of DECLASSIFIED AUSTRALIA. He is co-editor of the recent book A Secret Australia – Revealed by the WikiLeaks Exposés.
