When X’s location tool went live, it confirmed what many had sensed: a long list of anti-Israel, antisemitic, and anti-Western accounts were operating from countries different from the ones they claimed.

Muslim Cowboy, for instance, an influencer with more than 75,000 followers and a steady stream of anti-West content, claimed to be from the “Deep Southwest” but turned out to be posting from Saudi Arabia. That erases any pretense of truth in statements such as, “We are Americans against [sic] hateful Zionists.” The Israel Exposed account, with 185,000 followers, is run from West Asia. “American working in Saudi as the pay and job market is better here,” the user wrote. The anti-Zionist Torah Judaism account is based in the Philippines; CounterAIPAC posts from Egypt.

Networks of accounts from people who claimed to be in Gaza were in fact based in Central Asia, South Asia, and Eastern Europe. One described life in Khan Younis but was located in Pakistan; another account narrating conditions in a northern Gaza displacement camp traced back to Bangladesh.

Fundraising scams run alongside the propaganda. One account claiming to be a Gazan civilian seeking donations for medical care and evacuation was revealed to be operating from West Africa. Another profile, framed as a journalist, was posting from Europe while circulating recycled videos. A new analysis by the Israeli government of 500 purported Gaza-based accounts found that only 37.5% matched the location stated in their profiles.

It was also revealed that many far-right “America First” accounts were run from abroad. One example among many is The General, whose bio reads “Ethnically American” and “Colonial Stock” but who turned out to be managed from Turkey; he, too, claimed to be an American working overseas. Most of these accounts put out a stream of extreme content, junk news, and conspiracy theories to exploit and deepen existing socio-political fault lines. The big cultural flashpoints on which Westerners so vehemently disagree – gender and LGBTQ issues, race, immigration, Israel, reproductive rights, Islam, and gun control, to name a few – become targets of online conversations into which these operators pump divisive, toxic content.

A visual illustration depicts a phone with X/Twitter open with a Palestinian flag in the background. (credit: Canva, REUTERS, SHUTTERSTOCK)

Beyond radicalizing public opinion against Israel, the broader goal is to sow domestic discord and ultimately weaken societal cohesion in America and other Western democracies.

State involvement in online influence operations is well documented. Between 2020 and 2025, US intelligence agencies identified Russia, China, and Iran as the three most active sources of foreign disinformation targeting Western democracies, including Israel. Critically, these efforts have evolved from crude, easily identifiable tactics to sophisticated, AI-augmented operations that exploit emotional resonance and algorithmic amplification.

For example, an analysis of two million social media posts written within 48 hours of the October 7 attacks revealed that 25% of accounts engaging in pro-Hamas discourse were fake – posting over 312,000 pieces of content in two days, with some posting over 600 times daily. These fake profiles demonstrated algorithmic sophistication: they embedded pro-Israel hashtags (#IStandWithIsrael, #Israel) in pro-Hamas content to hijack trending conversations and reach broader audiences. 

Another analysis showed that 32% of X profiles engaging with AJ+ (Al Jazeera’s social media publisher) accounts were fake and participating in cross-platform antisemitic propaganda campaigns. These networks redirected users to TikTok to amplify anti-Israel and anti-US narratives, deliberately exploiting algorithmic recommendation systems.

Gaza war used as material for anti-American messaging

What X’s location tool is exposing suggests that these individual fake accounts sit on top of a much larger, more organized architecture. The Institute for Strategic Dialogue has documented how a pro-CCP network known as “Spamouflage” pivoted after October 7 to use the Gaza war as material for anti-American messaging. Operating thousands of fake accounts across X, Facebook, and YouTube, the network pushed three core storylines: that the United States is intentionally fueling the war in Gaza for profit and power, that Palestinian suffering is primarily a US creation, and that a decaying American hegemony is now a threat to global stability. Some accounts posted; many others simply liked and shared in order to simulate organic outrage. This is just one of many examples. Iran, Russia, Hamas and other terror groups, and Qatar, among others, have all been caught running disinformation campaigns.

The point is not that every fake “Gazan” or “America First” profile is run directly from Beijing, Moscow, or Tehran. It’s that the techniques the location tool is now surfacing – misleading personas, foreign operators posing as Westerners, emotional Gaza content used as a vehicle – are precisely the methods documented in state-linked campaigns like Spamouflage. The emerging picture is a coordinated information ecosystem: the big actors and the sockpuppet accounts are simply different layers of the same information war.

The location tool provides some measure of tactical advantage in the fight against malicious, possibly state-backed influence on X. It has likely – if temporarily – disrupted the operations of some of these foreign-run accounts, since they rely on anonymity and location is an important piece of identity information. More importantly, it democratized the ability to spot accounts that are not where they say they are.

This small tactical advantage exists against a much larger strategic threat. Adversaries will adapt, as they always do, and the usefulness of the feature will narrow over time. Even so, it’s a step in the right direction. It makes certain patterns easier to see, but not without costs: it also chips away at anonymity for people who genuinely need it. Whether this tactical advantage can be turned into anything more strategic depends on how platforms and those researching or fighting political disinformation campaigns proceed to use it.

The writer is an independent journalist focusing on extremism, disinformation, and Mideast history and politics, and he is a co-host of the Lappin Assessment podcast.