Bot farms, AI & Online Trust: What This Means For The Financial Sector 

In November 2024, we gave you an in-depth look at Accenture Song’s “Life Trends” report, which draws on a global network of 24,295 “designers, creatives, technologists, sociologists, and anthropologists across 50+ design studios and creative agencies.”

This report uncovered a worrying situation for online marketing and those responsible for building and maintaining digital trust for businesses:

  • AI tools, bots, and deepfakes are causing a loss of trust and increasing the “cost of hesitation” because of the amount of fake content online
  • 59.9% of people are questioning the authenticity of online content more than before
  • 52% of people have seen fake news or articles
  • 38.8% have seen fraudulent product reviews online
  • 52.8% often or always question the authenticity of product reviews when they see them
  • A Getty Images report found that 87% of people prefer real images to AI-generated images
  • 52% have experienced deepfake attacks or scams for personal information and/or money
  • A Google DeepMind report found that “the Asia-Pacific region saw a 1530% rise in deep-fake cases from 2022 to 2023—the second largest rise in the world, behind North America.”

As a result, people are eager to return to a more pre-digital existence or, at the very least, to rely less on digital and experience the real world a lot more.

Here’s what the report says people have been doing more of in the last 12 months:

  • 48% Spending time outdoors/in nature
  • 47% Hanging out with friends in real life
  • 46.9% Shopping in physical grocery stores
  • 36% Shopping in other retail stores (non-grocery)
  • 30.1% Reading physical books or magazines

If the last 10 years have been defined by hyper-connectivity and hyper-digitalisation, then the next 10 years could see an unbundling of that.

Digital will always remain part of our lives and our business, personal, and professional interactions, but perhaps in-person interactions are going to make a 90s-style comeback.

One of the reasons for this, alongside the erosion of online trust, is the fact that roughly 50% of web traffic now comes from AI and bots, according to the 2024 Imperva Bad Bot Report.

The report says: “For the fifth consecutive year, the proportion of web traffic associated with bad bots grew to 32% in 2023, up from 30.2% in 2022, while traffic from human users decreased to 50.4%.”

One writer, Eric Schwartzman, has spent the last 6 months investigating this further, and is currently working on a book on this topic, Invasion of the Bot Farms.

In an article published in Fast Company, Schwartzman shows us how corrupted the world of online popularity, “going viral”, and mass manipulation has become. 

This interests us because it directly relates to how we work with and support financial services companies and alternative asset, ESG, and FinTech firms.

So, let’s take a closer look at his findings and then how and why this matters to your business. 

Bot Farms, Deepfakes, AI-Generated Public Sentiment & Financial Manipulation at Scale 

Here’s the scary part upfront: bot farms can manipulate online engagement to make specific topics (chosen by whoever pays for the bots) seem more important than they really are, artificially inflating lies, deception, and manipulation.

AI bots are actively being used to manipulate the public: what we think, what we care about, what makes us angry or upset, what we buy, and how we invest.

Using millions of fake accounts, these AI-powered bots can pass the Turing test, which means they can escape detection. 

And the big social platforms, like Facebook, Instagram, and even LinkedIn, know that this “coordinated inauthentic behavior” can and does fool their algorithms. 

A bot farm is a call-center-like operation consisting of hundreds, if not thousands, of phones and devices, all controlled by one AI-powered computer. 

As Schwartzman says: “The bot farm broadcasts coordinated likes, comments, and shares to make it seem as if a lot of people are excited or upset about something like a volatile stock, a global travesty, or celebrity gossip—even though they’re not.” 

These bot farms are increasingly sophisticated. Using SIM cards, mobile proxies, and IP geospoofing, these are not the crude AI bots that flooded Twitter and Facebook in the pre-Brexit, pre-Trump era.

No, these AI bots have evolved, learning from their Russian-funded digital ancestors. Even so, as numerous investigations have shown, it’s unlikely that Brexit or Trump’s first term as US President would have happened without an army of bots and the illegal work of Cambridge Analytica.

That’s what much simpler versions of these same fake profiles helped achieve in 2016. Now, in 2025, governments, the media, society, and businesses face more effective and dangerous threats. 

Adam Sohn, CEO of Narravance, a social media threat intelligence firm with major social networks as clients, said: “It’s very difficult to distinguish between authentic activity and inauthentic activity. It’s hard for us, and we’re one of the best at it in the world.”

Bot farms and the manipulation of financial markets 

In the article, Schwartzman cites a historic financial-sector example to show that this is nothing new. Even with numerous safeguards, financial markets are open to manipulation, and if a stock can be manipulated, someone always stands to gain.

In the run-up to the 1929 crash and the Great Depression, President JFK’s father, Joseph P. Kennedy (a notorious financier and bootlegger turned philanthropist, one-time Ambassador to the UK, and a man with Nazi leanings before the war), enriched himself further through the shameless manipulation of public sentiment and financial markets.

Through telegrams, letters, phone calls, and fake newspaper reports, Kennedy and a pool of wealthy investors used the “pump and dump” method to dramatically and artificially inflate the value of Radio Corp. of America shares.

Naturally, Kennedy and his friends made a killing. He was rewarded with the chairmanship of the SEC by President Franklin D. Roosevelt in 1934.

Joseph P. Kennedy: A master manipulator of financial markets. No bot farms needed in 1929!

These days, it’s AI-powered bots manipulating people into buying stocks. Across Reddit, Discord, and X, fake accounts are churning out millions of messages connecting stock ticker symbols with slang signals like “c’mon fam,” “buy the dip,” “load up now”, and “keep pushing.”

As Adam Wasserman, CFO of Narravance, said in the article: “There’s no technical indicator. There are just bots posting things like ‘this stock’s going to the moon’ and ‘greatest stock, pulling out of my 401k.’ But they aren’t real people. It’s all fake.”

One example is the Goblin Gang, a group of financial influencers, or “finfluencers”, who used exactly these tactics. They profited by urging others to buy thinly traded, low-float stocks through social media manipulation.

How do bot farms work?

Based on his research, Schwartzman goes on to say: “In a world where all information is now suspect and decisions are based on sentiment, bot farm amplification has democratized market manipulation. But stock trading is only one application. Anyone can use bot farms to influence how we invest, make purchasing decisions, or vote.”

“These are the same strategies behind propaganda efforts pioneered by Russia and the Islamic State to broadcast beheadings and sway elections. But they’ve been honed to sell stocks, incite riots, and, allegedly, even tarnish celebrity reputations. It’s the same song, just different lyrics.”

Social networks aren’t helping this situation. If anything, they’ve either given up or are complicit in online manipulation.

Using his research, Schwartzman explains how this works, which we’ve summarised below:

  • Commercial bot farms, usually based in countries like Vietnam, promote “growth marketing services” or “social media marketing panels” on platforms like Fiverr and Upwork.
  • Businesses, consumers, and, shall we say, unfriendly governments can buy these services for a penny per action: a comment, a like, a follow, a video view, or a website visit.
  • These AI bot accounts can sit dormant within online communities before they’re activated.
  • Upon activation, pre-programmed, AI-powered workflows kick in, designed to make the accounts appear real and thereby fool the platforms’ algorithms.
  • Bots are programmed to post within natural time zones. Unlocking phones and setting the clock to EST or PST means a bot will post in US English and within US time zones.
  • Posts are meant to appear natural. One common technique is copypasta: near-identical blocks of text repeated across many accounts (see the sketch after this list).
  • This has become increasingly more sophisticated thanks to AI-based large language models (LLMs) like ChatGPT, Gemini, and Claude. Now, AI bots that have integrated into online communities can quickly and naturally sound like “a 35-year-old libertarian schoolteacher from the Northwest, or a MAGA auto mechanic from the Dakotas.”
  • AI image generation is enhancing this further. In every way, shape, and form, AI bots are getting smarter and more deeply integrated, and this manipulation is everywhere.
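
To make the copypasta point more concrete, here is a minimal, illustrative Python sketch of how coordinated near-duplicate posting could be flagged. To be clear, this is our own simplification for explanation only, not Schwartzman’s method or any platform’s actual detection system, and the post format, similarity threshold, and account counts are hypothetical assumptions.

```python
# Minimal, illustrative sketch: flagging "copypasta"-style coordination by
# grouping near-duplicate posts that appear across many distinct accounts.
# This is a simplification for explanation only, not a production detector,
# and the post format, threshold, and account counts are hypothetical.
from difflib import SequenceMatcher


def normalise(text: str) -> str:
    """Lowercase and collapse whitespace so trivial edits don't hide duplicates."""
    return " ".join(text.lower().split())


def similarity(a: str, b: str) -> float:
    """Rough string similarity in [0, 1] using only the standard library."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()


def find_copypasta(posts, sim_threshold=0.85, min_accounts=5):
    """posts: list of dicts like {"account": "user123", "text": "..."}.
    Returns clusters of near-identical posts pushed by at least min_accounts accounts."""
    clusters = []  # each cluster: {"example": str, "accounts": set of account names}
    for post in posts:
        placed = False
        for cluster in clusters:
            if similarity(post["text"], cluster["example"]) >= sim_threshold:
                cluster["accounts"].add(post["account"])
                placed = True
                break
        if not placed:
            clusters.append({"example": post["text"], "accounts": {post["account"]}})
    # Only clusters repeated by many distinct accounts look like coordinated copypasta.
    return [c for c in clusters if len(c["accounts"]) >= min_accounts]


if __name__ == "__main__":
    sample = [
        {"account": f"user{i}", "text": "C'mon fam, $XYZ is going to the moon, load up now!"}
        for i in range(6)
    ] + [{"account": "real_person", "text": "Had a nice walk in the park today."}]
    for cluster in find_copypasta(sample):
        print(f"{len(cluster['accounts'])} accounts pushing: {cluster['example']}")
```

Real detection systems rely on far richer signals than text similarity alone, but the underlying idea of spotting the same message pushed by many supposedly different accounts is the same.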

Schwartzman’s article cites numerous examples of proven bot-based traffic, chatter, and manipulation across other share price surges, celebrity reputational manipulation, and in wars and conflicts, in Ukraine, Israel, and worldwide. 

Now, the question is: How does this impact your company, and what can we do about it?

Key Takeaways: Bot farms, AI manipulation, and financial services 

These problems aren’t easy to solve, and governments and social networks will need to play an active role in addressing them.

However, there are a few things worth considering for financial services companies. Don’t worry if your website and social media channels aren’t getting a lot of traffic, clicks, and engagement.

Most likely, that means your website and channels are safe from an infestation of bots. This might sound counterintuitive, but a sudden surge in traffic or a post “going viral” is not necessarily a good sign in these bot-infested times. 

Genuine engagement is the ONLY metric worth monitoring. A few of the best indicators that the right people are viewing your content are newsletter sign-ups, sales leads, and report downloads.

Use bot-proof sign-up and download forms, and consider a dedicated bot-protection service such as DataDome for your website, to prevent bots from joining mailing lists and getting inside your systems.

It’s also worth using antivirus software and anti-bot sentiment analysis tools across all of your channels, your website, and your newsletter to prevent a bot infestation. No company is immune, especially in the financial services sector.
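
DataDome and similar services are commercial products with their own integrations, and we won’t reproduce their APIs here. As a rough illustration of the kind of checks bot-resistant forms rely on, the Python sketch below combines two simple, widely used techniques: a hidden “honeypot” field that human visitors never fill in, and a minimum time-to-submit check, since bots tend to submit forms almost instantly. The field names and threshold are hypothetical, and a real deployment would layer this with a dedicated bot-protection service.

```python
# Minimal, illustrative sketch of two common anti-bot checks for a sign-up form:
# a hidden "honeypot" field and a minimum time-to-submit threshold.
# This is a conceptual example only; it is not DataDome's API or any vendor's product,
# and the field names and threshold are hypothetical.
import time

MIN_SECONDS_TO_SUBMIT = 3.0  # assumed threshold: real visitors rarely submit faster


def is_probably_bot(form_data: dict, form_rendered_at: float) -> bool:
    """form_data: submitted fields, e.g. {"email": "...", "website": ""}, where
    "website" is a hidden honeypot field that real users never see or fill in.
    form_rendered_at: UNIX timestamp recorded when the form was served."""
    # 1. Honeypot check: any value in the hidden field strongly suggests automation.
    if form_data.get("website", "").strip():
        return True
    # 2. Timing check: forms submitted almost instantly are suspicious.
    if time.time() - form_rendered_at < MIN_SECONDS_TO_SUBMIT:
        return True
    return False


if __name__ == "__main__":
    rendered_at = time.time() - 10  # pretend the form was shown 10 seconds ago
    human = {"email": "investor@example.com", "website": ""}
    bot = {"email": "spam@example.com", "website": "http://spam.example"}
    print(is_probably_bot(human, rendered_at))  # False
    print(is_probably_bot(bot, rendered_at))    # True
```

Checks like these won’t stop sophisticated bot farms on their own, but they filter out a lot of low-effort automation before it ever reaches your mailing list.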

We can’t ignore the threat of bots, AI, deepfakes, and bot farms. 

But we can focus our efforts where it will make a positive impact. 

Now more than ever, succeeding with online marketing, content, and brand building means giving your audience, customers, and potential customers the following:

  • An easy way to verify that anything online using your name/logo is authentic and genuine 
  • A reason to trust and keep trusting your brand
  • In-person events and high levels of customer service 
  • Video interviews and more in-depth content
  • Humanity

Your customers, stakeholders, and potential customers want human, real-life, in-person events and high-quality, clearly authentic content, like video interviews and reports, A LOT more than cheap, AI-generated social media posts or other pieces of content.

Want a Free Tailored Sample & Marketing Analysis?

Now more than ever, alternative asset firms need to stand out and get noticed. You need investors to clearly see the advantages of investing in your offering. 

You can’t sit out another cycle waiting for investors to notice you. We can help you with that. 

So you can see what we can do, we are offering a free, no-obligation marketing analysis and a bespoke sample of copy.

Sound interesting? Email us at admin@fintechcontent.marketing and we will get started for you.