In May, several French and German social media influencers received an odd offer.
A London-based public relations agency wanted to pay them to promote messages on behalf of a client. A three-page document detailed what to say and on which platforms.
But the agency asked the influencers to push not beauty products or vacation packages, as is typical, but falsehoods tarring Pfizer-BioNTech’s Covid-19 vaccine. Stranger still, the agency, Fazze, claimed a London address where there is no evidence any such company exists.
Some influencers posted screenshots of the offer. Exposed, Fazze deleted its social media accounts. That same week, Brazilian and Indian influencers posted videos echoing Fazze’s script to hundreds of thousands of viewers.
The plan appears to be part of a secretive industry that security analysts and American officials say is exploding in scale: disinformation for hire.
Private firms, straddling the shadowy gray zone between traditional marketing and geopolitical influence operations, sell services once conducted mainly by intelligence agencies.
They sow discord, meddle in elections, seed false narratives and push viral conspiracies, mostly on social media. And they offer clients something precious: deniability.
“Disinformation-for-hire actors employed by governments, or by actors adjacent to governments, are a growing and serious phenomenon,” said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, calling it “a boom industry.”
Similar campaigns have recently been found promoting India’s ruling party, Egyptian foreign policy aims, and political figures in Bolivia and Venezuela.
Mr. Brookie’s organization tracked one that was operating in the middle of a mayoral race in Serra, a small Brazilian city. An ideologically promiscuous Ukrainian firm boosted several rival political parties.
In the Central African Republic, two separate operations flooded social media with dueling pro-French and pro-Russian disinformation as the two powers compete for influence in the country.
A wave of seemingly organic anti-American posts in Iraq was traced to a public relations firm that was separately accused of faking anti-government sentiment in Israel.
Most trace to back-alley firms whose legitimate services resemble those of a bottom-rate marketer or email spammer.
Job postings and employee LinkedIn profiles associated with Fazze identify it as a subsidiary of a Moscow-based company called Adnow, and some Fazze web domains are registered as owned by Adnow, as first reported by the German outlets Netzpolitik and ARD Kontraste. Third-party reviews portray Adnow as a struggling ad service provider.
European authorities say they are investigating who directed Adnow. Sections of Fazze’s anti-Pfizer talking points resemble promotional materials for Russia’s Sputnik V vaccine.
For-hire disinformation, though only sometimes effective, is growing more sophisticated as practitioners iterate and learn. Experts say it is becoming more common in every part of the world, outpacing operations conducted directly by governments.
The result is an accelerating rise in polarizing conspiracies, phony citizen groups and fabricated public sentiment, deteriorating our shared reality beyond even the depths of recent years.
The trend emerged after the Cambridge Analytica scandal in 2018, experts say. Cambridge, a political consulting firm linked to members of Donald J. Trump’s 2016 presidential campaign, was found to have harvested data on millions of Facebook users.
The controversy drew attention to methods common among social media marketers. Cambridge used its data to target hyper-specific audiences with tailored messages. It tested what resonated by tracking likes and shares.
The episode taught a generation of consultants and opportunists that there was big money in social media marketing for political causes, all disguised as organic activity.
Some newcomers eventually reached the same conclusion that Russian operatives had in 2016: disinformation performs especially well on social platforms.
At the same time, the backlash against Russia’s influence operations demonstrated the power of such campaigns, even as it left governments wary of being caught running them directly.
“Unfortunately, there’s a huge market demand for disinformation,” Mr. Brookie said, “and there are plenty of places in the ecosystem willing to meet that demand.”
Commercial firms conducted disinformation-for-hire campaigns in at least 48 countries last year, nearly double the year before, according to an Oxford University study. The researchers identified 65 companies offering such services.
Last summer, Facebook removed a network of Bolivian citizen groups and journalistic fact-checking organizations. The pages, which promoted falsehoods supporting the country’s right-wing government, were fake.
Stanford University researchers traced the content to CLS Strategies, a Washington-based communications firm that had registered as a consultant to the Bolivian government. The firm had done similar work in Venezuela and Mexico.
A spokesperson pointed to the company’s statement last year saying that its regional chief had been placed on leave, but disputed Facebook’s accusation that the work qualified as foreign interference.
A corrosive reality
New technology enables nearly anyone to get involved. Software mass-produces fake accounts with profile photos that are hard to distinguish from those of real people. Instant metrics help refine effective messaging, as does easy access to users’ personal data, which can be purchased in bulk.
The campaigns are rarely as sophisticated as those of government hackers or specialized firms like the Kremlin-backed Internet Research Agency.
But they appear to be cheap. In countries that mandate campaign finance transparency, firms report billing tens of thousands of dollars for campaigns that also include traditional consulting services.
The layer of deniability frees governments to sow disinformation at home and abroad more aggressively than might otherwise be worth the risk. Some contractors, when caught, have claimed that they acted without their clients’ knowledge, or only to win future business.
The platforms have stepped up efforts to root out coordinated disinformation. Analysts especially credit Facebook, which publishes detailed reports on the campaigns it disrupts.
Still, some argue that the social media companies also play a role in worsening the threat. Engagement-boosting algorithms and design elements, research finds, often favor divisive and conspiratorial content.
Political norms have shifted, too. A generation of populist leaders, like Rodrigo Duterte of the Philippines, rose in part through social media manipulation. Once in office, many institutionalized those methods as tools of governance and foreign relations.
In India, dozens of government-run Twitter accounts have shared posts from India Vs Disinformation, a website and set of social media feeds that purport to fact-check news reports about India.
India Vs Disinformation is, in reality, the product of a Canadian communications firm called Press Monitor.
Nearly all of the posts seek to discredit or muddy reports unfavorable to Prime Minister Narendra Modi’s government, including on the country’s Covid-19 toll. An associated site promotes pro-Modi narratives under the guise of news articles.
A Digital Forensic Research Lab report investigating the network called it an “important case study” in the rise of “disinformation campaigns in democracies.”
A Press Monitor representative, who would identify himself only as Abhay, described the report as completely false.
He specified only that the report had misidentified his firm as Canada-based. Asked why the company lists a Toronto address and a Canadian tax registration, describes itself as “part of Toronto’s emerging technology ecosystem,” and why he had been reached on a Toronto phone number, he said that he had business in many countries. He did not respond to an email asking for clarification.
A LinkedIn profile for Abhay Aggarwal identifies him as the Toronto-based chief executive of Press Monitor and says the company’s services are used by the Indian government.
‘Spamouflage’
A series of pro-Beijing operations point to the field’s capacity for rapid evolution.
Graphika, a digital research firm, has since 2019 tracked a network it nicknamed “Spamouflage” for its early reliance on spamming social platforms with content echoing Beijing’s line on geopolitical issues. Most of those posts received little or no engagement.
In recent months, however, the network has developed hundreds of accounts with elaborate personas. Each has its own profile and posting history that can seem authentic, appearing to belong to people from many different countries and walks of life.
Graphika traced the accounts to a Bangladeshi content farm that created them in bulk and probably sold them to a third party.
The network pushes strident criticism of Hong Kong democracy activists and American foreign policy. By coordinating without appearing to, it created an impression of organic shifts in public opinion, and it often drew attention.
The accounts were amplified by a major media network in Panama, prominent politicians in Pakistan and Chile, Chinese-language YouTube pages, the left-wing British commentator George Galloway and a number of Chinese diplomatic accounts.
A separate pro-Beijing network, uncovered by a Taiwanese investigative outlet called The Reporter, operated hundreds of Chinese-language websites and social media accounts.
Disguised as news sites and citizen groups, they promoted Taiwanese reunification with mainland China and denigrated Hong Kong’s protesters. The Reporter found links between the pages and a Malaysia-based startup that offered web users Singapore dollars to promote the content.
But governments may find that outsourcing such shadowy work carries its own risks, Mr. Brookie said. For one, the firms are harder to control and might veer into undesired messages or tactics.
For another, firms built around deception may be just as likely to turn those energies on their clients, padding budgets and billing for work that never happens.
“The end result is that scammers will continue to operate online,” he said.