Astroturfing

Astroturfing is the deceptive practice of hiding the sponsors of an orchestrated message or organization (e.g., political, advertising, religious, or public relations) to make it appear as though it originates from, and is supported by, unsolicited grassroots participants. It is a practice intended to give the statements or organizations credibility by withholding information about the source's financial backers.

The implication behind the use of the term is that instead of a "true" or "natural" grassroots effort behind the activity in question, there is a "fake" or "artificial" appearance of support.

Definition

Artificial grass produced by AstroTurf, which inspired the name "astroturfing" for creating a false impression of grassroots support

In political science, astroturfing is defined as the process of seeking electoral victory or legislative relief for grievances by helping political actors find and mobilize a sympathetic public, and it is designed to create the image of public consensus where there is none. [1] [2] Astroturfing is the use of fake grassroots efforts that primarily focus on influencing public opinion and are typically funded by corporations and political entities. [3]

On the internet, astroturfers use software to hide their identity. Sometimes one individual operates through many personas to give the impression of widespread support for their client's agenda. [4] [5] Some studies suggest astroturfing can alter public viewpoints and create enough doubt to inhibit action. [6] [7] In the first systematic study of astroturfing in the United States, Oxford professor Philip N. Howard argued that the internet was making it much easier for powerful lobbyists and political movements to activate small groups of aggrieved citizens and give them an exaggerated importance in public policy debates. [2] Astroturfed accounts on social media do not always require humans to write their posts; one January 2021 study detailed a "set of human-looking bot accounts" used to post political content, which operated automatically for fourteen days (making 1,586 posts) before being detected and suspended by Twitter. [8] Twitter trends are often targeted by astroturfing because they are used as a proxy for popularity. A study by researchers at EPFL reported that 20% of global Twitter trends in 2019 were fake, created automatically by fake and compromised accounts that tweeted in a coordinated way to mimic the grassroots organizing of regular Twitter users. [9]
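The EPFL researchers call these manufactured trends "ephemeral astroturfing" because, in their account, the coordinated tweets that push a phrase into the trending list are typically deleted shortly afterward. The sketch below illustrates only that core intuition and is not the researchers' actual pipeline; the record format, field names, and thresholds are assumptions made for the example.

    from collections import defaultdict
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Tweet:
        account: str
        hashtag: str
        posted_at: float                     # seconds since epoch
        deleted_at: Optional[float] = None   # None if the tweet was never deleted

    def suspicious_trends(tweets, burst_window=60.0, min_tweets=50,
                          deleted_share=0.7, max_lifetime=600.0):
        """Flag hashtags whose opening burst of tweets is tightly packed in time
        and mostly deleted soon after posting (an 'ephemeral' push)."""
        by_tag = defaultdict(list)
        for t in tweets:
            by_tag[t.hashtag].append(t)

        flagged = []
        for tag, tag_tweets in by_tag.items():
            tag_tweets.sort(key=lambda t: t.posted_at)
            start = tag_tweets[0].posted_at
            burst = [t for t in tag_tweets if t.posted_at - start <= burst_window]
            if len(burst) < min_tweets:
                continue
            deleted_fast = [t for t in burst
                            if t.deleted_at is not None
                            and t.deleted_at - t.posted_at <= max_lifetime]
            if len(deleted_fast) / len(burst) >= deleted_share:
                flagged.append(tag)
        return flagged

In practice a heuristic like this would be combined with account-level signals (account age, follower counts, posting cadence) before anything were labeled astroturfing.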

Policies and enforcement

Many countries have laws that prohibit more overt astroturfing practices. [10] In the United States, the Federal Trade Commission (FTC) may send cease-and-desist orders or impose a fine of $16,000 per day on those who violate its "Guides Concerning the Use of Endorsements and Testimonials in Advertising". [10] [11] The FTC's guides were updated in 2009 to address social media and word-of-mouth marketing. [12] [13] According to an article in the Journal of Consumer Policy, the guides hold advertisers responsible for ensuring that bloggers and product endorsers comply with them, and any endorser with a material connection is required to provide honest reviews. [10]

In the European Union, the Unfair Commercial Practices Directive requires that paid-for editorial content in the media provide a clear disclosure that the content is a sponsored advertisement. [10] Additionally, it prohibits those with a material connection from misleading readers into thinking they are a regular consumer. [10]

The United Kingdom has the Consumer Protection from Unfair Trading Regulations, [14] which prohibit "falsely representing oneself as a consumer" and allow for up to two years in prison and unlimited fines for breaches. [10] Additionally, the advertising industry in the UK has adopted many voluntary codes, such as the Code of Non-broadcast Advertising, Sales Promotion and Direct Marketing. The industry's self-regulatory body, the Advertising Standards Authority, investigates complaints of breaches. The code requires that marketing professionals not mislead their audience, including by omitting a disclosure of their material connection. [10]

In Australia, astroturfing is regulated by Section 18 of the Australian Consumer Law, which broadly prohibits "misleading and deceptive conduct". According to the Journal of Consumer Policy, Australia's provisions, introduced in 1975, are vaguer than their overseas counterparts; in most cases they are enforced through lawsuits brought by competitors rather than by the regulator, the Australian Competition & Consumer Commission. [10] Consumer protection authorities also cooperate internationally through the International Consumer Protection and Enforcement Network (ICPEN). [15]

Legal regulations primarily target testimonials, endorsements and statements about the performance or quality of a product. Employees of an organization may be treated as acting as ordinary customers if their actions are not directed by the company. [15]

In October 2018, after denying that it had paid people to show up in support of a controversial power plant project in New Orleans, Entergy was fined five million dollars for using the astroturfing firm The Hawthorn Group to supply paid actors who crowded real community members out of city council meetings and created the appearance of grassroots support. [16]

Debate

Effectiveness

In the book Grassroots for Hire: Public Affairs Consultants in American Democracy, Edward Walker defines "astroturfing" as public participation that is perceived as heavily incentivized, as fraudulent (claims are attributed to people who did not make them), or as an elite campaign masquerading as a mass movement. [17] Although not all campaigns by professional grassroots lobbying consultants meet this definition, the book finds that elite-sponsored grassroots campaigns often fail when they are not transparent about their sources of sponsorship or do not develop partnerships with constituencies that have an independent interest in the issue. Walker highlights the case of Working Families for Wal-Mart, in which the campaign's lack of transparency led to its demise.

A study published in the Journal of Business Ethics examined the effects of websites operated by front groups on students. It found that astroturfing was effective at creating uncertainty and lowering trust in claims, thereby shifting perceptions in ways that favor the business interests behind the astroturfing effort. [3] The New York Times reported that "consumer" reviews are more effective because "they purport to be testimonials of real people, even though some are bought and sold just like everything else on the commercial Internet." [18] Some organizations feel that their business is threatened by negative comments, so they may engage in astroturfing to drown those comments out. [19] Online comments from astroturfing employees can also sway the discussion through the influence of groupthink. [20]

Justification

Some astroturfing operatives defend their practice. [21] Regarding "movements that have organized aggressively to exaggerate their sway", author Ryan Sager said that this "isn't cheating. Doing everything in your power to get your people to show up is basic politics." [22] According to a Porter/Novelli executive, "There will be times when the position you advocate, no matter how well framed and supported, will not be accepted by the public simply because you are who you are." [23]

Impact on society

Data-mining expert Bing Liu (University of Illinois Chicago) estimated that one-third of all consumer reviews on the Internet are fake. [18] According to The New York Times, this has made it hard to tell the difference between "popular sentiment" and "manufactured public opinion". [24] According to an article in the Journal of Business Ethics, astroturfing threatens the legitimacy of genuine grassroots movements. The authors argued that astroturfing that is "purposefully designed to fulfill corporate agendas, manipulate public opinion and harm scientific research represents a serious lapse in ethical conduct." [3] A 2011 report found that paid posters from competing companies often attack each other in forums, overwhelming regular participants in the process. [25] George Monbiot said that persona-management software supporting astroturfing "could destroy the Internet as a forum for constructive debate". [26] An article in the Journal of Consumer Policy said that regulators and policy makers needed to be more aggressive about astroturfing. The author said that it undermines the public's ability to warn potential customers about sub-standard products or inappropriate business practices, but also noted that fake reviews were difficult to detect. [10]

Techniques

Use of one or more front groups is one astroturfing technique. These groups typically present themselves as serving the public interest, while actually working on behalf of a corporate or political sponsor. [27] Front groups may resist legislation and scientific consensus that is damaging to the sponsor's business by emphasizing minority viewpoints, instilling doubt and publishing counterclaims by corporate-sponsored experts. [3] Fake blogs can also be created that appear to be written by consumers, while actually being operated by a commercial or political interest. [28] Some political movements have provided incentives for members of the public to send a letter to the editor at their local paper, often using a copy and paste form letter that is published in dozens of newspapers verbatim. [29]

Another technique is the use of sockpuppets, in which a single person creates multiple identities online to give the appearance of grassroots support. Sockpuppets may post positive reviews about a product, attack participants who criticize the organization, or post negative reviews and comments about competitors, all under fake identities. [19] [30] Astroturfing businesses may pay staff based on the number of posts they make that are not flagged by moderators. [25] Persona management software may be used so that each paid poster can manage five to seventy convincing online personas without getting them confused. [26] [31] Online astroturfing with sockpuppets is a form of Sybil attack, in which one operator controls many pseudonymous identities within the same system.
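As a rough illustration of why sockpuppet review rings leave detectable traces, the hypothetical sketch below flags accounts whose ratings are perfectly polarized by vendor: uniformly glowing for a single company and uniformly hostile toward everyone else. The record format and thresholds are invented for the example; real platforms rely on far richer signals, such as IP addresses, device fingerprints, and posting times.

    from collections import defaultdict

    def polarized_reviewers(reviews, min_reviews=5):
        """reviews: iterable of (account, vendor, rating) tuples, ratings 1-5.
        Flag accounts that only ever give one vendor top marks and every other
        vendor bottom marks -- a crude sockpuppet signature."""
        by_account = defaultdict(list)
        for account, vendor, rating in reviews:
            by_account[account].append((vendor, rating))

        flagged = []
        for account, rows in by_account.items():
            if len(rows) < min_reviews:
                continue
            praised = {vendor for vendor, rating in rows if rating == 5}
            panned = {vendor for vendor, rating in rows if rating == 1}
            nuanced = any(rating not in (1, 5) for _, rating in rows)
            # Exactly one favoured vendor, everything else panned, no nuance at all.
            if not nuanced and len(praised) == 1 and panned and praised.isdisjoint(panned):
                flagged.append(account)
        return flagged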

Pharmaceutical companies may sponsor patient support groups and simultaneously push them to help market their products. [32] Bloggers who receive free products, paid travel or other compensation may also be considered to be astroturfing if those gifts are not disclosed to the reader. [33] Analysts who cover their own clients without disclosing the financial connection may likewise be considered to be astroturfing. To avoid this, many organizations and press outlets have policies about gifts, accommodations and disclosures. [34]

Detection

Persona management software can automatically age accounts and simulate activity such as attending a conference, making the fake personas appear more genuine. [35] At HBGary, employees were given separate thumb drives containing the online accounts for individual identities, along with visual cues to remind the employee which identity they were using at the time. [35]

Mass letters may be printed on personalized stationery using different typefaces, colors and words to make them appear personal. [36]

According to an article in The New York Times, the Federal Trade Commission rarely enforces its astroturfing laws. [18] Operations are frequently detected if their profile images are recognized [37] or if they are identified through the usage patterns of their accounts. [25] Filippo Menczer's group at Indiana University developed software in 2010 that detects astroturfing on Twitter by recognizing behavioral patterns. [38] [39] [40]
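The Indiana University system ("Truthy") is not reproduced here; the following is only a minimal sketch of the general idea behind behavioral-pattern detection: group near-identical messages posted by different accounts within a short time window and count how often the same pairs of accounts co-occur. The normalization, field names, and thresholds are assumptions for illustration.

    import itertools
    import re
    from collections import Counter, defaultdict

    def normalize(text):
        """Lowercase and strip URLs and punctuation so near-duplicates collide."""
        text = re.sub(r"https?://\S+", "", text.lower())
        return " ".join(re.sub(r"[^a-z0-9\s#@]", "", text).split())

    def coordinated_pairs(posts, window=300.0, min_shared=3):
        """posts: iterable of (account, text, timestamp_in_seconds).
        Return pairs of accounts that repeatedly post the same normalized text
        within `window` seconds of each other."""
        by_text = defaultdict(list)
        for account, text, ts in posts:
            by_text[normalize(text)].append((account, ts))

        pair_counts = Counter()
        for entries in by_text.values():
            entries.sort(key=lambda entry: entry[1])
            for (a1, t1), (a2, t2) in itertools.combinations(entries, 2):
                if a1 != a2 and t2 - t1 <= window:
                    pair_counts[tuple(sorted((a1, a2)))] += 1

        return [pair for pair, count in pair_counts.items() if count >= min_shared]

Account pairs that surface repeatedly would then be reviewed together with profile-level evidence (such as the reused profile images and account usage patterns mentioned above) rather than suspended automatically.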

Business and adoption

According to an article in the Journal of Consumer Policy, academics disagree on how prolific astroturfing is. [10]

According to Nancy Clark from Precision Communications, grass-roots specialists charge $25 to $75 for each constituent they convince to send a letter to a politician. [36] Paid online commentators in China are purportedly paid 50 cents for each online post that is not removed by moderators, [25] leading to the nickname of the "50-cent party". [20] The New York Times reported that a business selling fake online book reviews charged $999 for 50 reviews and made $28,000 a month shortly after opening. [18]

According to the Financial Times , astroturfing is "commonplace" in American politics, but was "revolutionary" in Europe when it was exposed that the European Privacy Association, an anti-privacy "think-tank", was actually sponsored by technology companies. [41]

History of incidents

Origins

Although the term "astroturfing" had not yet been coined, an early example of the practice appears in Act 1, Scene 2 of Shakespeare's play Julius Caesar, in which Gaius Cassius Longinus writes fake letters from "the public" to convince Brutus to assassinate Julius Caesar. [15]

The term "astroturfing" was coined in 1985 by Lloyd Bentsen, a Democratic U.S. senator from Texas, when he said, "a fellow from Texas can tell the difference between grass roots and AstroTurf... this is generated mail." [15] [42] Bentsen was describing a "mountain of cards and letters" sent to his office to promote insurance industry interests. [43]

Pharmaceuticals

Patient advocacy groups funded by biopharmaceutical companies are common. [44] [45] In 1997, Schering-Plough paid the public relations firm Schandwick International to create a national coalition of patient advocacy groups promoting Schering's Rebotron, a treatment for hepatitis C. The groups pushed increased testing as a way to manufacture cases and lobbied state legislatures to cover the $18,000 treatment. The groups also hosted telephone "information lines" with scripts written by the drug company and distributed "patient information" pamphlets that promoted drug therapies over other alternatives and overstated the danger of the condition. [46] Manufacturers of AIDS drugs commonly fund LGBTQ organizations, which, in turn, lobby to advance policies that increase AIDS drug sales. In 2019, the communications director of AIDS United, a Washington, D.C.–based coalition of AIDS service organizations, resigned, stating that such funding creates conflicts of interest among gay rights activists. [47]

Tobacco

In response to the passage of tobacco control legislation in the US, Philip Morris, Burson-Marsteller and other tobacco interests created the National Smokers Alliance (NSA) in 1993. The NSA and other tobacco interests ran an aggressive public relations campaign from 1994 to 1999 to exaggerate the appearance of grassroots support for smokers' rights. According to an article in the Journal of Health Communication, the NSA had mixed success at defeating bills that threatened the revenues of tobacco interests. [48]

Internet

Email, automated phone calls, form letters, and the Internet made astroturfing more economical and prolific in the late 1990s. [26] [42] In 2001, as Microsoft was defending itself against an antitrust lawsuit, Americans for Technology Leadership (ATL), a group heavily funded by Microsoft, initiated a letter-writing campaign. ATL contacted constituents under the guise of conducting a poll and sent pro-Microsoft consumers form and sample letters to send to involved lawmakers. The effort was designed to make it appear as though there was public support for a sympathetic ruling in the antitrust lawsuit. [36] [49]

In January 2018, YouTube user Isaac Protiva uploaded a video alleging that internet service provider Fidelity Communications was behind an initiative called "Stop City-Funded Internet", based on how some images on the Stop City-Funded Internet website had "Fidelity" in their file names. [50] The campaign appeared to be in response to the city of West Plains expanding their broadband network, and advocated for the end of municipal broadband on the basis that it was too risky. [51] [52] Days later, Fidelity released a letter admitting to sponsoring the campaign. [53]

Politics

In 2009–2010, researchers at Indiana University developed a software system to detect astroturfing on Twitter, prompted by the sensitivity of the topic in the run-up to the 2010 U.S. midterm elections and by account suspensions on the platform. The study cited a limited number of examples, all promoting conservative policies and candidates. [38] [39] [40]

In 2003, GOPTeamLeader.com offered the site's users "points" that could be redeemed for products if they signed a form letter promoting George W. Bush and got a local paper to publish it as a letter to the editor. More than 100 newspapers published the identical letter from the site with different signatures on it. Similar campaigns were run by GeorgeWBush.com, and by MoveOn.org to promote Michael Moore's film Fahrenheit 9/11. [29] [54] The Committee for a Responsible Federal Budget's "Fix the Debt" campaign advocated reducing government debt without disclosing that its members were lobbyists or high-ranking employees at corporations that aim to reduce federal spending. [55] [56] It also supplied pre-written op-eds to students, which were published as-is. [57]

Some organizations in the Tea Party movement have been accused of being astroturfed. [58]

In October and November 2018, conservative marketing firm Rally Forge created what The New Yorker described as "a phony left-wing front group, America Progress Now, which promoted Green Party candidates online in 2018, apparently to hurt Democrats in several races." [59] Its ads on Facebook used socialist memes and slogans to attack Democrats and urge third-party protest voting in several tight races, including the 2018 Wisconsin gubernatorial election. [60] [61]

In 2018, a website called "Jexodus", claiming to be by "proud Jewish Millennials tired of living in bondage to leftist politics", was set up by Jeff Ballabon, a Republican operative in his mid-50s. The website was denounced as "likely a clumsy astroturf effort rather than an actual grassroots movement". [62] [63] [64] [65] The website was registered on November 5, 2018, before the congressional election and before the representatives later accused of antisemitism had even been elected. [65] The website was later cited by Donald Trump as though it were an authentic movement. [62]

In January 2021, a team led by Mohsen Mosleh conducted a politically oriented astroturfing campaign on Twitter, using "a set of human-looking bot accounts"; each bot would search for users posting links the researchers considered to be fake news, and "tweet a public reply message to the user's tweet that contained the link to the false story". 1,586 spam replies were made over the course of fourteen days, until Twitter detected and suspended all of the bot accounts. [8]

Environment

The Koch brothers funded a public advocacy group that opposed the development of the Cape Wind offshore wind farm in Massachusetts; members of the Kennedy family were also involved in the opposition. [66] [67] [68] [69] [70]

Corporate efforts to mobilize the public against environmental regulation accelerated in the US following the election of president Barack Obama. [71]

In 2014, the Toronto Sun, a conservative media organization, published an article accusing Russia of using astroturf tactics to drum up anti-fracking sentiment across Europe and the West, supposedly in order to maintain its dominance in oil exports through Ukraine. [72]

In Canada, a coalition of oil and gas executives organized under the Canadian Association of Petroleum Producers initiated a series of campaigns to advocate for the Canadian oil and gas industry through mainstream and social media, using online campaigning to generate public support for fossil fuel energy projects. [73]

Commercial

In 2006, two Edelman employees created a blog called "Wal-Marting Across America" about two people traveling to Wal-Marts across the country. The blog gave the appearance of being operated by spontaneous consumers, but was actually operated on behalf of Working Families for Walmart, a group funded by Wal-Mart. [74] [75] In 2007, Ask.com deployed an anti-Google advertising campaign portraying Google as an "information monopoly" that was damaging the Internet. The ad was designed to give the appearance of a popular movement and did not disclose it was funded by a competitor. [76]

In 2010, the Federal Trade Commission settled a complaint with Reverb Communications, which had been using interns to post favorable reviews of clients' products in Apple's iTunes store. [77] In September 2012, one of the first major identified cases of astroturfing in Finland involved criticism of the cost of a €1.8 billion patient information system, which was defended by fake online identities operated by the vendors involved. [37] [78]

In September 2013, New York Attorney General Eric T. Schneiderman announced a settlement with 19 companies to prevent astroturfing. "'Astroturfing' is the 21st century's version of false advertising, and prosecutors have many tools at their disposal to put an end to it," said Schneiderman. The companies paid $350,000 to settle the matter, but the settlement opened the way for private suits as well. "Every state has some version of the statutes New York used," according to lawyer Kelly H. Kolb. "What the New York attorney general has done is, perhaps, to have given private lawyers a road map to file suit." [79] [80]

State-sponsored

The Al Jazeera TV series The Lobby documented Israel's attempts to promote more friendly, pro-Israel rhetoric and influence the attitudes of British youth, partly by influencing established political bodies such as the National Union of Students and the Labour Party, but also by creating new pro-Israel groups whose affiliation with the Israeli administration was kept secret. [81] [82]

In 2008, an expert on Chinese affairs, Rebecca MacKinnon, estimated the Chinese government employed 280,000 people in a government-sponsored astroturfing operation to post pro-government propaganda on social media and drown out voices of dissent. [25] [83]

In June 2010, the United States Air Force solicited for "persona management" software that would "enable an operator to exercise a number of different online persons from the same workstation and without fear of being discovered by sophisticated adversaries. Personas must be able to appear to originate in nearly any part of the world and can interact through conventional online services and social media platforms..." [84] The $2.6 million contract was awarded to Ntrepid for astroturfing software the military would use to spread pro-American propaganda in the Middle East, and disrupt extremist propaganda and recruitment. The contract is thought to have been awarded as part of a program called Operation Earnest Voice, which was first developed as a psychological warfare weapon against the online presence of groups ranged against coalition forces. [26] [85] [86] [87]

On April 11, 2022, seven weeks into the 2022 Russian invasion of Ukraine, the BBC published the results of an investigation into a network of Facebook groups whose overall aim was to promote the Russian president Vladimir Putin as a hero standing up to the West with overwhelming international support. Researchers analyzed the members, activities, and interrelations of 10 pro-Putin public groups, which had more than 650,000 members between them at the time of writing and bore names such as "Vladimir Putin - Leader of the Free World". Over one month, the researchers counted 16,500 posts that received more than 3.6 million interactions. The campaign "creates the appearance of widespread support for Putin and the Kremlin in the shadow of the invasion and relies on... inauthentic accounts to accomplish its goal", according to a report. Lead researcher Moustafa Ayad described the network, and its practice of using tens of duplicate accounts in potential violation of Facebook's rules on inauthentic behavior, as an example of astroturfing. [88] [89]

References

  1. Howard, Philip N. (2003). "Digitizing the Social Contract: Producing American Political Culture in the Age of New Media". The Communication Review . 6 (3): 213–45. doi:10.1080/10714420390226270. S2CID   145413399. Archived from the original on November 16, 2023. Retrieved September 30, 2020.
  2. Howard, Philip (2005). New Media Campaigns and the Managed Citizen. New York, NY: Cambridge University Press. pp. 93, 144. ISBN 9780521612272.
  3. Cho, Charles H.; Martens, Martin L.; Kim, Hakkyun; Rodrigue, Michelle (2011). "Astroturfing Global Warming: It Isn't Always Greener on the Other Side of the Fence". Journal of Business Ethics. 104 (4): 571–587. doi:10.1007/s10551-011-0950-6. ISSN 0167-4544. S2CID 154213597.
  4. Doctorow, Cory (February 18, 2011). "HBGary's high-volume astroturfing technology and the Feds who requested it". boingboing. Archived from the original on July 17, 2013. Retrieved June 28, 2013.
  5. Ludlow, Peter (June 18, 2013). "The Strange Case of Barrett Brown". The Nation . Archived from the original on June 27, 2013. Retrieved June 28, 2013.
  6. Lyon, Thomas P.; Maxwell, John W. (2004). "Astroturf: Interest Group Lobbying and Corporate Strategy" (PDF). Journal of Economics & Management Strategy . 13 (4): 561–597. doi:10.1111/j.1430-9134.2004.00023.x. hdl: 2027.42/74741 . S2CID   44209882. Archived from the original (PDF) on August 11, 2017. Retrieved July 12, 2019.
  7. Morales, Juan S. (2020). "Perceived Popularity and Online Political Dissent: Evidence from Twitter in Venezuela". International Journal of Press/Politics . 25: 5–27. doi: 10.1177/1940161219872942 . S2CID   203053725.
  8. Mosleh, Mohsen; Martel, Cameron; Eckles, Dean; Rand, David (May 6, 2021). "Perverse Downstream Consequences of Debunking: Being Corrected by Another User for Posting False Political News Increases Subsequent Sharing of Low Quality, Partisan, and Toxic Content in a Twitter Field Experiment". Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. pp. 1–13. doi:10.1145/3411764.3445642. ISBN 9781450380966. S2CID 233987905 – via ACM Digital Library.
  9. Elmas, Tuğrulcan; Overdorf, Rebekah; Özkalay, Ahmed Furkan; Aberer, Karl (2021). "Ephemeral Astroturfing Attacks: The Case of Fake Twitter Trends". 6th IEEE European Symposium on Security and Privacy. Virtual: IEEE. arXiv: 1910.07783 .
  10. Malbon, Justin (2013). "Taking Fake Online Consumer Reviews Seriously". Journal of Consumer Policy. 36 (2): 139–157. doi:10.1007/s10603-012-9216-7. ISSN 0168-7034. S2CID 153986049.
  11. "Guides Concerning the Use of Endorsements and Testimonials in Advertising" (PDF). Federal Trade Commission. Archived (PDF) from the original on June 9, 2014. Retrieved June 20, 2014.
  12. Foresman, Chris (August 27, 2010). "PR firm settles with FTC over alleged App Store astroturfing". Ars Technica . Archived from the original on October 27, 2012. Retrieved November 7, 2012.
  13. Roberts, Jeff (April 26, 2012). "The ethics of astro-turfing: sleazy or smart business?". Giga Om. Archived from the original on November 21, 2015. Retrieved June 20, 2014.
  14. OUTLAW.COM (December 8, 2009). "EU rolls out astroturf guide for consumer laws". The Register. Archived from the original on November 18, 2018. Retrieved November 10, 2012.
  15. Kolivos, Eugenia, and Anna Kuperman. "Web Of Lies – Legal Implications Of Astroturfing." Keeping Good Companies (14447614) 64.1 (2012): 38–41. Business Source Complete. Web. 10 November 2012.
  16. Mazza, Juliana (October 30, 2018). "Report: Entergy Knew It Was Paying for Actors at Not 1, but 2 Meetings". WDSU. Archived from the original on October 31, 2018. Retrieved from WDSU.com, February 3, 2019.
  17. Walker, Edward (2014). Grassroots for Hire: Public Affairs Consultants in American Democracy. Cambridge and New York: Cambridge University Press. p. 33. ISBN   9781107619012. Archived from the original on April 16, 2014. Retrieved April 5, 2014.
  18. Streitfeld, David (August 25, 2012). "The Best Book Reviews Money Can Buy". The New York Times. Archived from the original on December 31, 2019. Retrieved October 25, 2012.
  19. "Company Settles With State Attorney General Over Fake Online Customer Reviews." Computer & Internet Lawyer 26.10 (2009): 32. Computers & Applied Sciences Complete. Web. 11 November 2012.
  20. Bristow, Michael (December 16, 2008). "China's internet 'spin doctors'". BBC News. Archived from the original on August 7, 2018. Retrieved November 10, 2013.
  21. Ben Smith (August 21, 2009). "The Summer of Astroturf". Politico. Archived from the original on August 23, 2009. Retrieved August 28, 2009.
  22. Sager, Ryan (August 19, 2009). "Keep Off the Astroturf". The New York Times. Archived from the original on December 25, 2016. Retrieved August 26, 2009.
  23. Beder, Sharon (Summer 1998). "Public Relations' Role in Manufacturing Artificial Grass Roots Coalitions". Public Relations Quarterly. 43 (2): 21–3. Archived from the original on July 18, 2015. Retrieved April 23, 2011.
  24. Grandia, Kevin (August 26, 2009). "Bonner & Associates: The Long and Undemocratic History of Astroturfing". Huffington Post . Archived from the original on August 30, 2009. Retrieved November 7, 2012.
  25. Cheng Chen; Kui Wu; Venkatesh Srinivasan; Xudong Zhang (November 18, 2011). "Battling the Internet Water Army: Detection of Hidden Paid Posters". arXiv:1111.4297 [cs.SI].
  26. Monbiot, George (February 24, 2011). "The need to protect the internet from 'astroturfing' grows ever more urgent". The Guardian. London, UK. Archived from the original on February 23, 2011. Retrieved February 24, 2011.
  27. Monbiot, George (September 18, 2006). "The denial industry". The Guardian . London. Archived from the original on April 28, 2012. Retrieved September 14, 2012.
  28. Plummer, Robert (May 22, 2008). "Will fake business blogs crash and burn?". BBC News . Archived from the original on September 20, 2018. Retrieved November 7, 2012.
  29. "Online Journalism Review; August 24, 2004". September 12, 2004. Archived from the original on September 12, 2004. Retrieved August 1, 2011.
  30. "Good and bad reviews: The ethical debate over 'astroturfing'". The Guardian. London. January 9, 2011. Archived from the original on June 29, 2013. Retrieved November 17, 2012.
  31. Friel, Alan (October 2009). "FTC's New Endorsement Guides Call for Policies and Procedures". Wildman Harrold.
  32. "Astroturfing". New Scientist. 193 (2590): 48. 2007. doi:10.1016/s0262-4079(07)60361-3.
  33. Slutsky, Irina (February 24, 2011). "'Organic' SXSW Blogger Buzz? More Like Marketing Astroturf". Ad Age . Archived from the original on October 4, 2013. Retrieved November 9, 2012.
  34. Roberts, Jeff (April 26, 2012). "The ethics of astro-turfing". PaidContent. Archived from the original on April 26, 2012. Retrieved September 10, 2012.
  35. Rockefeller, Happy (February 16, 2011). "UPDATED: The HB Gary Email That Should Concern Us All". Daily Kos. Archived from the original on February 21, 2012. Retrieved November 13, 2012.
  36. Menn, Joseph; Edmund Sanders (August 23, 2001). "Lobbyists Tied to Microsoft Wrote Citizens' Letters". The LA Times. Archived from the original on October 4, 2013. Retrieved November 19, 2012.
  37. "Husin tietojärjestelmän puolustajaa arveltiin keksityksi" [Defender of the HUS information system suspected of being fabricated]. September 13, 2012. Archived from the original on September 15, 2012. Retrieved November 18, 2012.
  38. Ratkiewicz, Jacob; Conover, Michael; Meiss, Mark; Gonçalves, Bruno; Snehal Patil; Alessandro Flammini; Filippo Menczer (2011). "Truthy: mapping the spread of astroturf in microblog streams". Proceedings of the 20th international conference companion on World wide web. New York, NY, USA: Association for Computing Machinery. pp. 249–252. arXiv:1011.3768. doi:10.1145/1963192.1963301. ISBN 978-1-4503-0637-9. Retrieved April 25, 2011.
  39. Ratkiewicz, Jacob; Conover, Michael; Meiss, Mark; Gonçalves, Bruno; Alessandro Flammini; Filippo Menczer (November 16, 2010). "Detecting and Tracking the Spread of Astroturf Memes in Microblog Streams". Proceedings of the 20th International Conference Companion on World Wide Web. p. 249. arXiv:1011.3768. doi:10.1145/1963192.1963301. ISBN 9781450306379.
  40. Ratkiewicz, Jacob; Conover, Michael; Meiss, Mark; Gonçalves, Bruno; Snehal Patil; Alessandro Flammini; Filippo Menczer (July 17–21, 2011). "Detecting and Tracking Political Abuse in Social Media". Proceedings of the Fifth International Conference on Weblogs and Social Media. Menlo Park, CA, USA: Association for the Advancement of Artificial Intelligence. ISBN 978-1-57735-505-2. Archived from the original (PDF) on May 7, 2019. Retrieved August 24, 2011.
  41. Fontanella-Khan, James (June 27, 2013). "Astroturfing takes root; Brussels". Financial Times . Archived from the original on December 10, 2022.
  42. Rosemarie Ostler (September 6, 2011). Slinging Mud: Rude Nicknames, Scurrilous Slogans, and Insulting Slang from Two Centuries of American Politics. Penguin Books. pp. 141–. ISBN 978-1-101-54413-6. Retrieved November 9, 2012.
  43. Wade, Alex (January 9, 2011). "Good and bad reviews: The ethical debate over 'astroturfing'". The Guardian. London. Archived from the original on June 29, 2013. Retrieved November 18, 2012.
  44. "This Is How Big Pharma Wins". New York Magazine. February 21, 2022. Archived from the original on March 15, 2023. Retrieved March 15, 2023. The funding and creation of front groups and astroturf "partnerships" is a ceaseless churn, with outfits activated and retired as needed...Some of these groups are easily identified as astroturf organizations, but industry cash and messaging are also laundered through established national organizations
  45. Kopp, Emily (April 6, 2018). "Patient Advocacy Groups Take In Millions From Drugmakers. Is There A Payback?". Kaiser Health News.
  46. O'Harrow, Robert (September 12, 2000). "Grass Roots Seeded by Drugmaker". Washington Post. Archived from the original on August 28, 2017. Retrieved March 15, 2023. The drugmaker's campaign offers a vivid look at a public relations tactic gaining currency in corporate America: The use of "AstroTurf," or "grass-tops," groups posing as authentic local organizations to promote a product or political aim.
  47. Chibbaro, Lou (August 27, 2019). "AIDS group official resigns over group's acceptance of drug company funds". Washington Blade. Archived from the original on February 12, 2023. Retrieved February 12, 2023.
  48. Givel, Michael (2007). "Consent and Counter-Mobilization: The Case of The National Smokers Alliance". Journal of Health Communication . 12 (4): 339–357. doi:10.1080/10810730701326002. ISSN   1081-0730. PMID   17558787. S2CID   20124171.
  49. Menn, Joseph; Sanders, Edmund (August 21, 2001). "Report: Microsoft funded 'grass roots' campaign". Associated Press. Archived from the original on October 4, 2013. Retrieved November 19, 2012.
  50. Archived at Ghostarchive and the Wayback Machine : Isaac Protiva (January 31, 2018). Proof that Fidelity Communications is behind the Stop City Funded Internet campaign (Video). Retrieved March 10, 2018.
  51. "Stop City-Funded Internet". Archived from the original on January 29, 2018. Retrieved March 10, 2018.
  52. Einenkel, Walter. "It turns out the Missouri grassroots "Stop City-Funded Internet" movement was a cable monopoly". Daily Kos. Archived from the original on July 24, 2019. Retrieved March 10, 2018.
  53. Einenkel, Walter (February 8, 2018). "It turns out the Missouri grassroots 'Stop City-Funded Internet' movement was a cable monopoly". Daily Kos. Archived from the original on July 24, 2019. Retrieved March 10, 2018.
  54. Pulizzi, Henry J. (August 5, 2009). "White House Brushes Off Health-Care Protests". Wall Street Journal . ISSN   0099-9660. Archived from the original on June 21, 2019. Retrieved April 25, 2019.
  55. Cook, Nancy (November 26, 2012). "Billionaire Peterson Sounds Alarm on Deficit". National Journal . Archived from the original on November 30, 2012. Retrieved November 23, 2013.
  56. Confessore, Nicholas (January 9, 2013). "Public Goals, Private Interests in Debt Campaign". The New York Times. Archived from the original on April 16, 2013. Retrieved November 22, 2013.
  57. Crabbe, Nathan. "Using other people's words as your own". Gainesville Sun . Retrieved April 25, 2019.
  58. Nella Van Dyke; David S. Meyer (February 24, 2016). Understanding the Tea Party Movement. Routledge. p. 41. ISBN   978-1-317-00457-8.
  59. Mayer, Jane (August 2, 2021). "The Big Money Behind the Big Lie". The New Yorker . Archived from the original on August 2, 2021. Retrieved September 20, 2021. Rally Forge also created a phony left-wing front group, America Progress Now, which promoted Green Party candidates online in 2018, apparently to hurt Democrats in several races.
  60. Wong, Julia (June 11, 2021). "Revealed: rightwing firm posed as leftist group on Facebook to divide Democrats". The Guardian. San Francisco. Archived from the original on September 27, 2021. Retrieved September 26, 2021. One [Facebook] product manager .. described it as 'a crystal clear example of astroturfing' – deceptive campaign tactics designed to appear as grassroots actions
  61. "Sample ads placed by "America Progress Now"". Facebook . Archived from the original on November 16, 2023. Retrieved March 13, 2024.
  62. Adam Peck (March 14, 2019). "Republicans don't want to curb anti-Semitism; they want to weaponize it". Think Progress. Archived from the original on March 17, 2019. Retrieved March 17, 2019.
  63. Talia Lavin (March 14, 2019). "Why the GOP Isn't Getting the Jewish Vote Anytime Soon". GQ.com. Condé Nast. Archived from the original on March 24, 2019. Retrieved March 24, 2019. it's an operation entirely engineered by conservative flacks, doing its best to masquerade as an authentic grassroots movement.
  64. Jack Holmes (March 12, 2019). "Trump's Patrick Moore Tweet Is Fox News Regurgitation at Its Most Dangerous". Esquire. Archived from the original on March 24, 2019. Retrieved March 24, 2019.
  65. Joshua Davidovich (March 12, 2019). "Right of passage: 8 things to know for March 12". The Times of Israel. Archived from the original on March 27, 2019. Retrieved March 24, 2019.
  66. Greg Turner (January 12, 2013). "Anti-Cape Wind funder blows $19.5M on Osterville estate". Boston Herald. Archived from the original on November 9, 2018. Retrieved November 9, 2018.
  67. Tim Doyle (September 21, 2006). "Koch's New Fight". Forbes . Archived from the original on November 9, 2018. Retrieved November 9, 2018.
  68. Walter Brooks (May 28, 2013). "The men behind those anti-wind farm ads". Cap Cod Today. Archived from the original on November 9, 2018. Retrieved November 9, 2018.
  69. "Kennedys, Kochs help kill planned wind farm off Cape Cod". Fox News. December 4, 2017. Archived from the original on November 10, 2018. Retrieved November 9, 2018.
  70. Katharine Q. Seelye (December 19, 2017). "After 16 Years, Hopes for Cape Cod Wind Farm Float Away". New York Times. Archived from the original on November 9, 2018. Retrieved November 9, 2018.
  71. Lee, Caroline (Winter 2010). "The Roots Of Astroturfing". Contexts . 9: 73–75. doi: 10.1525/ctx.2010.9.1.73 . ISSN   1536-5042.
  72. Valiante, Giuseppe (June 20, 2014). "Feds weigh in on allegations Russia behind anti-fracking movement". Toronto Sun . Archived from the original on March 8, 2018. Retrieved April 25, 2019.
  73. Linnett, Carol; Gutstein, Donald (July 22, 2015). "'Grassroots' Canada Action Carries Deep Ties to Conservative Party, Oil and Gas Industry". The Narwhal. Archived from the original on June 17, 2018. Retrieved April 25, 2019.
  74. "PR firm admits it's behind Wal-Mart blogs". CNN. October 20, 2006. Archived from the original on January 24, 2019. Retrieved November 10, 2008.
  75. Stoff, Rick. "Astroturf-Roots Campaign." St. Louis Journalism Review 36)2 (2006): 12-21. Communication & Mass Media Complete. Web. 11 November 2012.
  76. Patrick, Aaron (April 5, 2007). "Ask.Com's 'Revolt' Risks Costly Clicks". The Wall Street Journal. Archived from the original on October 4, 2013. Retrieved November 18, 2012.
  77. Gross, Grant (August 26, 2010). "FTC settles complaint about fake video game testimonials". Reuters . Archived from the original on December 7, 2015. Retrieved September 25, 2012.
  78. "Laitos-lehti: Keksitty henkilö kehuu Husin tietojärjestelmää" [Laitos magazine: A fabricated person praises the HUS information system]. TS.fi. September 13, 2012. Archived from the original on October 19, 2017. Retrieved November 18, 2012.
  79. Brush, Pete (September 23, 2013). "NY 'Astroturfing' Cases Mark Fertile Ground For Civil Suits". Law360. LexisNexis. Archived from the original on February 24, 2014. Retrieved February 20, 2014.
  80. "A.G. Schneiderman Announces Agreement With 19 Companies To Stop Writing Fake Online Reviews And Pay More Than $350,000 In Fines". New York State Office of the Attorney General. Archived from the original on September 26, 2013. Retrieved February 20, 2014.
  81. MacAskill, Ewen; Cobain, Ian (January 8, 2017). "Israeli diplomat who plotted against MPs also set up political groups". The Guardian. Archived from the original on December 29, 2021. Retrieved December 29, 2021. He also says Robin should not tell other people that the embassy has established the group. 'LFI [Labour Friends of Israel] is an independent organisation. No one likes that someone is managing his organisation. That really is the first rule in politics.'
  82. Sirkes, Sue (February 8, 2018). "American pro-Israel lobby girds for Al Jazeera exposé". Times of Israel. Archived from the original on February 6, 2021. Retrieved December 29, 2021. UK's official media watchdog, Ofcom, rejected a complaint against an earlier Al Jazeera documentary that exposed an Israeli embassy official attempting to influence British lawmakers. Ofcom said the network's reporting, which led to the resignation of Shai Masot, who was filmed plotting to 'take down' British lawmakers seen as unfriendly to Israel, was not anti-Semitic.
  83. Anderson, Nate (March 26, 2010). "280,000 pro-China astroturfers are running amok online". Ars Technica. Archived from the original on July 26, 2019. Retrieved November 7, 2012.
  84. "Persona Management Software. Solicitation Number: RTB220610". Archived from the original on February 23, 2011. Retrieved October 12, 2012.
    "Mirror" (PDF). Washington Post . Archived from the original (PDF) on October 19, 2017. Retrieved August 24, 2017.
  85. Stephen C. Webster (February 22, 2011). "Military's 'persona' software cost millions, used for 'classified social media activities'". The Raw Story. Archived from the original on February 23, 2011. Retrieved February 24, 2011.
  86. Darlene Storm (February 22, 2011). "Army of fake social media friends to promote propaganda". Computerworld Inc. Archived from the original on February 24, 2011. Retrieved February 24, 2011.
  87. Fielding, Nick; Ian Cobain (March 17, 2011). "Revealed: US spy operation that manipulates social media". The Guardian. London. Archived from the original on June 10, 2016. Retrieved November 12, 2012.
  88. Jack Goodman, Olga Robinson (April 11, 2022). "Putin's mysterious Facebook 'superfans' on a mission". BBC . Archived from the original on April 12, 2022. Retrieved April 11, 2022.
  89. "Russian propaganda efforts aided by pro-Kremlin content creators, research finds". NBC News. NBC. June 8, 2022. Archived from the original on June 10, 2022. Retrieved November 24, 2023. Some of the disinformation that we see spread quickly isn't being fact-checked because they're reaching an audience that is deemed to be smaller or less important than that reached by RT and Sputnik, but the talking points are the same and the evidence being presented is the same

Further reading