The next presenter at AoIR 2016 is Adrian Rauchfleisch, whose interest is in digital astroturfing in politics. There have been a number of documented cases of political candidates suddenly picking up substantial numbers of Twitter followers overnight, presumably both because they bought followers themselves and because their opponents created fake followers to embarrass them. There is also a Russian outfit called The Agency, which posts pro-Putin comments, purportedly from ordinary users in the West, on international news Websites; similar pro-China astroturfing has also been observed. Such astroturfing is not the same as trolling: trolls are self-motivated, rather than acting as agents of defined political interests.
Such astroturfing takes a number of different forms, formats, and strategies, but the common element is that it is designed to change the perception of political actors, and that it can have a genuine effect on political issues. The term references AstroTurf, the artificial grass product, in contrast to genuine grassroots online media activity. It is important to research this both to understand the dynamics of such astroturfing, and to find mechanisms for detecting and counteracting it.
Digital astroturfing, then, is a form of manufactured, deceptive, and strategic top-down activity that mimics online communication by autonomous individuals. This may be initiated by governments, political parties, individual politicians, or interest groups; the targets may be the general public or another political actor; and the aim may be to generate support for or opposition to specific political actors and issues. Crossing these four initiator types with the two targets and two aims yields sixteen possible astroturfing configurations (4 × 2 × 2 = 16).
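The sixteen configurations come from a simple cross-product of the three dimensions just described; a minimal sketch (the short labels are my own shorthand, not Rauchfleisch's terminology):

```python
from itertools import product

# Dimensions of the astroturfing typology, as described in the talk
initiators = ["government", "party", "politician", "interest group"]
targets = ["general public", "political actor"]
aims = ["support", "opposition"]

# Each configuration is one (initiator, target, aim) combination
configurations = list(product(initiators, targets, aims))

print(len(configurations))  # 4 * 2 * 2 = 16
```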
Tools include sock puppets, click farms, actual sympathisers, and paid supporters; venues may be social media, Websites, comment sections, and direct messages; and actions could include creating content or signalling (dis)approval. Countermeasures may be restrictive (closing down interactive features) or incentivising (promoting honesty); incentives are generally preferable to restrictions, of course.
Researchers and journalists are increasingly interested in identifying social bots. This is a good start, because such bots are widely used in astroturfing, but they are also the low-hanging, easily identifiable fruit: there are far more sophisticated astroturfing mechanisms that are a great deal more difficult to detect, and that may therefore be much more effective in creating a false perception of support for a particular political position or issue. Importantly, there is a need to work with the operators of venues that are subject to astroturfing, in order to identify astroturfing attempts by using server logs and similar non-public data.