- Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives
by Philip N. Howard
YALE UNIVERSITY PRESS, 2020. 240 PP.
In Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives, Oxford University professor Philip Howard traces the history of the production of "lie machines," drawing on the latest research in the field of disinformation studies. In his view, politics is best understood as a sociotechnical system in which political actors generate lies that people consume, while algorithms, data sets, and information infrastructure determine those lies' impact. Howard argues that political actors are getting very good at producing big lies, that social media algorithms provide an effective means of distributing them, and that the science of marketing lies to the right audience is improving every day.
Howard defines a lie machine as "a system of people and technologies that distribute false messages in the service of a political agenda" (13). He explains that lie machines have three components: the producers, the distributors, and the marketers. By understanding how they work together, we can devise ways to take them apart or even prevent lie machines from being built.
The book starts with the story of how Russia began its involvement in the distribution of misinformation, even before the well-resourced Internet Research Agency (IRA) came into existence. Howard received the IRA's social media misinformation strategy from an anonymous source, and it provides fascinating, detailed insight into the agency's operations. An experienced researcher in the field, Howard presents findings he shared with Congress at the August 1, 2018, US Senate hearing on the role social media played in the execution of foreign influence operations. He then offers a fresh perspective on how political lies are distributed on unexpected platforms such as the dating app Tinder. He points to a group of young activists who created Tinder bots that tried to persuade users to change their political views in the days leading up to the UK's 2017 general election as an example of how the technical affordances of an app can be exploited by tech-savvy individuals to self-organize and influence politics.
To guide the reader through the complex process of distributing lies, Howard walks us through how two marketing firms, one in Poland and one in Brazil, develop fake political identities and seed misinformation across social media platforms. Another interesting insight comes from Howard's investigation into the impact of the Brexit marketing campaign. He presents an illustrative example of how the Vote Leave campaign used personalized social media ads to target citizens just days before the referendum to win support for Brexit. Vote Leave ran A/B tests of up to 450 different types of ads to see which were most effective in persuading undecided citizens, then measured each campaign's performance across audience segments such as gender and region. This allowed the campaign to continually test and refine its messages and identify the most persuasive ones, those that would influence citizens' votes.
Finally, Howard describes the future of propaganda and argues that the next generation of lie machines, built with artificial intelligence, will play a crucial role in the generation and distribution of political lies. Fake users, for example, will become even more convincing as they are trained to be dynamic and interactive like human friends and family, posing a new and serious threat to democracy. He shows how lie machines are currently built from social media algorithms and junk news content (political news and information that is sensational, extremist, conspiratorial, severely biased, and presented as news); however, as a growing number of consumer products are embedded with small sensors, more and more of our behavioral data will be recorded and hence...