Politicians increasingly abandon facts and data and bend the truth
By Teodor Teofilov
When democracy is in crisis, we look for truth in facts. When voters are manipulated and politicians refuse to answer basic questions, facts are our only lifeline. However, they are losing their ability to unite us around a consensus.
PolitiFact found that around 70 percent of the fact-checked statements Donald Trump made during his presidential campaign fell into the categories “mostly false,” “false” and “pants on fire.” Nonetheless, he proved himself a formidable political force, defeating Hillary Clinton and becoming president of the United States.
During the Brexit campaign, the Leave camp held firm to its claim that the UK was losing £350 million ($468 million) a week to its membership in the European Union (EU). The figure ignored the money flowing back into the domestic economy through the benefits of belonging to the Common Market. Nevertheless, Britain has triggered Article 50 and will soon leave the EU.
As the battle for votes intensifies ahead of the 2018 midterms, the use of facts in public debate has grown exponentially. We place high expectations on statistics and expert opinion, and in doing so we overload them. Instead of standing outside the scope of political debate, facts have become one of its main rhetorical weapons. How can we still talk about “facts” when they have long since stopped reflecting reality?
The problem is that the number of experts and organizations manufacturing facts has grown. A large portion of them work on commission for their clients. If you are looking for a sociologist who can support your view, all you need is enough money and you will find one. A perfect example is Cambridge Analytica, which used Facebook data to profile over 80 million US citizens. In this case, the company didn’t manufacture “fake” facts; instead, it built a profile of the American electorate for the use of private clients, such as the Trump presidential campaign.
The combination of populist movements and social media is often named as the main suspect in the rise of a politics where truth is meaningless. Individuals can set the rules for the media content they absorb and tailor it to their own beliefs. This often reinforces those beliefs, since people stop getting information from sources with different points of view. Populist leaders actively encourage this model, as it opens the door to disinformation and the bending of truth. But if we blame only the latest and most outrageous abuses, we would be wrong: the power of facts has been in decline for some time now.
The real problem lies in the oversupply of facts in the 21st century. There is an overabundance of sources and methods, and the credibility of information is deeply unreliable, with everything depending on who financed the study and how the shocking statistics were cherry-picked.
One of the most blatant examples of disinformation is a single study, conducted in violation of scientific methods, that led many to believe there is a connection between vaccinations and autism. In 1998, Andrew Wakefield, M.D., along with 12 co-authors, published a case series in The Lancet claiming to have found evidence, in many of the 12 children they studied, of measles virus in the digestive systems of children who had exhibited autism symptoms after MMR vaccination.
Most of the co-authors later retracted the study’s conclusions, and in 2010 The Lancet formally retracted the paper itself, after extensive studies over the intervening 12 years found no merit in Wakefield’s claims. In television interviews in 2004, Dr. Richard Horton, then editor of The Lancet, said that Wakefield’s research was “fatally flawed.”
Mary Poovey, an American cultural historian and author of “A History of the Modern Fact,” argues that the trend of describing society in numbers, percentages and sums dates back to the Middle Ages, when accounting came into being. Bookkeeping in trade presented a new type of self-sufficient “truth”: it required neither interpretation nor blind faith on the part of the reader.
William Davies, an associate professor in political economy at Goldsmiths, University of London, and the author of “The Happiness Industry: How the Government and Big Business Sold Us Well-Being,” wrote in the New York Times that the 20th century laid the basis for a new business — the business of facts. Market research companies began surveying consumer attitudes in the 1920s, with public opinion polling following soon after.
The term “think tank” emerged during the Second World War as military jargon for a secure place where plans and strategies could be discussed. Its meaning began to shift in the 1960s, when it came to be used in the US to describe private nonprofit policy research organizations. These organizations apply statistical and economic methods to shape new policy, usually in favor of one platform or another.
According to Davies, the idea of evidence-based policy, popular among liberal politicians of the late 20th and early 21st centuries, elevated economists to the role of assessors of government programs in an era dubbed “post-ideological.” Of course, the notion of “facts” doesn’t refer only to numbers. It implies a kind of knowledge that can be reliably conveyed to the public without the need for constant updating or interpretation.
Between a society of facts and a society of data
The world is currently in a transition period that has blurred the meaning of knowledge and numbers in social life. It produces the inescapable feeling that truth itself is disappearing. But how did this shift happen?
To find out, we need to start with the invasion of “smart” technologies and the internet into everyday life. Thanks to the smartphones in our pockets, the social media revolution, the rise of online trade in goods and services, and the deployment of sensors in public spaces, we are generating an enormous amount of data.
Like statistics and other traditional facts, this data can be expressed in numbers. The difference is the unprecedented volume, and the fact that it is collected constantly and by default, not through any expert methodology. Data is generated at such a rate that it is difficult to keep track of it. Every website stores vast amounts of information about each visitor. That personal data footprint, left online by every internet user, can be used to infer people’s behavior and opinions.
Facts are supposed to guarantee the resolution of disputes between warring camps and to simplify problems. Parties can argue for or against right-wing economic policies, but if both agree that GDP increased by 2 percent in 2017, or that unemployment has fallen to an all-time low of 4 percent, then there is at least one shared, stable reality that is beyond dispute.
Data, unlike facts, is meant to detect shifts in public opinion. Analyzing content on Twitter, Facebook or any other social media platform can provide real-time information about public opinion of particular politicians — a method known as sentiment analysis. A similar method is studying audience reactions in real time while a televised debate between presidential candidates is under way.
Financial markets reflect changes in traders’ perceptions. Stock exchanges have never given “factual” estimates of a company’s real value in the way accounting can. They simply represent a momentary snapshot of the perceptions of thousands of people worldwide.
Journalists and politicians cannot afford to undervalue this constant check on public perceptions, any more than managers can afford to ignore fluctuations in their company’s share price. If the British parliament had put a bit more effort into gauging public opinion on the EU, instead of repeating the same facts about the benefits of membership, the Remain campaign might have been entirely different — and possibly successful.
Is it possible to live in a world of data without facts? Think about how you treat a weather forecast: we understand that the prediction of 25 degrees Celsius (77 degrees Fahrenheit) on Sunday isn’t a fact, and that the number may change. Weather forecasting works much like sentiment analysis: it collects masses of data from sensors, weighs it against historical analyses of similar data, and translates it into a constantly updated story about the near future.
The prospect for politics is worrisome. When numbers are treated as indicators of current mood rather than of reality, how can we reach a consensus on the causes of social and economic problems — let alone on their possible solutions?