Pundit

Understanding Truthiness

How does a post-truth world work? Some psychological findings may be useful. (The Oxford Dictionary definition of ‘post-truth’ is ‘Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief’. The Dictionary labelled it the word of the year for 2016.)

This columnist is greatly perplexed by how, in today’s post-truth world, people hold views which are not true, which may be contradictory, but which are held with a tenacity that belies their falsehood. This is sometimes called ‘truthiness’: the views are believed to be true because they confirm existing beliefs. But that is only a label; what is actually going on? This is a follow-up to some earlier writings of mine (and of others).

To help me understand, I have described below some of the psychological findings on cognitive bias which are important in behavioural economics.

As far as I know, all have been demonstrated in tightly controlled experiments. It is, however, a leap from the laboratory to application in a post-truth world.

Anchoring: The common human tendency to rely too heavily on the first piece of information offered (the ‘anchor’) when making decisions. Once an anchor is set, there is a bias toward interpreting other information around it. Example: A subject is given a random number (say the outcome of a spin of a chocolate wheel) and is subsequently asked to estimate a fact (say the number of countries in Africa). The estimate is affected by the random number. Comment: That means the information one is given may influence one’s beliefs about another matter, even when it is misleading or irrelevant. By choosing seemingly related information, the presenter (often a politician) can influence beliefs even more strongly.

The endowment effect: People ascribe additional value to things merely because they own them. Example: Experiments involve mugs which are given to people at random; subsequently the owners value their mugs more highly than those who did not receive them. Here is a more complicated example, as set out by John Key in John Roughan’s biography: ‘Most people take their profits too early and cut their losses too late. If they buy a house for $500,000 and a month later somebody offers them $600,000, it is human nature to take the money and dine out on their good fortune. Conversely, if they put that $500,000 house on the market and the best offer it brought was $350,000, they would hold onto it. A good dealer would not. As soon as he realised the asset was losing value he would get what he could for it and put the money into a new, hopefully better, investment.’

Framing: When one seeks to explain an event, the understanding often depends on the frame of reference. Example: The way a question about willingness to donate organs is asked affects the responses. Austria has an opt-out provision; Germany has opt-in. Almost all (99%) Austrians agree to donate after their death, but only 12% of Germans do. Another example is a VUW earthquake study in which participants judged ‘1600 dead in 500 years’ to be riskier than ‘a 10% chance of 1600 dead in 50 years’, even though the two are logically equivalent. Comment: Those who structure the reference frames – the way the choices are offered – influence the responses.
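To see why the two earthquake framings are equivalent, compare the expected number of deaths per year that each implies. The arithmetic below is only a back-of-the-envelope sketch of that equivalence; it is not drawn from the study itself.

```python
# Back-of-the-envelope check that the two earthquake framings carry the
# same expected risk.
deaths = 1600

# Framing A: 1600 dead in 500 years.
framing_a = deaths / 500          # 3.2 expected deaths per year

# Framing B: a 10% chance of 1600 dead in 50 years.
framing_b = 0.10 * deaths / 50    # 3.2 expected deaths per year

print(framing_a, framing_b)       # identical, yet A was judged the riskier
```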

Loss aversion: The tendency to prefer avoiding losses to acquiring equivalent gains. Some studies have suggested that losses are twice as psychologically powerful as gains. Example: People judge it better not to lose $5 than to find $5. Comment: In their evaluations of the impact of policy changes, economists value losses against gains at one-for-one. Many policies which look favourable under that trade-off might be unfavourable if the two-for-one ratio were used. Thus trade deals without compensation may not be as favourable overall as is claimed.
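A minimal sketch of the policy-evaluation point, using illustrative figures of my own rather than any actual policy: a change whose total gains exceed its total losses looks favourable when losses are weighted one-for-one, but can look unfavourable when they are weighted two-for-one.

```python
# Hypothetical policy: $100m of gains to the winners, $60m of losses to the losers.
gains, losses = 100, 60

# Conventional evaluation: a dollar lost offsets a dollar gained (one-for-one).
net_conventional = gains - 1 * losses    # +40 -> the policy looks favourable

# Loss-aversion evaluation: losses count roughly twice as heavily (two-for-one).
net_loss_averse = gains - 2 * losses     # -20 -> the policy looks unfavourable

print(net_conventional, net_loss_averse)
```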

The planning fallacy: Predictions about how much time will be needed to complete a future task display an optimism bias, underestimating the time needed. Comment: One hardly needs examples.

Heuristics: Simple rules which people often use to make decisions when optimal decisions would require complicated calculations. These rules work well under most circumstances, but they can lead to systematic deviations from logic, probability or rational choice theory. Comment: Economists tend to assume that heuristics are near optimal. They are not always. Presumably heuristics are more important when one is ‘thinking fast’ (as Daniel Kahneman described it).

Hyperbolic discounting: The tendency to discount the immediate future much more heavily than the distant future. In economic terms it leads to time-inconsistent decisions: that is, with no change in information, the individual changes their decisions through time. Comment: This is worth a column in its own right.
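Even so, a minimal numerical sketch may help. It assumes one common textbook form of the hyperbolic discount function, value = x / (1 + k·t), with k = 0.5 chosen purely for illustration:

```python
# Hyperbolic discounting: the value today of receiving x in t years' time,
# using the simple form x / (1 + k*t). k = 0.5 is purely illustrative.
def hyperbolic_value(x, t, k=0.5):
    return x / (1 + k * t)

# Choosing today: $100 now versus $120 in one year.
print(hyperbolic_value(100, 0), hyperbolic_value(120, 1))    # 100.0 vs 80.0 -> take the $100 now

# The same pair pushed ten years into the future: $100 in 10 years versus $120 in 11 years.
print(hyperbolic_value(100, 10), hyperbolic_value(120, 11))  # ~16.7 vs ~18.5 -> wait for the $120

# The ranking flips as the choice draws nearer, with no new information --
# the time inconsistency referred to above. Under exponential discounting
# (x * d**t) the ranking would never flip.
```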

Prospect theory brings together many of the above phenomena. It is complex, but in essence it holds that people make probabilistic decisions in a systematic way, though not in the way assumed by rational economic man. Comment: Kahneman won the 2002 Nobel Memorial Prize in Economics for his work, with Amos Tversky, developing prospect theory.
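For the curious, here is a sketch of the kind of value function prospect theory proposes, using the commonly cited Tversky–Kahneman parameter estimates (curvature 0.88, loss-aversion coefficient 2.25). It illustrates only the asymmetry between gains and losses, not the full theory, which also reweights probabilities.

```python
# Prospect-theory style value function: outcomes are valued relative to a
# reference point, and losses loom larger than gains.
def value(x, alpha=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** alpha      # losses weighted by lam (~2.25)

print(round(value(100), 1))   # ~57.5   : subjective value of gaining $100
print(round(value(-100), 1))  # ~-129.4 : losing $100 hurts more than twice as much
```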

The following list is of cognitive biases which may not seem particularly relevant to economics, but which may be relevant to understanding what is going on in politics. (They also have relevance in commerce. For instance, over-confident share traders do worse in the long run at making profits.)

Confirmation bias: The tendency to search for or interpret information in a way that confirms one's preconceptions. 

Hindsight bias: Sometimes called the ‘I-knew-it-all-along’ effect; the inclination to see past events as having been predictable and even inevitable, including events that one previously saw as unlikely. The US election result is an example.

Self-serving bias: The tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.

Status quo bias: An emotional bias in favour of the current state of affairs. Comment: Since the state of affairs is continually changing, people tend to adjust too slowly to change.

Belief bias: When one's evaluation of the logical strength of an argument is biased by one's belief in the truth or falsity of its conclusion. Comment: Truthiness.

Some of the above apply strongly to the angries. Most obviously, if there is loss aversion, those who lose from a policy change may be twice as angry as the beneficiaries are pleased.

Second, the endowment effect, loss aversion, hyperbolic discounting, prospect theory and status quo bias suggest that people are inherently conservative. Yet the status quo is being continually undermined; we are often ill prepared to cope with that change.

That is where the political rhetoric comes in. A number of the above ideas – anchoring, framing, confirmation bias and belief bias among them – can be used to shape views in a way which is not necessarily coherent. (While our interest here is political debate, there are few in marketing who do not use these principles.)

Does this advance our understanding of populist reactions? The puzzle remains, though (to adapt the White Queen slightly): how can some people – Trump supporters are an obvious example – believe as many as six contradictory things before breakfast?