Even a very superficial look at our collective life reveals the persistence and even vividness of collective credulity. Why?
The idea that the empire of belief is bounded by irrationality, stupidity, or lack of education is an old refrain in the history of thought. One finds it in Montaigne, in Bacon, and among Enlightenment thinkers, who see ignorance as the source of all credulity. This interpretation sustains the dream of a society freed from the abuses of belief by the light of education in particular. We can concede without discussion that rising levels of education, the massification of access to information and the development of science have helped to eradicate all kinds of misconceptions from the public space. Thus, though our representation of the birth of the universe is still metaphorical, we imagine it more easily as the consequence of a Big Bang than as the result of the separation of two titanic beings, as in the Babylonian Enuma Elish. Yet even a very superficial look at our collective life reveals the persistence and even the vividness of collective credulity. Why did the predictions of Enlightenment thinkers, and of many of those who succeeded them on this point, prove false? Two questions need to be distinguished here: why beliefs persist in general, and why they show such vitality today in particular. Both questions are exciting, but only the second will be the object of this article, which will present some of the mutations of believing, focusing on the way our contemporaries access the information that feeds their conception of the world.
It can be said that the information market in contemporary Western societies has been massively deregulated, in particular since the appearance of the Internet. Consider this for a moment: in 2005, mankind produced about 150 exabytes of data, which is considerable; in 2010, it produced eight times as much! In short, more and more information is being spread, and in such proportions that this is already a major fact in the history of mankind. One might think: there is more and more information available, so much the better for democracy and so much the better for knowledge, which will end up imposing itself on everyone’s minds! This view seems too optimistic. It assumes that in this open competition between systematic beliefs and knowledge, the latter will necessarily prevail. However, faced with such a market-wide offer, one can easily be tempted to compose a mental world that is representative rather than true. In other words, the plurality of the proposals on offer allows us to avoid, at the very least, the mental discomfort often associated with the products of knowledge. The explosion of supply increases both the plurality of cognitive proposals on the market and their accessibility. The least visible and yet most decisive consequence of this state of affairs is that all the conditions are then met for confirmation bias to deploy its full capacity to deflect us from the truth. Of all the cognitive temptations weighing on ordinary reasoning, confirmation bias is probably the most decisive in the processes that perpetuate beliefs. It makes it possible to strengthen all kinds of beliefs, from the most innocuous – like the superstitious fads that can only take root in us because we make the effort to retain only the happy outcomes that seem to have followed a particular ritual – to the most spectacular.
Indeed, it is often possible to find facts that are not inconsistent with a dubious statement, but such a demonstration has no value if the proportion, or even the existence, of the facts that contradict it is ignored.
If this appetite for confirmation is not an expression of objective rationality, it does, in a way, make our lives easier. The strategy of seeking to invalidate a hypothesis is probably more effective if our goal is to seek the truth, because it reduces the probability of considering as true something that is actually false. But as Friedrich (1993) points out, invalidation requires an investment of time and mental energy that can be exorbitant. In essence, social actors accept certain objectively questionable explanations because those explanations seem relevant, in the sense that Sperber and Wilson (1989) have given to this term. When proposals compete, they explain, we opt for the one that produces the greatest cognitive effect for the least mental effort. Since beliefs often offer solutions that follow the natural slopes of the mind, and since they rely on confirmation bias, they produce a very beneficial cognitive effect in relation to the mental effort involved. Once an idea has been accepted, individuals, as Ross and Lepper (1980) show, will persevere in their belief. They will do so all the more easily because the increased and non-selective dissemination of information makes it more likely that they will encounter “data” confirming that belief. A study conducted in 2006 focused on blog readers; not surprisingly, it showed that 94% of the 2,300 respondents consulted only blogs that matched their own convictions. Similarly, purchases of political books on the Amazon site are made, more and more, according to the political preferences of buyers. It is now well known that algorithms, especially on social networks, contribute to creating cognitive insularity in this ocean of information. All this allows us to deduce a theorem of informational credulity, based on the fact that selective search is made easier by the massification of information. It can be stated as follows: the more unselected information there is in a social space, the more credulity will spread.
Beyond confirmation bias, one might wonder what an Internet user without preconceived ideas on a subject that serves as a vector of beliefs would encounter if he used the Google search engine to form an opinion. I tried to simulate how an average Internet user might access the cognitive offer available on the Internet on several topics: astrology, the Loch Ness Monster, crop circles (large circles that mysteriously appear, usually in wheat fields), psychokinesis… These topics seemed interesting to test since scientific orthodoxy challenges the reality of the beliefs they inspire. There is no need here to decide on the truth or falsehood of these statements, but only to observe the competition between answers that can claim scientific orthodoxy and others that cannot (which is why I call the latter, for simplicity, “beliefs”). They therefore offer an interesting vantage point from which to assess the visibility of questionable proposals.
The results are indisputable: if only sites defending favorable or unfavorable arguments are taken into account, on average more than 80% of the first thirty entries proposed by Google on these subjects are “believing” sites. How can this situation be explained? The Internet is a cognitive market that is hypersensitive to the structuring of supply, and any supply depends on the motivation of those who provide it. Believers tend to be more motivated than non-believers to defend their views and to devote time to doing so. Belief is part of the believer’s identity, and he will readily seek out new information that reinforces his conviction. Non-believers will often remain in a position of indifference: they reject the belief, but without needing any justification other than the fragility of the statement they dismiss. This fact is also tangible on Internet forums where believers and non-believers sometimes confront each other. In the 23 forums I studied (all four beliefs included), 211 opinions are expressed: 83 defend the belief, 45 fight it and 83 are neutral. What struck me when studying these forums is that skeptics are often content with ironic messages, laughing at the belief rather than arguing against it, whereas the defenders of the statement marshal arguments of uneven quality (links, videos, copy-pasted paragraphs…) to support their point. Among the posts of those who defend the belief, 36% are supported by a document, a link or a developed argument, whereas this is the case for only 10% of non-believers’ posts. Scientists in general have little interest, academic or personal, in spending time on this competition; the somewhat paradoxical consequence of this situation is that believers, on all sorts of subjects, have succeeded in creating a cognitive oligopoly on the Internet, but also, on certain themes (including risks: GMOs, low-frequency waves, etc.), in the official media, which have now become ultra-sensitive to heterodox information sources.
I do not think it can be said that the Internet makes people dumber or smarter, but its very functioning plays on certain dispositions of our minds and organizes a presentation of information that is not always favorable to orthodox knowledge. In other words, the free competition of ideas does not always favor the most methodical and reasonable thought.
At the same time, the conventional media are now caught in the unbridled competition of the information market. This competition drives a rate of dissemination of information that the dissemination of knowledge cannot always match, because it reduces the time available for verifying information and leads to a mutualization of errors that will pass for common sense. This is particularly evident in the perception of risks, where one can witness, in the health and environmental fields, the spread of an ideology of fear that is not always scientifically founded. Indeed, a health alert issued by an association with good intentions can have adverse consequences, because undoing the alert (when it is unfounded) takes much more time than the media needed to broadcast it. This is particularly the case with the distrust of vaccines spreading around the world, even though vaccines are probably one of the most remarkable contributions of modern medicine to public health. In other words, these conditions give credulity, on certain subjects, a viral advantage. In an information market that is becoming hyper-competitive, those whose job is to distribute information can only survive if they manage to attract attention. In these circumstances, it is not incomprehensible to observe a generalization of cognitive demagogy, i.e. an offer of information increasingly indexed to the nature of demand. Everyone is well aware of living in a post-truth society, and this contributes to a situation of widespread mistrust: distrust of politicians, of the media, of experts and scientists… Distrust of power, in particular, is consubstantial with democracy, as Rosanvallon (2008) reminded us, but in the arm-wrestling match that has begun between the democracy of the gullible and the democracy of knowledge, it reinforces the former rather than the latter.
Bronner, G. (2015), Belief and Misbelief Asymmetry on the Internet, London, ISTE (original French edition 2013).
Bronner, G. (2006), Vie et mort des croyances collectives, Paris, Hermann.
Bronner, G. (2011), The Future of Collective Beliefs, Oxford, Bardwell Press (original French version 2003).
Friedrich, J. (1993), “Primary detection and minimization strategies in social cognition: a reinterpretation of confirmation bias phenomena”, Psychological Review, 100, 2, pp. 298-319.
Rosanvallon, P. (2008), Counter-Democracy: Politics in an Age of Distrust, Cambridge, Cambridge University Press.
Ross, L. and Lepper, M. (1980), “The perseverance of beliefs: Empirical and normative considerations”, in New Directions for Methodology of Behavioral Science: Fallible Judgment in Behavioral Research (Shweder and Fiske, eds), San Francisco, Jossey-Bass.
Sperber, D. and Wilson, D. (1989), La Pertinence. Communication et cognition, Paris, Minuit.