Last updated on 20 Feb 2014
[Apologies this took a while; I’ve been rather sick]
So, given all this [why believers believe silly things, why they believe the particular silly things they do, and the developmental hypothesis of belief acquisition], how can you change a believer’s mind? It is tempting to say that you cannot, or to take a more rationalist perspective and think that more argument is all that is needed, and both views are often put forward. But, as we might expect, the situation is a bit more complex than that.
First of all there are two distinct questions here. One is the individual question: how can we change a particular individual’s beliefs? The other is the communal question: how can we change the overall reasonableness of a given group or population? These are different questions with different answers.
The individual question has no general answer: it depends upon the individual’s belief-set, how coherent it already is, and whether or not they are sensitive to experiential challenges (that is, whether they are in a crisis). A believer who has a relatively well-cohering set of beliefs, with no real internal conflicts of note, and who faces no personal challenge from experience, is relatively immune to rational argument. If they do face empirical challenges (their beliefs do not match the world they are experiencing, as in the classic study of failed millennialists by Leon Festinger and colleagues (Festinger et al. 1956)), one solution is to deny the facts, another is to reinterpret the peripheral or less weighted beliefs to save the core beliefs, and a third is to reinterpret the core beliefs so that they are not challenged by the facts. All three strategies are easily found. Global warming denialists, for example, challenge the facts. Creationists allow some facts but reinterpret them, or the ways they are handled by creationist thinkers. And my favourite case of core reinterpretation is the reaction of the Catholic Church to Daltonian atomism and chemistry: it changed the interpretation of a core belief, the substance in the doctrine of transubstantiation, from a physical reality to a metaphysical one (thereby partly conceding to its Lutheran critics of 400 years earlier).
When these things happen, believers will usually deny that they have happened (Schmalz 1994), like the historical revisionism of Nineteen Eighty-Four, in which the state goes to war with a new enemy and tells its pliable population that “We have always been at war with Eastasia”. These three strategies are increasingly schizoid. Reinterpreting the core beliefs to accommodate new facts is a healthy response to the world, leaving only questions of group identity marking (we do not agree with those Lutherans; they are heretics!). The Church has accepted (belatedly) the scientific work of Galileo, Dalton and Darwin.
The revision of peripheral beliefs is more strained. When [honest] creationists spend time trying to accommodate the facts of biogeography, biodiversity, genetics and dating techniques, they may find their “hypothesis” dying what Flew called “the death of a thousand qualifications”; but so too do defenders of outmoded hypotheses within science, and there is no threshold at which holding those beliefs becomes irrational. Nevertheless, as with pornography, we can recognise irrationality when we see it. The rationalist approach to argument, however, behaves as if there is, or ought to be, a line that one should not cross. This leads to interminable “debates” of claim and counterclaim, which rarely result in any resolution.
The third approach is simply to deny the facts. This can be achieved by adjusting the perceived reliability of those with whom we disagree (ad hominem attacks, for instance, on the probity of climate scientists). Both believers in pseudoscience (like Bigfoot or homeopathy) and in anti-science (such as creationism or anti-vaccination) find ways of calling the facts themselves into question.
Now, as the response becomes less grounded in the empirical, reasoning becomes much more difficult, until a stage is reached where no reasoned argument is possible. But this is determined by the strategies the believer adopts, not by the subject or the belief they hold. Homeopaths can be argued out of homeopathy, while Catholics can still hold stubbornly to the view that the Host really is flesh and blood, and that chemists are just anti-Catholics. So it depends upon the individual. If the core beliefs are cognitively entrenched, they are less likely to undergo any kind of rational or empirical revision. [As a side note, one often hears anecdotally of a believer in homeopathy or some other “complementary medicine” who abruptly adopts empirical medicine when it is their child or loved one who is suffering. This is a very personal crisis. However, a crisis can also drive the believer deeper into the silly belief, as Festinger noted.]
At the group level, however, things are even more complicated. Here what counts includes the institutional structure of the belief-group. The plasticity of the group itself helps determine whether it adapts or digs in further: the more authority-driven the group, and the more exclusionary it is towards those who deviate even slightly from the approved belief-set, the less it will change. Another issue is group size. The Catholic Church, for example, while supposedly hierarchical (indeed, the very term hierarchy, which literally means “rule of priests”, was taken from its military-style structure of command and constraint), has been very fluid in its interpretation of its core beliefs. In large part this is because the Church is not small, and there are many de facto command structures apart from the clerical. The Jesuits, for instance, played a great role in adopting, refining and making scientific acceptance viable within the Church, even as others were pushing for a return to older, conservative beliefs. Christian, Jewish and Islamic doctrine has in various ways been able to adapt to new science and new social conditions (as Harnack showed in great detail in his classic History of Dogma in the late nineteenth century).
But some generalisations can be made. One is that the more a belief-group relies upon authority figures to tell believers what they should believe, the less fluid the tradition. This is, as I argued in the paper on rational creationism [mentioned in the last post], due to a kind of doxastic [that is, belief] division of labour. Most of us have little time to test and become familiar with the technical ideas of science, for instance, and so we rely upon authorities. But which authorities we select depends a great deal upon which belief-group we are in. We choose to believe our authorities over theirs. As I argued, this is because, evolutionarily speaking, they aren’t dead yet. Holding their beliefs may have a cost, but that is offset by the savings in time, effort and resources of taking ready-made ideas off the shelf. We have a disposition to adopt the views of those we grow up around, because it is economical to do so, and adopting those views won’t likely kill us. Only when we reach a crisis state do we challenge those authorities, and even then we tend to do so piecemeal until we reach a (personal) threshold of incredulity.
Another generalisation concerns the degree of engagement we have with the wider society in which our belief-group is located. Even the Plymouth Brethren must deal with teachers, the media, and the popular culture that is right there on the shelf in the bookshop. Messages that conflict with our belief-set can accumulate until they reach another (personal) threshold, at which point they challenge our core beliefs. When that happens, we may undergo a crisis that causes a rapid conversion (or de-conversion) in core beliefs.
This is why one of the major battlegrounds between belief-groups lies in the control and amelioration of these challenges in education. If some doubt about the strength of, say, evolutionary biology can be introduced among younger children, it is rational (in a bounded sense) for them to stick with the core beliefs of their belief-group. Only if evolutionary biology (or whichever other topic is at issue) is presented firmly and without competing beliefs in educational contexts will it begin to undermine the authority structure of the student’s belief-group. As I argued in the creationism paper, sufficient challenges will tend to sway the average developmental trajectory of a believer away from the hard-core or exclusive belief-set of the belief-group. The population as a whole becomes more accommodationist.
This leads to my final point: herd immunity. In vaccination, when a sufficiently high proportion of the population has been immunised, the epidemiology of the disease reaches a point at which the likelihood of infection among the unvaccinated (the very young, for instance) is very slight. Beliefs behave like pathogens here (a metaphor that has been widely abused, in my view). Since we take our belief cues from the social norms we experience, unreasonable beliefs tend to founder when those norms are reasonable ones. This sets up a selection pressure in the evolution of beliefs: beliefs must not be too weird, or they isolate the believer too greatly from the social context in which they live. Sufficient education in reasonable beliefs forces many silly beliefs, or at any rate those that have real-world consequences, to become less silly.
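For concreteness, the standard epidemiological version of this threshold (the textbook formula, not anything argued in this post) can be sketched: if each case infects R0 others in a fully susceptible population, then immunising more than a fraction 1 − 1/R0 of the population means each case infects, on average, fewer than one new person, and the outbreak fizzles.

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune so that each
    infected person passes the disease to fewer than one other."""
    if r0 <= 1:
        return 0.0  # an outbreak with R0 <= 1 dies out on its own
    return 1 - 1 / r0

# A disease where each case infects 4 others needs 75% immunity;
# one where each case infects only 2 others needs just 50%.
print(herd_immunity_threshold(4))  # 0.75
print(herd_immunity_threshold(2))  # 0.5
```

The point carried over to beliefs is the same shape: protection is a population-level property, reached well before everyone is individually “immunised”.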
Anyone who understands population genetics will realise that this does not mean the entire population will become reasonable as such. In genetics and in epidemiology, the mix of beneficial and deleterious variants reaches a stable tradeoff point, called an evolutionarily stable strategy (in game-theoretic terms, a kind of Nash equilibrium). Increase the frequency of one variant and its fitness falls relative to the other, so the two variants remain in a set balance until external conditions change. It is for this reason, for example, that I do not think religion will “disappear”, as many rationalists think it will. There are group benefits to religion, and even in the most secular society, religion as an institution will persist until the costs of being religious exceed those benefits.
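The classic illustration of such a stable mixed balance is Maynard Smith’s Hawk–Dove game (my example, not the post’s): under simple replicator dynamics the population settles at a fixed ratio of the two types no matter where it starts. A minimal sketch, with illustrative payoff numbers:

```python
# Hawk–Dove game: V = value of the contested resource, C = cost of fighting.
# These numbers are purely illustrative.
V, C = 2.0, 4.0

def payoffs(p):
    """Expected payoff to a hawk and to a dove when hawks have frequency p."""
    hawk = p * (V - C) / 2 + (1 - p) * V
    dove = (1 - p) * V / 2
    return hawk, dove

def replicator(p, steps=2000, dt=0.01):
    """Crude replicator dynamics: a type that does better than the
    population average grows in frequency; one that does worse shrinks."""
    for _ in range(steps):
        hawk, dove = payoffs(p)
        avg = p * hawk + (1 - p) * dove
        p += dt * p * (hawk - avg)
    return p

# Whether hawks start rare or common, the population converges on the
# mixed equilibrium V/C = 0.5: neither type can increase in frequency
# without lowering its own payoff relative to the other.
print(round(replicator(0.1), 3))
print(round(replicator(0.9), 3))
```

The equilibrium is stable but not “optimal” for anyone: it is simply the point at which no variant can do better by becoming more common, which is exactly the sense in which silly and sensible beliefs can coexist indefinitely.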
So, in order to ameliorate the supposed evils of religion (or conservatism, pseudoscience, radicalism, etc.), the best strategy for those whose ideas are empirically based is, in my view, to resist attempts to dilute science and other forms of education. This sets up a selection pressure against extremist views. Similar approaches might be taken in what Americans call “civics” classes to deal with political extremisms, and so on.
To conclude, I should make the following point: I am not suggesting that I alone am ideologically pure and coherent in my beliefs. Anything I say in general must apply to me also (this is why one objection to Marxism is that Marx somehow exempts himself from false consciousness). So I assume that I, too, have conflicting belief subnetworks, and one of the reasons I put these thoughts out here is to get the same kind of correction from the wider community that I expect those I have used as examples require. I am a radical (increasingly so as I age), a conservationist, a small-l liberal of the Millian variety, an agnostic, and very, very pro-science. I expect I have more than a few shortcomings of my own. As a friend once said of me, I am like a hunchback who cannot see his own hump but sees everyone else’s. I expect this. But I think this analysis is roughly in the right region.
Festinger, Leon, Henry W. Riecken, and Stanley Schachter. 1956. When Prophecy Fails. Minneapolis: University of Minnesota Press.
Schmalz, Mathew N. 1994. “When Festinger Fails: Prophecy and the Watch Tower.” Religion 24 (4): 293–308.