The ongoing blog battle over whether to be a nice skeptic or a dick continues. Phil Plait gave a speech suggesting that niceness works better. There was blowback, of course, which he discusses here and here. The Great Tone Debate seems to resolve down to those who think that minds are changed mostly by civil debate, and those who think that one ought to be a dick, and be angry and outraged. I name no names. If you know what I’m talking about, then you know who I’m talking about.
But is it even important? I fully agree with what Phil said, but does reason work?
Consider the question Phil asked: what is our goal? Why are we being pro-science and critical skeptics?
So far as I am concerned, I want our society to allow science and reason, without their being constricted by the beliefs of others and without forcing people into them. The reason I don’t want to force people to be reasonable is that it would be like forcing someone to have morally pure thoughts: it is something of a contradiction in terms. People will be reasonable if they are shown how, and if they (more or less reasonably) estimate that it is in their best interests. Simply asserting, in the classroom or in the media, that reason is the One True Way to Enlightenment fails to get over the threshold and out the door.
So, if that is my goal, how do we achieve it? I think Phil is right that you will convince more people by a steady and polite response than by calling them idiots and fools (“When two principles really do meet that cannot be reconciled, each man calls the other a fool and a heretic”: Wittgenstein in On Certainty), but I take issue with one of his and many others’ major premises: that we can even reason people out of their false beliefs about the world.
This is not some postmodern claim, and it is not a claim that people are irrational panicky animals, contrary to Z in Men in Black. Sure, sometimes they are, but that’s not what we are talking about. We are talking about how beliefs are formed, and how to make them conform to the world as best we can. Dickness fails the test – you may have the right to get angry and abusive, and others may lack the right not to be offended (I agree with both of these, by the way), but is it a good bit of practical wisdom? How are beliefs formed?
We have run this debate on the assumption, that very Enlightenmental assumption, that people are swayed by reasons and dialectic, and this is just not the case most of the time. The alternative is that people can be swayed by rhetorical flourish, and while this is true, it is equally true for the nutbars as for the skeptical or informed; indeed, that is the very reason why antivaccination, global warming denialism, and other such anti-realist views are so popular (as documented in Oreskes and Conway’s Merchants of Doubt). So we want to use rhetoric sparingly and rely instead on the critical reasoning that produces science and eliminates woo.
But beliefs are not in general formed from reason. On the other hand, it is my observation that rhetoric serves only to organise those who are already gathered about that banner, perhaps at best tipping those who are already inclined in a particular direction to slide into the sandtrap. Both reason and rhetoric affect attitudes, but I think they are poor ways to influence basic beliefs. Once you are already a skeptic, or a believer, they affect which particular beliefs you may own, but your core beliefs are not arrived at that way.
Instead, beliefs are acquired as you develop and become enculturated. I have a paper in Synthese in which I argue that due to constraints on reasoning, people employ some dispositional heuristics to decide what to believe. Fundamentally, we are born with these “fast and frugal” heuristics because none of us have the time to reflect on everything carefully and to any semblance of rational equilibrium. So we do a few tricks. Now there are two broad fields of tricks we do. One is tricks for dealing with our ecology: things that stop us from dying or failing to reproduce. Call these ecological heuristics; they are the kinds of things that Gigerenzer and Todd study in the last link. The heuristics include: “take the best [example]”, “recognition”, and using simple cues to reduce the options rapidly. This is something that allows us to negotiate the physical world. For example, we should “take the best” from our social context, because, quite simply, those we are emulating do well.
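The “take the best” idea can be sketched in code. What follows is a hypothetical illustration of the Gigerenzer and Todd style of heuristic, not their actual model: the cue names and the toy data are invented for the example. The point is that the first cue that discriminates between two options decides, with no weighing-up of all the evidence.

```python
def take_the_best(option_a, option_b, cues):
    """Pick between two options using the first cue that discriminates.

    `cues` is a list of yes/no questions (functions), ordered from most
    to least reliable. We stop at the first cue on which the options
    differ -- fast and frugal, rather than exhaustively rational.
    """
    for cue in cues:
        a, b = cue(option_a), cue(option_b)
        if a != b:  # first discriminating cue decides outright
            return option_a if a else option_b
    return option_a  # no cue discriminates: just guess


# Toy demo in the spirit of the classic "which city is larger?" task.
cities = {
    "Munich": {"capital": False, "team": True, "airport": True},
    "Herne":  {"capital": False, "team": False, "airport": False},
}
cues = [
    lambda c: cities[c]["capital"],   # is it a capital city?
    lambda c: cities[c]["team"],      # does it have a major football team?
    lambda c: cities[c]["airport"],   # does it have an airport?
]
print(take_the_best("Munich", "Herne", cues))  # -> Munich
```

Note that the heuristic never consults the third cue here: the “football team” cue already discriminates, and that settles it. That economy is exactly what makes such tricks cheap enough to run constantly.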
But there is another domain to which we have to adapt our beliefs: the social domain. We use social heuristics to decide what to believe in a social context, and here the heuristics are less reliable when they are neutral with respect to our survival and reproductive capacities. We tend to believe what everyone around us believes, for example. Why? Because they are not dead, and people who are not dead are unlikely to impart beliefs to you that will make you dead. Moreover, since a leading cause of being dead is other people who disagree with you, it pays to believe what those around you believe.
This is a way of farming out the doxastic labour to specialists, a division of cognitive labour. (“Doxastic” is philosophese for “opinion-related” or “belief-related”.) It works for ecological rationality – you take the bow maker’s expertise for granted when learning to make a bow. They have centuries of tradition behind them, and so their bows will be much better than yours are likely to be. This means better dinner, more often.
But when the lack of deadness is not correlated with the beliefs, when beliefs are so distant from everyday exigencies that one can carry quite a load of false beliefs, then the social cues fail to provide true beliefs. My life is not going to depend on the correctness of the Copernican theory, for example. Take this passage from A Study in Scarlet:
His [Holmes’] ignorance was as remarkable as his knowledge. Of contemporary literature, philosophy and politics he appeared to know next to nothing. Upon my quoting Thomas Carlyle, he inquired in the naivest way who he might be and what he had done. My surprise reached a climax, however, when I found incidentally that he was ignorant of the Copernican Theory and of the composition of the Solar System. That any civilized human being in this nineteenth century should not be aware that the earth travelled round the sun appeared to be to me such an extraordinary fact that I could hardly realize it.
“You appear to be astonished,” he said, smiling at my expression of surprise. “Now that I do know it I shall do my best to forget it.”
“To forget it!”
“You see,” he explained, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things so that he has a difficulty in laying his hands upon it. Now the skilful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.”
“But the Solar System!” I protested.
“What the deuce is it to me?” he interrupted impatiently; “you say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.”
Of course, most of us know things that are not of direct use and import, but typically we pick them up as part of our cultural traditions, not from any reasoned deliberations. While our brains may not be rigid timber rooms as Holmes describes, the cost of populating those rooms is inelastic. We develop our impractical ideas through cultural osmosis. We grow into them.
Our social heuristics include a number of strategies that reflect our social biological nature. One is related to “take the best” – I think of it as “follow the famous”. High status individuals have access to resources we ordinary low status peons do not, so you should emulate whatever you can of them. Who knows but carrying a small dog in a handbag may be the route to status. Another is “accept the authorities”. By this I mean that you should believe whatever the most respected figures in your social context believe to be the case or the right thing. This is a variant on the “not dead” rationale, but it also means that if the high status individual is high status in virtue of having succeeded doxastically, you improve your chances by copying them.
But one heuristic trumps everything in both ecological and social rationality: personal experience. Call this the “I believe it when I see it” heuristic. A simpler term might be “learning from experience”. No matter who tells me what, if I have experience to the contrary I am going to let it carry the day. The trouble is that little of my experience will tell one way or the other for the majority of socially derived beliefs.
So, I argued in my Synthese paper, someone who lacks experience early on will tend to invest, so to speak, in the consensual views of their immediate social context and community unless they experience things to the contrary. It’s a good conservative strategy for forming beliefs. Once you have your belief set, or basic beliefs, or doxastic values, however you want to characterise them, reason will run up against a wall and rhetoric will tend to reinforce it; hence Phil’s problem.
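The conservative strategy just described can be put as a toy model: defer to the consensus of your immediate community unless personal experience says otherwise. This is an illustrative sketch of my own devising, not the model from the Synthese paper.

```python
def form_belief(community_beliefs, personal_experience=None):
    """Toy model of conservative belief formation.

    Personal experience, when available, trumps testimony
    ("I believe it when I see it"); otherwise adopt whatever
    the majority of the immediate community believes.
    """
    if personal_experience is not None:
        return personal_experience  # experience carries the day
    # No relevant experience: defer to the local consensus.
    return max(set(community_beliefs), key=community_beliefs.count)


# With no contrary experience, the consensus wins...
print(form_belief(["flat", "round", "round"]))         # -> round
# ...but direct experience overrides any amount of testimony.
print(form_belief(["flat", "flat", "flat"], "round"))  # -> round
```

The second call is the interesting one: however uniform the community’s view, a single contrary experience settles the matter for the agent. That asymmetry is why experiential learning, discussed below, matters so much.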
What to do? I argued in the paper that the only real solution is to ensure that in education, particularly early education, we encourage experiential learning rather than factual learning, at least in science. If you spend your time making things blow up reliably, and finding out first hand what the structure and properties of things are (ordinary, everyday things are best, because that will undercut intuitions derived from social consensuses), then you will come to accept the reliability of empirical learning and experimental study, so that when it is challenged from an ideological perspective, the student will be less inclined to fall back on “follow the famous” and “accept the authorities”. If Jenny McCarthy (both famous and an authority for that community) tells you that vaccines are dangerous, and you have been given experimental reason to think that they aren’t, then you will not farm out doxastic tasks to the wrong “specialist”.
I think that we should not teach scientific facts as such until quite late. Let kids name parts of organisms by direct observation. Let them mix chemicals under supervision so they make small explosions (few chemists will have been inspired by colour changes, and many will have been inspired by making things go “boom!”, I warrant). Start teaching them the factual stuff only when they have already come to trust the scientific process.
So, as the religious, doubt manufacturers, and conspiracy theorists know intuitively, the real battlefield is education, not the media. If you can dampen down experiential learning, all the public debate and discourse resolves down to duelling authorities, and competing communities. Evolution, medicine, astronomy and so forth all become “just another religion”. I sometimes suspect that the drive to general curricula, and the subsequent uniformity of teaching to the tests, is a clever ruse to prevent students from actually experiencing science and learning in general.
The real debate is over who gets to control, or not control, education, just as the real debate over morality is about who gets to determine who can mate with whom. And therefore I think that the most pressing goal of skeptical thinkers and the pro-science movement is to ensure that education is not interfered with or controlled by special interests and lobbies. And that debate must happen in the public sphere, urgently. And yes, our only weapon there is reason. You can’t convince a creationist that evolution happens through simple recitation of facts and argument, but you may be able to ensure that those who set the law and policy for education come to see, through argument, that biases in education will have bad outcomes.
The extremism of modern debate is dangerous. It tends to make the options impossible and irrational. I am very concerned about this post on how the irresponsibility of both conservative and liberal commentators in part contributed to the failure of the Weimar Republic. It can happen again.
Oh, and as to Tone, I think it is just good manners to treat your discussant with politeness and respect, until they get nasty. And then you have the right to walk away. Being a dick is something that you only need to do when you do not have that option.