Intelligence is insufficient
January 31, 2014. Posted by shaunphilly in Culture and Society.
Tags: cognitive dissonance, intelligence, perspectives, values, wisdom
Intelligence is a useful quality to have, but it is not enough if we seek things such as wisdom, fairness, or even simply being correct.
I know some pretty smart Christians. I know some people who are smart and yet who still have some pretty dated and conservative views on the world. There are pretty awesome people I meet who react to polyamory unfavourably, and not just as a personal preference. They are able to think, they have impressive cognitive abilities, and yet while talking to them it’s sometimes obvious that they are missing something from their thought process. To the untrained eye, this may look like a lack of understanding, but it may not be that simple. Five or ten years ago, when my eye was less trained, I would have argued with such people and tried to convince them of my position. They’re smart, I’d have thought, and so if I present a solid argument they’ll have to agree with this reasonable belief I have. The problem, here, is two-fold.
First, this presumes I’m actually correct. I may not be correct, and starting as if I am is no help to me or my interlocutor. If I might be wrong, then starting by trying to convince them of my position will not serve greater understanding or intellectual growth, since it will either end in my convincing them of an untruth or in an endless argument in which they, hopefully possessed of the keener eye, are the ones left to see what we are missing. On top of this, there is a cognitive block that occurs when you argue from a position of “I’m already right,” because it prevents listening. While you argue your points, in such cases, it is harder to see the other’s points being made, because our minds will protect our current worldview against dissonant ideas. And really smart people are really good at this worldview-protection, because they can easily and quickly think up rationalizations for why an objection isn’t relevant or right. But by doing this, we miss important facts and perspectives which may be of value to us if we could understand them. You know, just like how you want your interlocutor to think and feel while making your points. Funny how that works.
Therefore, we should start with as neutral a position as possible, and be willing to question every assumption, value, and belief we hold. Also, we should talk to others as if we are willing to do so, because that not only looks more open-minded, but actually is part of becoming open-minded.
Second, it presumes that the difference in opinion is one of mere comprehension, when it very well may not be about comprehension at all. The issue may be a difference in values. A difference in values is much harder to shift, for many of the same reasons generated by the dissonance theory referred to above, and most arguments I’ve heard boil down not to facts, but values. And while I don’t believe that facts and values are fundamentally different ontologically, they are behaviourally different at the very least. That is, a fact is easily proved or disproved, but because a value is part of the process of thinking and behaving, it is harder to see for what it is, and harder to see how easily it can lead us astray from rational behaviour and beliefs.
I believe that a value can be more true than another value (in terms of how it lines up with what goals we share. What goals we should share is another question). A fact is an external reality, or a claim about said reality, which can be checked with empirical and/or logical methods. It is demonstrably testable whether this element has those properties, whether this mathematical proof works, or whether lead is denser than water. A value is a fact which is part of the process you use to evaluate other kinds of facts, and thus is generally out of the line of sight for your intellectual powers. More fundamentally, values are ideas, which makes them physical processes (ontological dualists can exit through the door, as I have no patience for that shit any more), which means they are also subject to empirical and logical methods (although the exact technique for doing such a thing is still quite difficult), and thus values can be measured against reality in a similar way as mere ‘facts.’ I’m willing to submit that values can, therefore, be better or worse than other values. Honesty is better than deceit. Compassion better than harm. And, maybe, the desire for truth is better than the desire for comfort.
Or is it?
Some people don’t care about the truth, in itself. I mean, if you are talking about something as banal and mundane as ‘are you telling me the truth about this drink not being poisoned,’ then people usually care about that level of truth. But what about the willingness to try and learn, grow, and change beyond what is comfortable? What about someone who does not really care what the truth may be, because their faith makes them feel safe and loved? Arguing with such a person about the existence of the supernatural is a wasted effort; they don’t care what’s true. There are smart people who hold such positions, including people that I know and care about. Utilizing intellectual means to try and convince such a person will probably be pointless and frustrating for both of you. They value things differently than you do, and by applying such a method you are attacking the facts rather than addressing their values. You need to appeal to their values, and doing that by intellectual means is hella hard, and often pointless (but I don’t think it’s impossible).
Or, what about a person who has a moral worldview which you find abhorrent, flawed, or merely not moral? I know quite a few such people, and I do not address why I disagree with them most of the time, because our disagreement is not about facts; it’s about a specific kind of value: preferences.
Morality is not a reasoned activity fundamentally, even if we can use reason and science to improve it and clarify the problems raised by morality’s mantle. Morality, especially where it is codified or systematized, is usually (if not always) ad hoc reasoning. That is, we simply have deep preferences, for which we build logical boxes for storage and for hitting our opponents over the head with. Kant, for example, didn’t start from some idealized blank slate of a mind to reach his deontology, his universalization of maxims; rather, he had certain preferences and quirks about his mind that made it feel right to do this and not right to do that, and he created (brilliantly, mind you) a logical scaffolding to shape these brute facts of his mind into a systematized universal standard. I happen to share many of the preferences that Kant seems to have had, so I tend to agree with Kant when it comes to ethics (although I think he was wrong about many other things, like aesthetics). Where I think Kant erred, in terms of his ethical thinking, was in believing that his exercise was a truly intellectual one, rather than one of rationalizing values. The same is true for Bentham and Mill with their versions of utilitarianism, and perhaps even Aristotle with his Nicomachean Ethics (which everyone who is interested in ethics should read, in my opinion).
So, having intellectualized and semantic arguments about ethics is usually completely pointless (not always, mind you). When this type of conversation happens, what we tend to observe is a proxy war for our preferences. The question is not whether my scaffolding is more rationally stable than your scaffolding (I actually really don’t like that game), but whether my preferences themselves actually have better effects on people, and in the right ways, and whether (therefore) I might try to shift my values. All too often, we see something like a person whose preferences are more self/freedom oriented arguing with a person who finds consideration and efficiency more valuable, but they don’t address the values themselves. Instead, it turns into a conversation about what “rights” mean or some other epiphenomenal factor, which is less helpful to everyone and merely puts rhetorical skills on display. It’s like lovers trying to hammer out an intellectual solution to feeling unloved; it’s bound to not really help, in the long run, because what the hurt lover wants is just to be loved (it’s a mistake I’m prone to making).
Intelligence is a great tool, but without perspective it can often be a blunt tool instead of a sharp one. Perspective requires the spirit of not only a skeptic, but an archaeologist of the soul (‘spirit’ and ‘soul’ used metaphorically there, of course. And yes, that’s yet another set of references to Nietzsche). It’s one thing to use rhetoric, logic, and eloquence to find the flaws in the argument of your opponent, but it is quite another to have the courage to take a hammer to your very psychological and emotional bones. And when a person can utilize whatever level of intelligence they have and work toward the character of self-criticism, then that person begins to approach wisdom. Because while we don’t choose our level of intelligence, we have some control (assuming free will is meaningful) over how we use it. The how of our intelligence is more important than its raw power.
Our insecurities will compel us to show off our intelligence. We want respect, love, and friends. And we can get those things if we are (perceived as) smart. But that is all vanity, the neighbour to fear. Fear is the mind-killer, right? And fear has a tendency to create the illusion of confidence, or even to actually create arrogance, where practicing intellectual patience instead might be wiser. Because even if we are right, we still might have something else to learn if we are not so ready to be right that we only swing our intelligence outward while not watching for the parry and counterstrike. Also, it does not do much to make people like us. You may not care about that. I care about that, at least a little. Just don’t make the mistake of allowing your insecurity and fear to make you act in such a way that you tell yourself, after the fact, that you didn’t want people to like you when you really did want them to like you. Because that’s a thing that happens. Again, it’s called cognitive dissonance, so read about it.