Sunday, April 16, 2017

Truth and Rallying


“The best lack all conviction, while the worst are full of passionate intensity” -- Yeats, “The Second Coming”

A new study suggests that “intellectual humility,” defined as the serious practice of the idea that you could be wrong, is one of the most important traits of people who make good decisions.  

The finding makes sense; being open to new evidence, even if it’s contrary, is a key part of learning.  When we stop learning, we freeze our abilities at a certain point, but the world keeps moving.  

In a sense, intellectual humility strikes me as the everyday equivalent of the scientific method.  You make the best call you can at a given moment, knowing full well that new information may come along later that will change your view.  Keeping an eye open for that kind of information makes it likelier that you’ll avoid barreling headfirst into an iceberg.

But intellectual humility is often an awkward fit, at best, with the styles of leadership to which many people respond.

They respond to tub-thumping certainty.  They like clear, simple, confident rallying cries.  They perceive changing positions -- if they notice -- as a sign of corruption, hypocrisy, or weakness.  They want answers, and they identify leaders with the answers those leaders give.  

In other words, a certain kind of follower rewards either dishonesty or shallowness in a leader.  The very trait likely to lead to better decisions can carry a direct political cost.

Some leaders lack intellectual humility altogether, so for them, the conflict is external.  They keep wondering why the world frustrates them.  You can spot them by their remarkable lack of self-awareness.

Some, like the younger George Wallace, consciously choose closed-mindedness specifically because of its political payoff.  When the political math changes, you can always declare that you suddenly see the light.

Others resolve the tension through charisma and/or patronage.  If you’re likeable enough, you may be able to charm your way through some strategic pivots.  I think of that as the Reagan strategy, named after its master practitioner.  (If you prefer, you could say something similar of Bill Clinton.)  If you can charm or buy your way out of the political downsides of shifting positions, then you can respond to the world as it changes.  Nixon can go to China.

The ones I respect, and try to emulate, are the ones who split the difference between means and ends.  Moral positions can be strongly held and effectively irrefutable.  Methods of achieving those ends are contextual, and therefore subject to change.  In the context of community colleges, for instance, I see broad access, high quality, and a commitment to equity as non-negotiable.  If you don’t embrace those, you shouldn’t work here.  But the ways of bringing those to fruition are subject to change, whether by external context or by conscious experimentation.

The experimental ethic can be a difficult sell.  Too many people, when presented with a “what if…” scenario, immediately default to the need for absolute certainty.  It’s a fear-based response, and a potentially deadly one. And some demagogues will consciously stoke that fear for their own purposes. The challenge of thoughtful leadership is getting people past that.

It’s just hard to split differences when people are scared and looking for certainty.

Wise and worldly readers, have you seen that move done well?  Alternatively, is there a more effective way to rally the troops while maintaining intellectual humility?