Reading the Daily Me

Nicholas Kristof of the New York Times has written an excellent column, "The Daily Me." I encourage you all to read it (registration may be required).

In it he shows that people generally want only information that agrees with their preconceived viewpoint. Opposing viewpoints are welcome only if they are silly and easily dismissed. Harder to welcome are facts and conclusions that would make us rethink what we believe. Even an established fact is unwelcome if it would endanger our belief structure.

The result, of course, is polarization. And boy oh boy, are we polarized today! We just can't stand others with different opinions, and if someone brings out a real fact that doesn't fit the way we see things .... It is no wonder that we have people calling for violent revolution, for "Second-Amendment remedies," to fight against domestic "enemies." We are so inflamed that we hardly see our neighbors who disagree with us politically as people any more. Dangerous times.

Does the "Daily Me" afflict Huffpost and its viewers, too? Absolutely. No one is immune to it. Editors pick and choose articles based on the direction they want the site to go -- as is their right. But too often the perspectives chosen for publication on a single topic are all the same. It is human nature -- as human as our choosing to read only the articles that agree with our perspective.

Kristof's article is brilliant. Please read it.

But then, now what? Kristof recommends that we "work out intellectually with sparring partners whose views we deplore." I don't think that's enough. Lots of people like to fight; find yourself a good sparring partner and you can get a good fight without anyone being the wiser.

Why do I feel I have something to say about this? Well, as a person who has traveled the road from ultra-conservative to moderate liberal (my view of myself), I have seen the Daily Me at work. I have seen some people change while others do not. I think there are reasons why one person remains captive to the Daily Me while another just might wind up growing.

We all want to assume the rightness of our own opinions -- doesn't everyone? And so we look for information that confirms what we think. The "Daily Me" reassures us that we are right and don't need to change. Conflicting information, though, tells us that change is necessary and right. It is a lot more bothersome. We get irritated. It means we have to think more of others, less of ourselves, and be more interested in acquiring truth than in being right.

So, how do we avoid the Daily Me? How do we go about actually desiring the truth?

Question 1: Are we willing to learn we are wrong about a particular issue or position?

If the answer is "no", then we partake of the Daily Me. If the answer is "yes", then we have to look for other opinions. Liberals -- go to Fox News! Conservatives -- go to MSNBC and NPR and HuffingtonPost! Push down the gag reflex and do it.

Question 2: Given information from a perspective different from our own, do we consider the source a nut job, or do we understand that the person has a reason for their perspective and likely sees us as the unreasonable ones?

If we only see those who differ from us as "nut jobs" then we might be the unreasonable ones. After all, "unreasonable" is how we view (other) people who won't consider another opinion. But if we recognize that experience and reactions to the unknown guide everyone to their opinion (even ourselves), then we just might be a little more gracious.

Question 3: Someone tells a verifiable falsehood. Yet even when presented with evidence to the contrary, they will not change their mind. Is it because they are evil people who love lies, or is it because their perspective has them focusing on a portion of the issue that we might not be looking at?

Without question, there are some evil people out there. However, most people -- even the talking heads on TV -- parrot what they have heard. And what they have heard is probably incomplete information about a subject they know even less about. Thus it is easy to focus on one part of an issue that seems out of whack.

An example would be the "death panels" in the health care bill. The death panels -- not so named, mind you! -- are consultations with patients and family about end-of-life care. People who have degenerative diseases like cancer or AIDS or Alzheimer's need this, because at a certain point, no matter how you treat the disease, the patient will come to a place where they can no longer reason or communicate well, and will need someone to make decisions for them. Living wills are one way this is done. No one is ordered to die. No one is executed. No one's grandmother is going to have the Gestapo pull the plug on them.

But someone called them "death panels" and a wild hysteria gripped certain people. Already dealing with the idea that government was intruding into health care and would be making some decisions, they connected end-of-life care with a kind of end-of-life order, reminiscent of the murder of Jews in Nazi Germany. To them, their fears were reasonable, even if they were not correct in their assessment.

Those who knew the facts could either call the ones who were afraid "nut jobs" or they could understand why they were afraid and try to give countervailing information in a way that others could understand and accept. The second option is much more difficult, and means that one is willing to do some work to reach the core of rationality in others. Yes, we have to believe that such a core of rationality actually exists!

Question 4: Suppose we find we have been mistaken about a point. Do we refuse to admit it because pride won't let us, and the others are really wrong anyway? Or do we admit the point and do the work to revise our opinions with new, more accurate information?

Remember, this is not just information that generally agrees with what we already believe. This is about information that might make a structural difference. This is the scary truth contradicting a key feature we have invested emotional knowledge in.

This is a toughie. There are people who believe that the conclusion is already reached and correct, so data that interferes with that must automatically be wrong. And pride is a big issue. On some issues, people may even believe they are staking heaven and hell on the answers. In that case, fear of losing salvation, or fear of finding out they were never saved to begin with, or fear of incurring God's wrath -- something along these lines -- will keep people from considering anything not held in their particular framework.

Pride, fear, and fear of consequences are what keep people locked into error. Even religious people. People who aren't religious have their own pride issues, but usually the consequences they fear are immediate, not eternal.

If we are willing to change our mind and deal with new information appropriately, we have a chance to avoid the Daily Me. We are willing to learn, not just to assume we know it all. But we have to be careful here! We often see ourselves as reasonable and others as unreasonable when we may be just as stubborn and just as wrong. Religious people often see "lost" or unreligious people as being unwilling to admit they are wrong. But viewed from the position of the unreligious, they are just as unwilling to admit any error.

To avoid the Daily Me, we have to be willing to learn that we can be wrong, to find out where we are wrong, to be charitable toward those with whom we disagree, and to change our own structure of knowledge to accommodate truth when we find it. None of these are easy. All of them are painful. But what we do in these situations tells who we are.