Last year, Senator Tom Coburn published a report entitled "Under the Microscope," in which he criticized the National Science Foundation (NSF) for funding any research whose importance he couldn't immediately grasp. Coburn's report stands out for its willful ignorance: it caricatures research in a way only possible for someone unwilling to read a single page of the science he attacks. The report is also boring. Knowledge-phobic descriptions of supposedly useless (but actually important) science are an old trick among politicians, and they are becoming repetitive as a growing sector of the American public develops a terrifying aversion to scientific evidence. Coburn in this case is playing the cheapest possible note to a polarized crowd.
Coburn's report is wrong and infuriating; that much has been discussed by others elsewhere. What I think is woefully missing from the conversation is what scientists should do about it. The consensus, as I've experienced it as a researcher, is (1) that ignorant political attacks will not affect our ability to get work done, and (2) that it is not our job to help the public understand our work -- that such outreach is unscientific, because it requires us to mischaracterize research. I think both claims are wrong, and potentially dangerous to the future of science.
First, political attacks now more than ever present a huge risk to science. It's true that Coburn follows a long line of politicians passing judgment on research they refuse to comprehend: Senator William Proxmire's similarly uninformed Golden Fleece award dates back to the '70s. Over the decades, political criticism has annoyed scientists, but rarely affected their work or funding; researchers often take this history to mean that such attacks should be shrugged off instead of worried over. But these are different times: the last decade has seen a tidal shift towards regressive anti-scientism. Belief in evolution is a strong litmus test for the public's understanding of science, and the percentage of Americans who roundly reject its premise (instead claiming that God created humans in our present form) is higher now (51%) than it was in 1982 (44%). In fact, rejection of science is becoming a surreal badge of authenticity among politicians such as Michele Bachmann and Rick Perry.
All this makes for a dangerous time for scientific progress, and particularly for behavioral and social sciences (in essence, the "human sciences" that focus on people's experience and actions) including my own field, psychology. This is because human sciences focus on topics -- social networks, emotion, memory, decision-making, race relations -- that, to the lay public, sound less scientific than cellular structure or electromagnetic force. People often feel as though they understand their minds (but not physics) already, and that the study of people and cultures can't tell them anything new. Although this belief couldn't be farther from the truth (more on this later), it is a real and frightening risk to the human sciences. Indeed, following his report, Coburn proposed completely eliminating NSF's funding for human sciences, citing just this type of thinking: "...do any of these social studies represent obvious national priorities that deserve a cut of the same pie as astronomy, biology, chemistry, earth science, physics, and oceanography?" This opinion was echoed by Mo Brooks, the chair of a Congressional panel considering such cuts, who explicitly claimed that the human sciences have yet to prove their worth.
This perception could hardly miss the mark more. The human sciences' rigorous study of cognition and behavior often produces results that run completely counter to most people's intuitions. In fact, a broad message emerging from the last 50 years of psychological research is that many of our most critical mental operations--moral judgments, preferences, and the like--are not only driven by forces outside of our awareness, but also that we stubbornly refuse to acknowledge these forces. Instead, we come up with stories explaining our behaviors, leaving their true sources hiding à la Donald Rumsfeld's unknown unknowns.
Nor is this insight purely academic. The human sciences' findings about the sources of our behavior can overwrite useless and often damaging assumptions and change policy. Here are two brief examples:
1. Insights from psychology can reform social programs: Oftentimes, social programs labor under misguided intuitions about the psychological sources of healthy behavior. Consider conformity, which has gotten a bad rap for over a century. The party line is that conformists are weak, and that their combined lack of backbone leads to everything from witch-hunts to financial bubbles to teenage smoking. Programs such as D.A.R.E. (Drug Abuse Resistance Education), the long-running curriculum led by police officers, emphasize that resisting peer pressure is critical to living a drug- and violence-free life. Youth voting campaigns have also appropriated this intuition, and encourage their audiences to buck the sorry trends set by their peers. These strategies frame healthy behavior as an individualistic step away from the crowd -- and they rarely work. Instead, research by social psychologist Robert Cialdini, political scientist Alan Gerber, and others shows that a more efficient strategy is to frame positive behaviors such as voting and responsible use of energy as something that others are doing, and harness the power of conformity to encourage such behaviors. This insight suggests critical changes to several large-scale programs. D.A.R.E., for example, receives huge amounts of government funding, and--more importantly--has reached tens of millions of children in the U.S. alone, despite little evidence that it does any good, and some evidence that it does harm. Simple changes inspired by the human sciences could vastly improve the efficacy of such programs.
2. Behavioral research can improve education: Motivating children is among educators' most important jobs. Our culture approaches this job through the intuition that behavior is best motivated through reinforcement. We pay people for work, give children prizes for high test scores, and honor charitable donors, with the assumption that these external validations will make people try harder and enjoy their work more. Although most of us wouldn't want to stop being paid, my colleagues Mark Lepper and Carol Dweck at Stanford describe ways in which particular forms of praise can backfire. Lepper showed that praise in the form of rewards can "overjustify" otherwise enjoyable activities: if I like math and you pay me for doing it, I will eventually conclude (perhaps implicitly) that I am only doing it for external rewards, and as a result will enjoy it less. Dweck showed that certain forms of praise can further induce a problematic "fixed mindset": the idea that intelligence is fixed at birth rather than something that can be developed. If you tell me that I am good at math, I may start to believe that aptitude at math is a stable trait, and that my innate ability means that math will always come easily to me. When I face new challenges -- say, moving from arithmetic to algebra -- I might read my initial difficulties as a threatening sign about my innate abilities. Instead of piquing my interest, more difficult work may cause me to decide that I am no longer a "math person," and give up on the subject. Dweck has developed simple methods for encouraging people to adopt healthier ways of thinking -- for example, praising children for effort rather than skill or ability -- which produce clear benefits in children's long-term motivation to learn. But these methods must be understood in order to be put to effective use.
Although Dweck and others understand the potential value of such human-science-inspired policy, the vast majority of our educational system has yet to absorb these insights.
Part of the fault lies with us, the researchers. Psychologists have told me time and again that it is not our job to communicate the relevance of our work to the public, and that such communication requires watering down our findings or dangerously prioritizing pop appeal over deeper truth-seeking. I disagree. Such communication is our job, inasmuch as professionals hoping to be funded by the public should be able to explain, in broad terms, the importance of their work. This is even more vital for the human sciences, which run an especially high risk of being misunderstood.
More importantly, the human sciences in many cases require the public's understanding before they can actually change our lives for the better. People do not need to believe in propulsion physics for NASA to launch shuttles; people do not need to understand drug action for their medicine to be effective. But the human sciences' best bet at improving lives comes from changing people's behavior, and for that to happen, people have to understand and trust these fields' findings. To move beyond evidence-ignoring policies like Rick Perry's failed abstinence education programs, and replace them with workable ways to improve health and society, the public must first believe that psychology and its sister fields can provide such improvements. Stoking that belief is among our most important work as researchers in the human sciences.