By David Mills
The political arena isn’t the only place where “fake news” is being debated.
Scientists are now speaking out about false information and “alternative facts” that they say are diluting and harming legitimate research.
To be sure, there has always been phony scientific material, from snake oil salesmen to industry-sponsored research to tabloid headlines.
However, experts interviewed by Healthline say the advent of the internet and the popularity of social media have made it easier for fraudulent information to spread.
The topic is worrisome enough that the American Association for the Advancement of Science (AAAS) made it part of its agenda at its annual meeting last weekend in Boston.
“The new media environment has allowed this type of information to be disseminated,” Dominique Brossard, PhD, a life sciences communication professor at the University of Wisconsin-Madison who spoke at the conference, told Healthline.
Types of bad information
False information comes in several packages.
Some are simply outright lies promulgated by people with an agenda.
Others are part of research funded by industry to skew results and opinions.
And yet another segment is questionable research that receives widespread attention.
Some are a combination of these factors.
One of the best examples from the past is the tobacco industry, which for decades funded research that cast doubt on whether cigarette smoking and secondhand smoke were bad for your health.
The sugar industry has also been accused of propping up studies that downplayed the health hazards of consuming sweetened foods and beverages.
“All an industry needs to do is create some uncertainty,” Kevin Elliott, PhD, an associate professor at Michigan State University, who also spoke at the AAAS meeting, told Healthline.
Sometimes the tainted research isn’t easy to spot.
Earlier this month, a study was released that concluded lung inflammation was far less serious in e-cigarette smokers than it was in people who smoked regular cigarettes.
The research was funded by British American Tobacco.
Why would this organization fund a study that had negative results for regular cigarettes? Turns out the tobacco industry worldwide is getting into the e-cigarette market.
Faulty research can also receive widespread attention.
In 1998, a British doctor named Andrew Wakefield published a study in The Lancet that linked autism to the measles, mumps, and rubella (MMR) vaccine.
However, that study only included a small sample size of 12 individuals, and a number of conflicts were eventually uncovered involving Wakefield and his colleagues.
The Lancet retracted the study in 2010, but it is still cited by some anti-vaccination organizations.
In September 2012, a study was widely publicized that linked genetically modified corn and the herbicide Roundup to tumor growth.
The study was retracted in 2013 but then republished in another journal in 2014.
Brossard said these types of studies have led to the creation of a blog called Retraction Watch.
She said the online column reports on 500 to 600 retractions a year.
Spreading the word
The problem isn’t just the questionable research.
It’s also how quickly and widely the information can spread.
Elliott and Brossard note that anyone can have a website in today’s world, and even former Playboy playmates like Jenny McCarthy can position themselves as experts on vaccines and autism.
On those sites, people can post and share whatever material they deem to be worthy and accurate.
In addition, sites such as Facebook can add to the problem.
Those social media sites track what information a person is interested in and feed them more of the same. So, someone who thinks the coal industry doesn’t pollute the air will see more material along those same lines.
Laura Boxley, PhD, director of clinical neuropsychology training, and assistant professor-clinical in the Departments of Psychiatry and Behavioral Health, Neurology, and Psychology at The Ohio State University Wexner Medical Center, said this type of information can be more appealing to those reading it than accurate information.
“Real science is not sexy and fancy. It’s slow and steady,” she told Healthline.
This “confirmation bias” can reinforce and validate a person’s one-sided view.
“There is a lot of danger in accepting only one scientist’s opinion,” said Elliott.
Beyond hardening an individual’s belief, scientific “fake news” can also affect government policy.
Climate change is one high-profile example: the new president has, in the past, proclaimed that the scientifically established phenomenon is really just a “hoax.”
“The consequences are important,” said Brossard.
“Alternative facts in science,” added Elliott, “facilitate alternative facts in politics.”
What can be done?
Experts are urging several courses of action to stop or slow the spread of false scientific information.
First, they say scientists need to do a better job of communicating their research to the public.
Spouting off data and technical terms isn’t going to get it done.
They add that schools should start teaching students in middle school and high school how real science works. That way, they’ll be able to spot phony research when they’re adults.
“Teaching this early builds lifelong skills,” said Boxley.
“This highlights the importance of developing a sophisticated citizenry,” added Elliott.
The experts also urge the country to better fund and better respect places where real scientific work is done.
“We need to double down on our institutions,” said Boxley.
Finally, they urge the public to avoid the temptation to share dubious information on social media.
“It’s really hard to break that echo chamber,” said Brossard.
In addition, she said, search engines such as Google should eliminate debunked research from their systems.
She pointed out that Wakefield’s vaccine study can still be called up.
Institutions, she added, can also monitor the internet and then perform “damage control” if they see incorrect information out there.