As science advances, bad science does, too. The World Economic Forum has called widespread digital misinformation -- which largely spreads via social media -- one of the main threats to our society.
While online resources make it easier for people to learn about incredible scientific discoveries, social media also facilitates the proliferation of pseudoscience, scientific skepticism and conspiracy theories.
Now, scientists are coming to a better understanding of exactly how this digital information flow works. Research published this week in the journal Proceedings of the National Academy of Sciences maps out the factors that influence the spread of scientific misinformation and skepticism within online social networks -- and the findings are disturbing.
"Our analysis shows that users mostly tend to select content according to a specific narrative and to ignore the rest," Dr. Walter Quattrociocchi, a computer scientist at the IMT Institute for Advanced Studies in Italy and one of the study's authors, told The Huffington Post in an email. Users are driven to content based on the brain's natural confirmation bias -- the tendency to seek information that reinforces pre-existing beliefs -- which leads to the formation of "echo chambers," he said.
In other words, people on social networks such as Facebook or Twitter connect with others over common interests. They "friend" and follow those with similar values and beliefs and filter out anyone who disagrees with them.
This behavior builds "echo chambers," where people expose themselves only to beliefs and messages that reinforce their own -- which can lead to things like climate change denial in conservative circles.
"The conservative echo chamber -- Fox News, talk radio, conservative columnists and bloggers -- combine to create a 'bubble' in which many committed Republicans live, and when it comes to scientific issues we find that they literally create an 'alternative reality' in which human-caused climate change is a hoax," Dr. Riley Dunlap, an environmental sociologist at Oklahoma State University, told The Huffington Post in November. "The problem is that this conservative worldview is deeply at odds with empirical reality."
For the study, the researchers conducted a quantitative analysis of articles shared on Facebook related to either conspiracy theories or fact-based science news. They found that users tended to cluster within homogenous, polarized groups, and within those groups, to share the same types of content, perpetuating the circulation of similar ideas.
The researchers explain that this clustering is driven by confirmation bias -- the brain's natural tendency to seek out and interpret information in ways that confirm pre-existing beliefs. Social media, which allows us to carefully curate our news exposure, makes it easier than ever to indulge that bias.
"Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization," the study's authors write. "This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia."
Hopefully, researchers will be able to use this understanding to devise better guidelines for scientists and the media to reach the public about important scientific issues, Robert Brulle, a sociologist at Drexel University who has studied climate change denial, told The Washington Post.
“[Scientists and communicators] really need to take this kind of bifurcation of their audiences seriously,” he said. “Continued preaching to the choir is not going to work.”
What is going to work? Unfortunately, there's no simple solution.
The study also found that the most engaged social media users tend to have higher numbers of friends who share the same interests (and presumably few who don't).
"Under these settings, the problem is really difficult to solve," Quattrociocchi said.