This essay introduces current scientific perspectives on the "fake news" phenomenon through recent research and critically evaluates these studies by raising questions and offering further considerations on the subject. First, I give broader insight into current trends in fake news research; then a few recent studies are presented briefly, followed by evaluation and discussion.
Today we live in a world of "fake news", or in the so-called "post-truth era". The concern is not new, but only the widespread use of social media was able to put it on the global stage and, as such, into the spotlight of the media and of science (Jamieson, 2015). For further understanding, we should clarify the meaning of fake news. According to the Cambridge Dictionary, fake news consists of "false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke". This definition is particularly important for research on the topic, since the veracity of statements is sometimes questionable or debatable.
As a form of communication, the fake news problem can be approached from different viewpoints. This paper concentrates on the recipient's side, but I also briefly introduce a few key points related to the message itself. Concerning the presentation of content, the way information is served to us has a significant impact on our susceptibility (e.g. do we deal with terrorism through "war" or through "law enforcement"?), so framing plays a crucial role (Lakoff, 2010). Fazio, Brashier, Payne and Marsh (2015) narrowed this down further and found that repetition clearly matters; moreover, fluent processing – in other words, when a message is easy to process – frequently leads to the neglect of stored knowledge, regardless of cognitive ability. Our beliefs are therefore not entirely up to us: even a clearer font could influence them (Fazio et al., 2015).
Before we dive into the psychological and cognitive aspects of fake news susceptibility, we need to glance at the bigger picture. Naturally, the symptoms of the "post-truth era" partly derive from broader social processes. According to Lewandowsky, Ecker and Cook (2017), several trends may underlie the spread of fake news. First, (1) a decline in social capital and shifting values has supported the phenomenon for decades; it refers to a downturn in factors such as trust in people and institutions, empathy and civic engagement. The second and third trends go hand in hand: (2) growing financial inequality, which predicts (3) political polarization quite accurately (see also Andersen and Curtis, 2012). These expressions are self-evident at a high level but quickly become very complex on closer inspection; since the higher-level meanings suffice for our topic, we will not elaborate on them here. The fourth trend is (4) declining trust in science. The evidence is mixed on whether there is a politically driven, asymmetric polarization in trust in science, but the decline itself is undebatable (Vraga and Bode, 2017). Finally, it is unavoidable to mention the most obvious global trend: (5) the evolution of the media landscape, referring to radical changes in media consumption and usage (Lewandowsky et al., 2017). These points might broadly be evaluated as by-products – or at least correlates – of fast-paced individualization, social isolation and technological development in the Western world.
Several political and psychological studies tend to favor ideological explanations as the main mediating factor in susceptibility to fake news (e.g. Kahan, 2017; Garrett, Weeks & Neo, 2016; Hibbing, Smith & Alford, 2014); however, the findings remain controversial. Moreover, being liberal or conservative – the dichotomy that appears in most research – is a complex, high-level trait that is, in certain cases, not even constant. Here, I present a few studies which argue that other factors play the main role.
The research of Guess, Nagler and Tucker (2019) focused on socio-demographic correlates of fake news susceptibility. They analyzed the Facebook histories of 3,500 people during the 2016 U.S. elections; the participants were also asked to complete an online survey about their socio-demographic background and ideological orientation. The results suggest three main findings. First, sharing fake news is a very rare activity in the population: more than 90% of the sample shared no stories at all from fake news domains. Second, conservatives are more likely to share articles from fake news domains. This finding is in line with earlier results (e.g. Kahan, 2017; Garrett, Weeks & Neo, 2016; Hibbing, Smith & Alford, 2014), although the effect was not as pronounced as in other studies. Finally, the most robust finding of the study is that sharing fake news is most common in the 65+ age group: twice as common as in the second-oldest group (45-64), and seven times more common than in the youngest age group (19-28). Seniority may indeed go along with other traits (such as being more conservative), but the multivariate analysis showed significant differences as well (Guess et al., 2019). In another study, Pennycook and Rand (2018) also argue that motivated reasoning (e.g. ideological explanation) is not the main factor behind susceptibility to fake news. They used the Cognitive Reflection Test to measure participants' (N = 3,446) engagement in analytical reasoning. The findings show that susceptibility to misinformation is better explained by a lack of reasoning than by motivated reasoning. They concluded that analytical thinking is used to evaluate the truth of a news headline regardless of whether the headline was consistent with the reader's political ideology.
Therefore, they state that "lazy" thinking stands behind susceptibility, not so-called "partisan bias" – which basically means ideologically motivated reasoning (Pennycook and Rand, 2018). De keersmaecker and Roets (2017) investigated the topic from a different point of view: they studied what happens when we face dissonance between reality and news we have accepted as true. The research assumed that most people adjust their beliefs when they are proven wrong on an issue. Cognitive ability, examined using a specific subscale of the WAIS, was found to be the main mediating factor: individuals with lower cognitive ability adjusted their opinion less after their belief had been proven wrong than people with higher cognitive ability. This held regardless of the degree of need for closure or authoritarianism, which were measured as well. In conclusion, the authors state that the effect of incorrect information cannot simply be withdrawn by demonstrating its falsity (De keersmaecker and Roets, 2017).
Barfar (2019) set out to map cognitive and affective responses to political disinformation on Facebook. He conducted text analysis on approximately 2,100 political posts and evaluated the comments. The findings show that comments on political disinformation included significantly less analytic response than comments on true news. The results also show that incivility and anger appear more frequently under posts containing radical political misinformation and that, among the extremes, liberals showed more of both. However, greater anxiety was found under true news. The author concludes that certain behavioral styles cannot be strictly attributed to groups based on ideology (Barfar, 2019).
An extensive study (Bronstein, Pennycook, Bear, Rand and Cannon, 2019) also aimed at investigating correlates of belief in fake news. The research measured predisposition to delusion, cognitive style, dogmatism and religious fundamentalism with pen-and-paper tests and examined susceptibility to fake news experimentally. The results show that delusion-prone individuals, dogmatic people and religious fundamentalists are particularly susceptible to deceit by fake news. A mediation analysis was conducted as well, and it showed that these correlations are partially or fully explained by the level of engagement in analytic and actively open-minded thinking. The authors therefore suggest that delusion-proneness, dogmatism, religious fundamentalism and susceptibility to fake news may share a common origin: low levels of analytic and actively open-minded thinking. Accordingly, they propose that training in these domains may reduce exposure to fake news, in line with the principle of solving a problem by addressing its core (Bronstein et al., 2019).
Criticism and discussion
The above-mentioned studies were conducted in the United States or in Western Europe, so the findings mostly represent WEIRD countries; this may not be a problem in itself, as it aligns with the nature of the topic. The studies also frequently concentrate on sharing activity as a proxy for fake news susceptibility on social media. Sharing may be a good predictor, but it does not necessarily provide information about individuals who believe certain misinformation without sharing it. Another point for discussion is the fake news examples presented to participants in several studies. The material should be analyzed beforehand (as in Pennycook's and Rand's (2018) research), or political scientists should be involved in the research; otherwise, the choice of presented materials may be arbitrary. Relatedly, it would be advantageous for further studies to define the concept of "fake news" uniformly and to restrict it to factual falsities (e.g. not using complex and thus debatable ideas, or opinions and interpretations of such ideas, as examples of fake news) – in a nutshell: to separate it from politics and ideologies. First, this may reduce the chance that the studies are later used for political purposes; second, media censorship might one day be based on scientific results.
Despite these criticisms, analytical thinking (or the lack of it) may play an essential role in susceptibility to fake news (see Bronstein et al., 2019; Pennycook and Rand, 2018). It is clearly a strength of these studies that they concentrated on lower-level traits, which may explain the higher-level ones.
Clearly, the fake news problem is here and calls for a solution, mainly because of its role in societal and political decision-making. Yet studies rarely mention the other (perhaps more positive) side of social media and of fast-spreading data: thanks to the huge amount of available information, including news, the public worldwide has never been as well informed as it is today. This might have a balancing effect on societal decisions such as votes, though that would be another essay topic. It does imply, however, that if we could precisely measure the effect of fake news on certain societal decisions, we would learn about the severity of the issue. Today, we have only unfounded guesses about the extent of this effect.
Finally, further and broader investigation of fake news susceptibility among digital natives and digital immigrants is also worth considering, as it may reveal possible future trends in the misinformation problem. As Guess et al. (2019) showed, age matters when it comes to susceptibility.
Obviously, we cannot draw a clear conclusion or offer answers to the emerging questions of fake news today. It seems highly plausible, however, that broader and higher education could be a protective factor, alongside other lower-level endeavors such as providing education specifically on misinformation, training analytic and actively open-minded thinking, or awarding certain news sources some kind of "verified news" certificate.
Barfar, A. (2019). Cognitive and affective responses to political disinformation in Facebook. Computers in Human Behavior (accepted manuscript).
Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. Journal of Applied Research in Memory and Cognition, 8(1), 108–117.
De keersmaecker, J., & Roets, A. (2017). 'Fake news': Incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions. Intelligence, 65, 107–110.
Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.
Garrett, R. K., Weeks, B. E., & Neo, R. L. (2016). Driving a wedge between evidence and beliefs: How online ideological news exposure promotes political misperceptions. Journal of Computer-Mediated Communication, 21, 331–348.
Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1), eaau4586.
Hibbing, J. R., Smith, K. B., & Alford, J. R. (2014). Differences in negativity bias underlie variations in political ideology. Behavioral and Brain Sciences, 37, 297–350.
Jamieson, K. H. (2015). Implications of the demise of "fact" in political discourse. Proceedings of the American Philosophical Society, 159(1), 66–84.
Kahan, D. M. (2017). Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN Electronic Journal.
Lakoff, G. (2010). Why it matters how we frame the environment. Environmental Communication: A Journal of Nature and Culture, 4, 70–81.
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the "post-truth" era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
Pennycook, G., & Rand, D. G. (2018). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.
Vraga, E. K., & Bode, L. (2017). Leveraging institutions, educators, and networks to correct misinformation: A commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition, 6, 382–388.