We are bombarded daily with more information than at any point in human history. Even our youngest have 24/7 access to a world of knowledge. We have made information a part of our culture and our everyday lives. Especially in the age of the Internet, it is important to understand how people seek information and how they process it once they get it. For decades, psychologists have considered different models that attempt to describe the seeking and processing of information, and how this affects critical thought. One of these models, the Heuristic-Systematic Model (HSM), described by communications researcher Shelly Chaiken in the 1980s, views human information processing as a dual process: we can make quick, gut-feeling, and often biased judgments about information, or we can critically evaluate it, looking for evidence and weighing the alternatives. Human information processing is thus broken into heuristic and systematic seeking/processing. [You may recognize this dual process as what Daniel Kahneman describes as System 1 and System 2 thinking.]

As discussed in Part 1 of this article, the HSM has been highly successful in describing human cognition, especially in research on media consumption. The model rests on cognitive economy (the more effort you are willing to expend, the more deeply you are predicted to process the information, and vice versa) and information sufficiency (how much information you think you need on a topic predicts whether or not you will look closely at it), each of which predicts what style of thinking a person will engage in.

Out of this model also come a number of heuristics, or cognitive shortcuts, whose presence allows us to make quick judgments about information. For example, an article that cites a number of scientific references may lead you to consider the information credible without checking the references yourself. The critical thinker cannot escape cognitive heuristics, as they are ever-present, but they can be recognized, corrected for, and even harnessed.

For critical thinking to take place, there is a constant tension between how much effort you are willing to give and how much information you think you need about something. [Keep in mind that you can critically think about something and still be wrong.] This depth of processing is what is described by the HSM. But for a complete view of dual processing, two more factors complicate things: motivation and capacity.


The HSM outlines three major motivations for information seeking and processing, each of which predict outcomes for whether you will critically think about a topic or not:

·      Accuracy motivation: This is a motivation to obtain accurate and valid information in order to make a confident judgment about a topic. As it relates to information processing, when the accuracy motivation and cognitive resources (effort, capacity, etc.) are low, heuristic processing may be seen as the best way to satisfy accuracy goals. Conversely, when the accuracy motivation and cognitive resources are high, systematic processing is used and tends to instill greater judgmental confidence.

·      Defense motivation: This motivation is a desire to make judgments that accord with one’s material interests or identity-entangled beliefs. These “self-definitional” beliefs are those closely tied to the self: one’s values, identity, and attributes. Defense motivation affects information processing by creating selectivity in both heuristic and systematic processing. When defense motivation is low, encouraging heuristic processing, people selectively apply the heuristics that are congenial to their own beliefs and discard or ignore the ones that are not. When defense motivation and cognitive resources are high, even the typically in-depth and thorough nature of systematic processing becomes biased. Systematic processing under a defense motivation seeks reinforcement, not necessarily truthful information. In fact, contrasting information may be systematically scrutinized in an effort to tarnish its validity.

·      Impression motivation: This motivation is a desire to form judgments that will satisfy current social goals and is dependent upon perceived interpersonal consequences of expressing a particular judgment in a social context. Like defense motivation, it also leads to selective processing towards relevant, in this case social, goals.

All of these motivations operate in the model much like information sufficiency: a greater motivation encourages systematic processing and a lesser motivation encourages heuristic processing (e.g., the more motivated you are to obtain accurate information, the more likely you are to go about it systematically).

Accuracy motivation is pretty straightforward, as most of us have tried to find good evidence for a claim before. But for the skeptic, and for the promotion of science and critical thinking, I think understanding the defense motivation is vital for strategy’s sake. Like the well-known psychological misstep of confirmation bias, the defense motivation is a great obstacle to promoting science and debunking pseudoscience.

Based on the HSM, I can propose a bit of technique for science advocacy that may spill over into the debate that so often encompasses the “New Atheist” movement and other skeptical rhetoric. When arguing a claim, if that argument is posed in a way that will instill a defense motivation, it is likely to bias information processing. For example, imagine that you have been told that your own religious belief is silly and nonsensical. You then head over to the Internet to find out about this supposedly fallacious belief. The HSM predicts that the type of information you will seek is information that supports your belief against this attack. Whether or not that information is truthful, reinforcement is sought for your besieged belief. Further hindering things, any information that does not reinforce your belief will be scrutinized for no other reason than that it disagrees. This is not so much a knock on the New Atheists and the like, whom I think logically lay out arguments against the supernatural, as it is against oppositional rhetoric. From an information-processing standpoint, creating a defense motivation does more harm than good. (It should be noted that the predictions of the HSM are correlations and not hard rules; a person with a challenged belief may still seek information in an unbiased way.)

The answer to this problem would then be to instill a different type of motivation. Religious beliefs may still be challenged, for example, but in a way that promotes an accuracy motivation. If a person is motivated to seek truthful information in a systematic way, instead of being motivated to support his or her own beliefs with purposefully selected information, the message has a greater chance of getting through. Given the backfire effect described in the psychological literature, it is more effective to emphasize truthful information, or the seeking of truthful information, than to emphasize the falsity of some claim. For instance, explaining that there is no scientific evidence for the notion of “intelligent design” would be better than simply saying that believing in “intelligent design” is ridiculous. The goal is to explain the science to the true believer on its own terms rather than framing it as opposition to his or her belief.


The final factor in the Heuristic-Systematic Model is information capacity: the more capable you feel of dealing with the information on a topic, the more deeply you can process it, and vice versa. For example, if you want to look up accurate information about antioxidant supplements but feel like you don’t know where to look, which experts to ask, what constitutes good evidence, and so on, you are likely to heuristically process any information you find, because you do not have the capacity to process it deeply.

This obviously relates to the promotion of science literacy. The greater the capacity people have to process sometimes technical information, the more likely they are to think critically about it. However, this does not mean throwing more information at the public, as the somewhat defunct deficit model of science communication would predict (more information = more understanding). It means teaching the process of science and how to think scientifically, and giving the public the resources to interpret technical information themselves. In short, feeling confident that you can interpret science for yourself allows critical thinking to occur.

Thinking About Thinking

Information processing is complicated, but the main ideas are easy to understand. Information sufficiency, capacity, and motivation serve as the main determinants of critical thinking, as described by one of the most successful models of how people handle information. In a somewhat reductionist view, how people seek and process information distills into a combination of these factors.
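To make that distillation concrete, here is a minimal toy sketch of how the HSM's determinants might combine to predict a processing style. To be clear, the function name, the 0–1 scales, the thresholds, and the additive combination are all my own illustrative assumptions, not part of Chaiken's formal model, which states correlational tendencies rather than equations.

```python
def predicted_processing(sufficiency_gap, motivation, capacity):
    """Toy illustration (not Chaiken's formal model) of the HSM's determinants.

    sufficiency_gap: how far current knowledge falls short of the desired
                     confidence (0.0 = satisfied, 1.0 = large gap)
    motivation:      strength of the accuracy/defense/impression motive (0.0-1.0)
    capacity:        perceived ability to handle the information (0.0-1.0)
    Returns "systematic" or "heuristic".
    """
    # Capacity acts as a gate: someone who feels unable to evaluate the
    # information is predicted to fall back on heuristics regardless of drive.
    if capacity < 0.3:
        return "heuristic"
    # Otherwise, a larger sufficiency gap and a stronger motivation together
    # push toward effortful, systematic processing. The equal weighting and
    # 0.5 cutoff are arbitrary assumptions for illustration.
    drive = 0.5 * sufficiency_gap + 0.5 * motivation
    return "systematic" if drive > 0.5 else "heuristic"

# A motivated reader who feels able to evaluate the evidence:
print(predicted_processing(sufficiency_gap=0.8, motivation=0.9, capacity=0.7))
# The same reader without the background to judge the claims:
print(predicted_processing(sufficiency_gap=0.8, motivation=0.9, capacity=0.1))
```

The point of the sketch is only the shape of the model: no single factor decides the outcome, and low capacity can override even strong motivation.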

For the critical thinker, understanding information processing is just another weapon in the armamentarium against credulous thought. While knowledge of the biases and psychology of pseudoscience is now common in the skeptical community (confirmation bias, the backfire effect, wishful thinking, etc.), I think that knowledge of information processing is just as important, and should be elevated to a similar familiarity.


Kyle Hill is the JREF research fellow specializing in communication research and human information processing. He writes daily at the Science-Based Life blog and you can follow him on Twitter here.