
Module 7: Thinking, Reasoning, and Problem-Solving

This module is about how a solid working knowledge of psychological principles can help you to think more effectively, so you can succeed in school and life. You might be inclined to believe that—because you have been thinking for as long as you can remember, because you are able to figure out the solution to many problems, because you feel capable of using logic to argue a point, because you can evaluate whether the things you read and hear make sense—you do not need any special training in thinking. But this, of course, is one of the key barriers to helping people think better. If you do not believe that there is anything wrong, why try to fix it?

The human brain is indeed a remarkable thinking machine, capable of amazing, complex, creative, logical thoughts. Why, then, are we telling you that you need to learn how to think? Mainly because one major lesson from cognitive psychology is that these capabilities of the human brain are relatively infrequently realized. Many psychologists believe that people are essentially “cognitive misers.” It is not that we are lazy, but that we have a tendency to expend the least amount of mental effort necessary. Although you may not realize it, it actually takes a great deal of energy to think. Careful, deliberative reasoning and critical thinking are very difficult. Because we seem to be successful without going to the trouble of using these skills well, it feels unnecessary to develop them. As you shall see, however, there are many pitfalls in the cognitive processes described in this module. When people do not devote extra effort to learning and improving reasoning, problem solving, and critical thinking skills, they make many errors.

As is true for memory, if you develop the cognitive skills presented in this module, you will be more successful in school. It is important that you realize, however, that these skills will help you far beyond school, even more so than a good memory will. Although it is somewhat useful to have a good memory, ten years from now no potential employer will care how many questions you got right on multiple choice exams during college. All of them will, however, recognize whether you are a logical, analytical, critical thinker. With these thinking skills, you will be an effective, persuasive communicator and an excellent problem solver.

The module begins by describing different kinds of thought and knowledge, especially conceptual knowledge and critical thinking. An understanding of these differences will be valuable as you progress through school and encounter different assignments that require you to tap into different kinds of knowledge. The second section covers deductive and inductive reasoning, which are processes we use to construct and evaluate strong arguments. They are essential skills to have whenever you are trying to persuade someone (including yourself) of some point, or to respond to someone’s efforts to persuade you. The module ends with a section about problem solving. A solid understanding of the key processes involved in problem solving will help you to handle many daily challenges.

7.1. Different kinds of thought and knowledge

7.2. Reasoning and Judgment

7.3. Problem Solving

READING WITH PURPOSE

Remember and understand.

By reading and studying Module 7, you should be able to remember and describe:

  • Concepts and inferences (7.1)
  • Procedural knowledge (7.1)
  • Metacognition (7.1)
  • Characteristics of critical thinking: skepticism; identifying biases, distortions, omissions, and assumptions; reasoning and problem solving skills (7.1)
  • Reasoning: deductive reasoning, deductively valid argument, inductive reasoning, inductively strong argument, availability heuristic, representativeness heuristic (7.2)
  • Fixation: functional fixedness, mental set (7.3)
  • Algorithms, heuristics, and the role of confirmation bias (7.3)
  • Effective problem solving sequence (7.3)

By reading and thinking about how the concepts in Module 7 apply to real life, you should be able to:

  • Identify which type of knowledge a piece of information is (7.1)
  • Recognize examples of deductive and inductive reasoning (7.2)
  • Recognize judgments that have probably been influenced by the availability heuristic (7.2)
  • Recognize examples of problem solving heuristics and algorithms (7.3)

Analyze, Evaluate, and Create

By reading and thinking about Module 7, participating in classroom activities, and completing out-of-class assignments, you should be able to:

  • Use the principles of critical thinking to evaluate information (7.1)
  • Explain whether examples of reasoning arguments are deductively valid or inductively strong (7.2)
  • Outline how you could try to solve a problem from your life using the effective problem solving sequence (7.3)

7.1. Different kinds of thought and knowledge

  • Take a few minutes to write down everything that you know about dogs.
  • Do you believe that:
      • Psychic ability exists?
      • Hypnosis is an altered state of consciousness?
      • Magnet therapy is effective for relieving pain?
      • Aerobic exercise is an effective treatment for depression?
      • UFOs from outer space have visited Earth?

On what do you base your belief or disbelief for the questions above?

Of course, we all know what is meant by the words think and knowledge. You probably also realize that they are not unitary concepts; there are different kinds of thought and knowledge. In this section, let us look at some of these differences. If you are familiar with these different kinds of thought and pay attention to them in your classes, it will help you to focus on the right goals, learn more effectively, and succeed in school. Different assignments and requirements in school call on you to use different kinds of knowledge or thought, so it will be very helpful for you to learn to recognize them (Anderson et al., 2001).

Factual and conceptual knowledge

Module 5 introduced the idea of declarative memory, which is composed of facts and episodes. If you have ever played a trivia game or watched Jeopardy on TV, you realize that the human brain is able to hold an extraordinary number of facts. Likewise, you realize that each of us has an enormous store of episodes, essentially facts about events that happened in our own lives. It may be difficult to keep that in mind when we are struggling to retrieve one of those facts while taking an exam, however. Part of the problem is that, in contradiction to the advice from Module 5, many students continue to try to memorize course material as a series of unrelated facts (picture a history student simply trying to memorize history as a set of unrelated dates without any coherent story tying them together). Facts in the real world are not random and unorganized, however. It is the way that they are organized that constitutes a second key kind of knowledge, conceptual.

Concepts are nothing more than our mental representations of categories of things in the world. For example, think about dogs. When you do this, you might remember specific facts about dogs, such as they have fur and they bark. You may also recall dogs that you have encountered and picture them in your mind. All of this information (and more) makes up your concept of dog. You can have concepts of simple categories (e.g., triangle), complex categories (e.g., small dogs that sleep all day, eat out of the garbage, and bark at leaves), kinds of people (e.g., psychology professors), events (e.g., birthday parties), and abstract ideas (e.g., justice). Gregory Murphy (2002) refers to concepts as the “glue that holds our mental life together” (p. 1). Very simply, summarizing the world by using concepts is one of the most important cognitive tasks that we do. Our conceptual knowledge  is  our knowledge about the world. Individual concepts are related to each other to form a rich interconnected network of knowledge. For example, think about how the following concepts might be related to each other: dog, pet, play, Frisbee, chew toy, shoe. Or, of more obvious use to you now, how these concepts are related: working memory, long-term memory, declarative memory, procedural memory, and rehearsal? Because our minds have a natural tendency to organize information conceptually, when students try to remember course material as isolated facts, they are working against their strengths.

One last important point about concepts is that they allow you to instantly know a great deal of information about something. For example, if someone hands you a small red object and says, “here is an apple,” they do not have to tell you, “it is something you can eat.” You already know that you can eat it because it is true by virtue of the fact that the object is an apple; this is called drawing an  inference , assuming that something is true on the basis of your previous knowledge (for example, of category membership or of how the world works) or logical reasoning.

Procedural knowledge

Physical skills, such as tying your shoes, doing a cartwheel, and driving a car (or doing all three at the same time, but don’t try this at home) are certainly a kind of knowledge. They are procedural knowledge, the same idea as procedural memory that you saw in Module 5. Mental skills, such as reading, debating, and planning a psychology experiment, are procedural knowledge as well. In short, procedural knowledge is the knowledge of how to do something (Cohen & Eichenbaum, 1993).

Metacognitive knowledge

Floyd used to think that he had a great memory. Now, he has a better memory. Why? Because he finally realized that his memory was not as great as he once thought it was. Because Floyd eventually learned that he often forgets where he put things, he finally developed the habit of putting things in the same place. (Unfortunately, he did not learn this lesson before losing at least 5 watches and a wedding ring.) Because he finally realized that he often forgets to do things, he finally started using the To Do list app on his phone. And so on. Floyd’s insights about the real limitations of his memory have allowed him to remember things that he used to forget.

All of us have knowledge about the way our own minds work. You may know that you have a good memory for people’s names and a poor memory for math formulas. Someone else might realize that they have difficulty remembering to do things, like stopping at the store on the way home. Others still know that they tend to overlook details. This knowledge about our own thinking is actually quite important; it is called metacognitive knowledge, or metacognition. Like other kinds of thinking skills, it is subject to error. For example, in unpublished research, one of the authors surveyed about 120 General Psychology students on the first day of the term. Among other questions, the students were asked to predict their grade in the class and to report their current grade point average. Two-thirds of the students predicted that their grade in the course would be higher than their GPA. (The reality is that at our college, students tend to earn lower grades in psychology than their overall GPA.) Another example: students routinely report that they thought they had done well on an exam, only to discover, to their dismay, that they were wrong (more on that important problem in a moment). Both errors reveal a breakdown in metacognition.

The Dunning-Kruger Effect

In general, most college students probably do not study enough. For example, using data from the National Survey of Student Engagement, Fosnacht, McCormack, and Lerma (2018) reported that first-year students at 4-year colleges in the U.S. averaged less than 14 hours per week preparing for classes. The typical suggestion is that you should spend two hours outside of class for every hour in class, or 24 to 30 hours per week for a full-time student. Clearly, students in general are nowhere near that recommended mark. Many observers, including some faculty, believe that this shortfall is a result of students being too busy or lazy. Now, it may be true that many students are too busy, with work and family obligations, for example. Others are not particularly motivated in school, and therefore might correctly be labeled lazy. A third possible explanation, however, is that some students might not think they need to spend this much time. And this is a matter of metacognition. Consider the scenario that we mentioned above, students thinking they had done well on an exam only to discover that they did not. Justin Kruger and David Dunning examined scenarios very much like this in 1999. Kruger and Dunning gave research participants tests measuring humor, logic, and grammar. Then, they asked the participants to assess their own abilities and test performance in these areas. They found that participants in general tended to overestimate their abilities, already a problem with metacognition. Importantly, the participants who scored the lowest overestimated their abilities the most. Specifically, participants who scored in the bottom quarter (averaging in the 12th percentile) thought they had scored in the 62nd percentile. This has become known as the Dunning-Kruger effect. Many individual faculty members have replicated these results with their own students on their course exams, including the authors of this book. Think about it. Some students who just took an exam and performed poorly believe that they did well before seeing their score. It seems very likely that these are the very same students who stopped studying the night before because they thought they were “done.” Quite simply, it is not just that they did not know the material. They did not know that they did not know the material. That is poor metacognition.

In order to develop good metacognitive skills, you should continually monitor your thinking and seek frequent feedback on the accuracy of your thinking (Medina, Castleberry, & Persky, 2017). For example, in classes get in the habit of predicting your exam grades. As soon as possible after taking an exam, try to find out which questions you missed and try to figure out why. If you do this soon enough, you may be able to recall the way it felt when you originally answered the question. Did you feel confident that you had answered the question correctly? If so, you have just discovered an opportunity to improve your metacognition. Be on the lookout for that feeling and respond with caution.
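One concrete way to act on this advice is to keep a running record of your predictions. As a purely hypothetical illustration (the numbers are invented, and this sketch is ours, not part of the module), a few lines of Python show how such a record can reveal a consistent pattern of overconfidence:

```python
# Hypothetical predicted vs. actual exam scores for one student (invented data)
predicted = [92, 88, 95, 90]
actual    = [78, 85, 72, 88]

# A positive mean signed error indicates systematic overconfidence
bias = sum(p - a for p, a in zip(predicted, actual)) / len(predicted)
print(bias)  # 10.5: this student overpredicts by about ten points
```

A student who keeps this kind of record has exactly the feedback loop the paragraph above recommends: each new exam either confirms or corrects their sense of how well they know the material.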

concept: a mental representation of a category of things in the world

Dunning-Kruger effect: the tendency of less competent individuals to overestimate their abilities more than more competent individuals do

inference: an assumption about the truth of something that is not stated; inferences come from our prior knowledge and experience, and from logical reasoning

metacognition: knowledge about one’s own cognitive processes; thinking about your thinking

Critical thinking

One particular kind of knowledge or thinking skill that is related to metacognition is  critical thinking (Chew, 2020). You may have noticed that critical thinking is an objective in many college courses, and thus it could be a legitimate topic to cover in nearly any college course. It is particularly appropriate in psychology, however. As the science of (behavior and) mental processes, psychology is obviously well suited to be the discipline through which you should be introduced to this important way of thinking.

More importantly, there is a particular need to use critical thinking in psychology. We are all, in a way, experts in human behavior and mental processes, having engaged in them literally since birth. Thus, perhaps more than in any other class, students typically approach psychology with very clear ideas and opinions about its subject matter. That is, students already “know” a lot about psychology. The problem is, “it ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so” (Ward, quoted in Gilovich 1991). Indeed, many of students’ preconceptions about psychology are just plain wrong. Randolph Smith (2002) wrote a book about critical thinking in psychology called  Challenging Your Preconceptions,  highlighting this fact. On the other hand, many of students’ preconceptions about psychology are just plain right! But wait, how do you know which of your preconceptions are right and which are wrong? And when you come across a research finding or theory in this class that contradicts your preconceptions, what will you do? Will you stick to your original idea, discounting the information from the class? Will you immediately change your mind? Critical thinking can help us sort through this confusing mess.

But what is critical thinking? The goal of critical thinking is simple to state (but extraordinarily difficult to achieve): it is to be right, to draw the correct conclusions, to believe in things that are true and to disbelieve things that are false. We will provide two definitions of critical thinking (or, if you like, one large definition with two distinct parts). First, a more conceptual one: critical thinking is thinking like a scientist in your everyday life (Schmaltz, Jansen, & Wenckowski, 2017). Our second definition is more operational; it is simply a list of skills that are essential to be a critical thinker. Critical thinking entails solid reasoning and problem solving skills; skepticism; and an ability to identify biases, distortions, omissions, and assumptions. Excellent deductive and inductive reasoning and problem solving skills contribute to critical thinking. So, you can consider the subject matter of sections 7.2 and 7.3 to be part of critical thinking. Because we will be devoting considerable time to those concepts in the rest of the module, let us begin with a discussion of the other aspects of critical thinking.

Let’s address the first part of the definition. Scientists form hypotheses, or predictions about some possible future observations. Then, they collect data, or information (think of this as making those future observations). They do their best to make unbiased observations using reliable techniques that have been verified by others. Then, and only then, they draw a conclusion about what those observations mean. Oh, and do not forget the most important part. “Conclusion” is probably not the most appropriate word, because the conclusion is only tentative. A scientist must always be prepared for someone else to come along and produce new observations that require a new conclusion to be drawn. Wow! If you like to be right, you could do a lot worse than using a process like this.

A Critical Thinker’s Toolkit 

Now for the second part of the definition. Good critical thinkers (and scientists) rely on a variety of tools to evaluate information. Perhaps the most recognizable tool for critical thinking is  skepticism (and this term provides the clearest link to the thinking like a scientist definition, as you are about to see). Some people intend it as an insult when they call someone a skeptic. But if someone calls you a skeptic, if they are using the term correctly, you should consider it a great compliment. Simply put, skepticism is a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided. People from Missouri should recognize this principle, as Missouri is known as the Show-Me State. As a skeptic, you are not inclined to believe something just because someone said so, because someone else believes it, or because it sounds reasonable. You must be persuaded by high quality evidence.

Of course, if that evidence is produced, you have a responsibility as a skeptic to change your belief. Failure to change a belief in the face of good evidence is not skepticism; skepticism has open mindedness at its core. M. Neil Browne and Stuart Keeley (2018) use the term weak sense critical thinking to describe critical thinking behaviors that are used only to strengthen a prior belief. Strong sense critical thinking, on the other hand, has as its goal reaching the best conclusion. Sometimes that means strengthening your prior belief, but sometimes it means changing your belief to accommodate the better evidence.

Many times, a failure to think critically or weak sense critical thinking is related to a  bias , an inclination, tendency, leaning, or prejudice. Everybody has biases, but many people are unaware of them. Awareness of your own biases gives you the opportunity to control or counteract them. Unfortunately, however, many people are happy to let their biases creep into their attempts to persuade others; indeed, it is a key part of their persuasive strategy. To see how these biases influence messages, just look at the different descriptions and explanations of the same events given by people of different ages or income brackets, or conservative versus liberal commentators, or by commentators from different parts of the world. Of course, to be successful, these people who are consciously using their biases must disguise them. Even undisguised biases can be difficult to identify, so disguised ones can be nearly impossible.

Here are some common sources of biases:

  • Personal values and beliefs.  Some people believe that human beings are basically driven to seek power and that they are typically in competition with one another over scarce resources. These beliefs are similar to the world-view that political scientists call “realism.” Other people believe that human beings prefer to cooperate and that, given the chance, they will do so. These beliefs are similar to the world-view known as “idealism.” For many people, these deeply held beliefs can influence, or bias, their interpretations of such wide ranging situations as the behavior of nations and their leaders or the behavior of the driver in the car ahead of you. For example, if your worldview is that people are typically in competition and someone cuts you off on the highway, you may assume that the driver did it purposely to get ahead of you. Other types of beliefs about the way the world is or the way the world should be, for example, political beliefs, can similarly become a significant source of bias.
  • Racism, sexism, ageism and other forms of prejudice and bigotry.  These are, sadly, a common source of bias in many people. They are essentially a special kind of “belief about the way the world is.” These beliefs—for example, that women do not make effective leaders—lead people to ignore contradictory evidence (examples of effective women leaders, or research that disputes the belief) and to interpret ambiguous evidence in a way consistent with the belief.
  • Self-interest.  When particular people benefit from things turning out a certain way, they can sometimes be very susceptible to letting that interest bias them. For example, a company that will earn a profit if it sells its product may have a bias in the way that it gives information about the product. A union that will benefit if its members get a generous contract might have a bias in the way it presents information about salaries at competing organizations. (Note that our inclusion of examples describing both companies and unions is an explicit attempt to control for our own personal biases.) Home buyers are often dismayed to discover that they purchased their dream house from someone whose self-interest led them to lie about flooding problems in the basement or back yard. This principle, the biasing power of self-interest, is likely what led to the famous phrase caveat emptor (let the buyer beware).

Knowing that these types of biases exist will help you evaluate evidence more critically. Do not forget, though, that people are not always keen to let you discover the sources of biases in their arguments. For example, companies or political organizations can sometimes disguise their support of a research study by contracting with a university professor, who comes complete with a seemingly unbiased institutional affiliation, to conduct the study.

People’s biases, conscious or unconscious, can lead them to make omissions, distortions, and assumptions that undermine our ability to correctly evaluate evidence. It is essential that you look for these elements. Always ask, what is missing, what is not as it appears, and what is being assumed here? For example, consider this (fictional) chart from an ad reporting customer satisfaction at 4 local health clubs.

[Chart: customer satisfaction ratings at four local health clubs, shown without a labeled vertical scale]

Clearly, from the results of the chart, one would be tempted to give Club C a try, as customer satisfaction is much higher than for the other 3 clubs.

There are so many distortions and omissions in this chart, however, that it is actually quite meaningless. First, how was satisfaction measured? Do the bars represent responses to a survey? If so, how were the questions asked? Most importantly, where is the missing scale for the chart? Although the differences look quite large, are they really?

Well, here is the same chart, with a different scale, this time labeled:

[The same customer satisfaction chart, redrawn with a labeled scale]

Club C is not so impressive any more, is it? In fact, all of the health clubs have customer satisfaction ratings (whatever that means) between 85% and 88%. In the first chart, the entire scale of the graph included only the percentages between 83 and 89. This “judicious” choice of scale—some would call it a distortion—and the omission of the scale from the chart make the tiny differences among the clubs seem important.
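To see just how much the truncated axis inflates the differences, consider a little arithmetic (a sketch of our own, not part of the original text; the 83-89 axis range and the 85%-88% ratings come from the discussion above, and an honest axis starting at zero is assumed for comparison):

```python
# Satisfaction ratings from the text: the clubs range from 85% to 88%
low, high = 85, 88

# Apparent bar-height ratio on the truncated axis of the first chart,
# which starts at 83 rather than 0
truncated_ratio = (high - 83) / (low - 83)   # 5 / 2 = 2.5

# Apparent bar-height ratio on an honest axis starting at zero
honest_ratio = high / low                    # roughly 1.035

print(truncated_ratio, round(honest_ratio, 3))  # 2.5 1.035
```

On the truncated chart, Club C’s bar looks two and a half times as tall as the lowest-rated club’s; on an honest chart, the bars would be nearly indistinguishable.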

Also, in order to be a critical thinker, you need to learn to pay attention to the assumptions that underlie a message. Let us briefly illustrate the role of assumptions by touching on some people’s beliefs about the criminal justice system in the US. Some believe that a major problem with our judicial system is that many criminals go free because of legal technicalities. Others believe that a major problem is that many innocent people are convicted of crimes. The simple fact is, both types of errors occur. A person’s conclusion about which flaw in our judicial system is the greater tragedy is based on an assumption about which of these is the more serious error (letting the guilty go free or convicting the innocent). This type of assumption is called a value assumption (Browne and Keeley, 2018). It reflects the differences in values that people develop, differences that may lead us to disregard valid evidence that does not fit in with our particular values.

Oh, by the way, some students probably noticed this already, but the seven tips for evaluating information that we shared in Module 1 are related to this section. Actually, they are part of it. The tips are, to a very large degree, a set of ideas you can use to help you identify biases, distortions, omissions, and assumptions. If you do not remember them, we strongly recommend you take a few minutes to review them.

skepticism: a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided

bias: an inclination, tendency, leaning, or prejudice

  • Which of your beliefs (or disbeliefs) from the Activate exercise for this section were derived from a process of critical thinking? If some of your beliefs were not based on critical thinking, are you willing to reassess these beliefs? If the answer is no, why do you think that is? If the answer is yes, what concrete steps will you take?

7.2. Reasoning and Judgment

  • What percentage of kidnappings are committed by strangers?
  • Which area of the house is riskiest: kitchen, bathroom, or stairs?
  • What is the most common cancer in the US?
  • What percentage of workplace homicides are committed by co-workers?

An essential set of procedural thinking skills is reasoning, the ability to generate and evaluate solid conclusions from a set of statements or evidence. You should note that these conclusions (when they are generated instead of being evaluated) are one key type of inference that we described in Section 7.1. There are two main types of reasoning, deductive and inductive.

Deductive reasoning

Suppose your teacher tells you that if you get an A on the final exam in a course, you will get an A for the whole course. Then, you get an A on the final exam. What will your final course grade be? Most people can see instantly that you can conclude with certainty that you will get an A for the course. This is a type of reasoning called deductive reasoning, which is defined as reasoning in which a conclusion is guaranteed to be true as long as the statements leading to it are true. The three statements can be listed as an argument, with two beginning statements and a conclusion:

Statement 1: If you get an A on the final exam, you will get an A for the course

Statement 2: You get an A on the final exam

Conclusion: You will get an A for the course

This particular arrangement, in which true beginning statements lead to a guaranteed true conclusion, is known as a deductively valid argument. Although deductive reasoning is often the subject of abstract, brain-teasing, puzzle-like word problems, it is actually an extremely important type of everyday reasoning. It is just hard to recognize sometimes. For example, imagine that you are looking for your car keys and you realize that they are either in the kitchen drawer or in your book bag. After looking in the kitchen drawer, you instantly know that they must be in your book bag. That conclusion results from a simple deductive reasoning argument. In addition, solid deductive reasoning skills are necessary for you to succeed in the sciences, philosophy, math, computer programming, and any endeavor involving the use of logic to persuade others to your point of view or to evaluate others’ arguments.

Cognitive psychologists, and before them philosophers, have been quite interested in deductive reasoning, not so much for its practical applications, but for the insights it can offer them about the ways that human beings think. One of the early ideas to emerge from the examination of deductive reasoning is that people learn (or develop) mental versions of rules that allow them to solve these types of reasoning problems (Braine, 1978; Braine, Reiser, & Rumain, 1984). The best way to see this point of view is to realize that there are different possible rules, and some of them are very simple. For example, consider this rule of logic:

p or q

not p

therefore q

Logical rules are often presented abstractly, as letters, in order to imply that they can be used in very many specific situations. Here is a concrete version of the same rule:

I’ll either have pizza or a hamburger for dinner tonight (p or q)

I won’t have pizza (not p)

Therefore, I’ll have a hamburger (therefore q)

This kind of reasoning seems so natural, so easy, that it is quite plausible that we would use a version of this rule in our daily lives. At least, it seems more plausible than some of the alternative possibilities—for example, that we need to have experience with the specific situation (pizza or hamburger, in this case) in order to solve this type of problem easily. So perhaps there is a form of natural logic (Rips, 1990) that contains very simple versions of logical rules. When we are faced with a reasoning problem that maps onto one of these rules, we use the rule.
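Incidentally, the claim that an argument form like this is deductively valid can be checked mechanically: an argument is valid exactly when no assignment of truth values makes all of its premises true while leaving the conclusion false. As a side illustration (our own sketch, not part of the original module; the function and variable names are invented), here is a minimal Python brute-force check of the pizza-or-hamburger rule:

```python
from itertools import product

def valid(premises, conclusion):
    """Deductively valid = no row of the truth table makes every
    premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample row
    return True

# "p or q; not p; therefore q" -- the pizza-or-hamburger rule
premises = [lambda p, q: p or q, lambda p, q: not p]
conclusion = lambda p, q: q
print(valid(premises, conclusion))  # prints True: the rule is valid
```

Because validity depends only on the form of the argument, swapping pizza and hamburgers for any other pair of propositions leaves the result unchanged.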

But be very careful; things are not always as easy as they seem. Even these simple rules are not so simple. For example, consider the following rule. Many people fail to realize that this rule is just as valid as the pizza or hamburger rule above.

if p, then q

not q

therefore, not p

Concrete version:

If I eat dinner, then I will have dessert

I did not have dessert

Therefore, I did not eat dinner

The simple fact is, it can be very difficult for people to apply rules of deductive logic correctly; as a result, they make many errors when trying to do so. Is this a deductively valid argument or not?

Students who like school study a lot

Students who study a lot get good grades

Jane does not like school

Therefore, Jane does not get good grades

Many people are surprised to discover that this is not a logically valid argument; the conclusion is not guaranteed to be true from the beginning statements. Although the first statement says that students who like school study a lot, it does NOT say that students who do not like school do not study a lot. In other words, it may very well be possible to study a lot without liking school. Even people who sometimes get problems like this right might not be using the rules of deductive reasoning. Instead, they might simply be reasoning from examples they know, in this case, remembering instances of people who get good grades despite not liking school.
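The invalidity can be demonstrated by brute force. In this sketch (our own propositional encoding, treating each statement as a simple implication about Jane), a search over all eight truth assignments turns up cases in which every premise is true but the conclusion is false:

```python
from itertools import product

# L = Jane likes school, S = Jane studies a lot, G = Jane gets good grades
premises = [
    lambda L, S, G: (not L) or S,   # liking school implies studying a lot
    lambda L, S, G: (not S) or G,   # studying a lot implies good grades
    lambda L, S, G: not L,          # Jane does not like school
]
conclusion = lambda L, S, G: not G  # Jane does not get good grades

for L, S, G in product([True, False], repeat=3):
    if all(p(L, S, G) for p in premises) and not conclusion(L, S, G):
        print(f"Counterexample: likes={L}, studies={S}, good grades={G}")
```

The search prints counterexamples such as likes=False, studies=True, good grades=True: Jane studies a lot, and earns good grades, without liking school, a situation consistent with every premise.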

Making deductive reasoning even more difficult is the fact that there are two important properties that an argument may have. One, it can be valid or invalid (meaning that the conclusion does or does not follow logically from the statements leading up to it). Two, an argument (or more correctly, its conclusion) can be true or false. Here is an example of an argument that is logically valid, but has a false conclusion (at least we think it is false).

Either you are eleven feet tall or the Grand Canyon was created by a spaceship crashing into the earth.

You are not eleven feet tall

Therefore the Grand Canyon was created by a spaceship crashing into the earth

This argument has the exact same form as the pizza or hamburger argument above, making it deductively valid. The conclusion is so false, however, that it is absurd (of course, the reason the conclusion is false is that the first statement is false). When people are judging arguments, they tend not to observe the difference between deductive validity and the empirical truth of statements or conclusions. If the elements of an argument happen to be true, people are likely to judge the argument logically valid; if the elements are false, they will very likely judge it invalid (Markovits & Bouffard-Bouchard, 1992; Moshman & Franks, 1986). Thus, it seems a stretch to say that people are using these logical rules to judge the validity of arguments. Many psychologists believe that most people actually have very limited deductive reasoning skills (Johnson-Laird, 1999). They argue that when faced with a problem for which deductive logic is required, people resort to some simpler technique, such as matching terms that appear in the statements and the conclusion (Evans, 1982). This might not seem like a problem, but consider what happens when reasoners believe that the elements are true and they happen to be wrong: they would believe that they are using a form of reasoning that guarantees a correct conclusion and yet still be wrong.

deductive reasoning :  a type of reasoning in which the conclusion is guaranteed to be true any time the statements leading up to it are true

argument :  a set of statements in which the beginning statements lead to a conclusion

deductively valid argument :  an argument for which true beginning statements guarantee that the conclusion is true

Inductive reasoning and judgment

Every day, you make many judgments about the likelihood of one thing or another. Whether you realize it or not, you are practicing  inductive reasoning   on a daily basis. In inductive reasoning arguments, a conclusion is likely whenever the statements preceding it are true. The first thing to notice about inductive reasoning is that, by definition, you can never be sure about your conclusion; you can only estimate how likely the conclusion is. Inductive reasoning may lead you to focus on Memory Encoding and Recoding when you study for the exam, but it is possible the instructor will ask more questions about Memory Retrieval instead. Unlike deductive reasoning, the conclusions you reach through inductive reasoning are only probable, not certain. That is why scientists consider inductive reasoning weaker than deductive reasoning. But imagine how hard it would be for us to function if we could not act unless we were certain about the outcome.

Inductive reasoning can be represented as logical arguments consisting of statements and a conclusion, just as deductive reasoning can be. In an inductive argument, you are given some statements and a conclusion (or you are given some statements and must draw a conclusion). An argument is  inductively strong   if the conclusion would be very probable whenever the statements are true. So, for example, here is an inductively strong argument:

  • Statement #1: The forecaster on Channel 2 said it is going to rain today.
  • Statement #2: The forecaster on Channel 5 said it is going to rain today.
  • Statement #3: It is very cloudy and humid.
  • Statement #4: You just heard thunder.
  • Conclusion (or judgment): It is going to rain today.

Think of the statements as evidence, on the basis of which you will draw a conclusion. So, based on the evidence presented in the four statements, it is very likely that it will rain today. Will it definitely rain today? Certainly not. We can all think of times that the weather forecaster was wrong.

A true story: Some years ago, a psychology student was watching a baseball playoff game between the St. Louis Cardinals and the Los Angeles Dodgers. A graphic on the screen had just informed the audience that the Cardinal at bat, (Hall of Fame shortstop) Ozzie Smith, a switch hitter batting left-handed for this plate appearance, had never, in nearly 3000 career at-bats, hit a home run left-handed. The student, who had just learned about inductive reasoning in his psychology class, turned to his companion (a Cardinals fan) and smugly said, “It is an inductively strong argument that Ozzie Smith will not hit a home run.” He turned back to face the television just in time to watch the ball sail over the right field fence for a home run. Although the student felt foolish at the time, he was not wrong. It was an inductively strong argument; 3000 at-bats is an awful lot of evidence suggesting that the Wizard of Oz (as he was known) would not be hitting one out of the park (think of each at-bat without a home run as a statement in an inductive argument). Sadly (for the die-hard Cubs fan and Cardinals-hating student), despite the strength of the argument, the conclusion was wrong.

Given the possibility that we might draw an incorrect conclusion even with an inductively strong argument, we really want to be sure that we do, in fact, make inductively strong arguments. If we judge something probable, it had better be probable. If we judge something nearly impossible, it had better not happen. Think of inductive reasoning, then, as making reasonably accurate judgments of the probability of some conclusion given a set of evidence.

We base many decisions in our lives on inductive reasoning. For example:

Statement #1: Psychology is not my best subject

Statement #2: My psychology instructor has a reputation for giving difficult exams

Statement #3: My first psychology exam was much harder than I expected

Judgment: The next exam will probably be very difficult.

Decision: I will study tonight instead of watching Netflix.

Some other examples of judgments that people commonly make in a school context include judgments of the likelihood that:

  • A particular class will be interesting/useful/difficult
  • You will be able to finish writing a paper by next week if you go out tonight
  • Your laptop’s battery will last through the next trip to the library
  • You will not miss anything important if you skip class tomorrow
  • Your instructor will not notice if you skip class tomorrow
  • You will be able to find a book that you will need for a paper
  • There will be an essay question about Memory Encoding on the next exam

Tversky and Kahneman (1983) recognized that there are two general ways that we might make these judgments; they termed them extensional (i.e., following the laws of probability) and intuitive (i.e., using shortcuts or heuristics, see below). We will use a similar distinction between Type 1 and Type 2 thinking, as described by Keith Stanovich and his colleagues (Evans and Stanovich, 2013; Stanovich and West, 2000). Type 1 thinking is fast, automatic, effortless, and emotional. In fact, it is hardly fair to call it reasoning at all, as judgments just seem to pop into one’s head. Type 2 thinking, on the other hand, is slow, effortful, and logical. So obviously, it is more likely to lead to a correct judgment, or an optimal decision. The problem is, we tend to over-rely on Type 1. Now, we are not saying that Type 2 is the right way to go for every decision or judgment we make. It seems a bit much, for example, to engage in a step-by-step logical reasoning procedure to decide whether we will have chicken or fish for dinner tonight.

Many bad decisions in some very important contexts, however, can be traced back to poor judgments of the likelihood of certain risks or outcomes that result from the use of Type 1 when a more logical reasoning process would have been more appropriate. For example:

Statement #1: It is late at night.

Statement #2: Albert has been drinking beer for the past five hours at a party.

Statement #3: Albert is not exactly sure where he is or how far away home is.

Judgment: Albert will have no difficulty walking home.

Decision: He walks home alone.

As you can see in this example, the three statements backing up the judgment do not really support it. In other words, this argument is not inductively strong because it is based on judgments that ignore the laws of probability. What are the chances that someone facing these conditions will be able to walk home alone easily? And one need not be drunk to make poor decisions based on judgments that just pop into our heads.

The truth is that many of our probability judgments do not come very close to what the laws of probability say they should be. Think about it. In order for us to reason in accordance with these laws, we would need to know the laws of probability, which would allow us to calculate the relationship between particular pieces of evidence and the probability of some outcome (i.e., how much the likelihood should change given a piece of evidence). Then we would have to do these heavy math calculations in our heads. After all, that is what Type 2 requires. Needless to say, even if we were motivated, we often do not even know how to apply Type 2 reasoning in many cases.

So what do we do when we don’t have the knowledge, skills, or time required to make the correct mathematical judgment? Do we hold off and wait until we can get better evidence? Do we read up on probability and fire up our calculator app so we can compute the correct probability? Of course not. We rely on Type 1 thinking. We “wing it.” That is, we come up with a likelihood estimate using some means at our disposal. Psychologists use the term heuristic to describe the type of “winging it” we are talking about. A  heuristic   is a shortcut strategy that we use to make some judgment or solve some problem (see Section 7.3). Heuristics are easy and quick; think of them as the basic procedures that are characteristic of Type 1. They can absolutely lead to reasonably good judgments and decisions in some situations (like choosing between chicken and fish for dinner). They are, however, far from foolproof. There are, in fact, quite a lot of situations in which heuristics can lead us to make incorrect judgments, and in many cases the decisions based on those judgments can have serious consequences.

Let us return to the activity that begins this section. You were asked to judge the likelihood (or frequency) of certain events and risks. You were free to come up with your own evidence (or statements) to make these judgments. This is where a heuristic crops up. As a judgment shortcut, we tend to generate specific examples of those very events to help us decide their likelihood or frequency. For example, if we are asked to judge how common, frequent, or likely a particular type of cancer is, many of our statements would be examples of specific cancer cases:

Statement #1: Andy Kaufman (comedian) had lung cancer.

Statement #2: Colin Powell (US Secretary of State) had prostate cancer.

Statement #3: Bob Marley (musician) had skin and brain cancer

Statement #4: Sandra Day O’Connor (Supreme Court Justice) had breast cancer.

Statement #5: Fred Rogers (children’s entertainer) had stomach cancer.

Statement #6: Robin Roberts (news anchor) had breast cancer.

Statement #7: Bette Davis (actress) had breast cancer.

Judgment: Breast cancer is the most common type.

Your own experience or memory may also tell you that breast cancer is the most common type. But it is not (although it is common). Actually, skin cancer is the most common type in the US. We make the same types of misjudgments all the time because we do not generate the examples or evidence according to their actual frequencies or probabilities. Instead, we have a tendency (or bias) to search for the examples in memory; if they are easy to retrieve, we assume that they are common. To rephrase this in the language of the heuristic, events seem more likely to the extent that they are available to memory. This bias has been termed the  availability heuristic   (Tversky and Kahneman, 1974).

The fact that we use the availability heuristic does not automatically mean that our judgment is wrong. The reason we use heuristics in the first place is that they work fairly well in many cases (and, of course, that they are easy to use). So, the easiest examples to think of sometimes really are the most common ones. Is it more likely that a member of the U.S. Senate is a man or a woman? Most people have a much easier time generating examples of male senators. And as it turns out, the U.S. Senate has many more men than women (74 to 26 in 2020). In this case, then, the availability heuristic would lead you to make the correct judgment; it is far more likely that a senator would be a man.

In many other cases, however, the availability heuristic will lead us astray. This is because events can be memorable for many reasons other than their frequency. Section 5.2, Encoding Meaning, suggested that one good way to encode the meaning of some information is to form a mental image of it. Thus, information that has been pictured mentally will be more available to memory. Indeed, an event that is vivid and easily pictured will trick many people into supposing that type of event is more common than it actually is. Repetition of information will also make it more memorable. So, if the same event is described to you in a magazine, on the evening news, on a podcast that you listen to, and in your Facebook feed, it will be very available to memory. Again, the availability heuristic will cause you to misperceive the frequency of these types of events.

Most interestingly, information that is unusual is more memorable. Suppose we give you the following list of words to remember: box, flower, letter, platypus, oven, boat, newspaper, purse, drum, car. Very likely, the easiest word to remember would be platypus, the unusual one. The same thing occurs with memories of events. An event may be available to memory because it is unusual, yet the availability heuristic leads us to judge that the event is common. Did you catch that? In these cases, the availability heuristic makes us think the exact opposite of the true frequency. We end up thinking something is common because it is unusual (and therefore memorable). Yikes.

The misapplication of the availability heuristic sometimes has unfortunate results. For example, if you went to K-12 school in the US over the past 10 years, it is extremely likely that you have participated in lockdown and active shooter drills. Of course, everyone is trying to prevent the tragedy of another school shooting. And believe us, we are not trying to minimize how terrible the tragedy is. But the truth of the matter is, school shootings are extremely rare. Because the federal government does not keep a database of school shootings, the Washington Post has maintained its own running tally. Between 1999 and January 2020 (the date of the most recent school shooting with a death in the US as of the time this paragraph was written), the Post reported a total of 254 people died in school shootings in the US. Not 254 per year, 254 total. That is an average of 12 per year. Of course, that is 254 people who should not have died (particularly because many were children), but in a country with approximately 60,000,000 students and teachers, this is a very small risk.

But many students and teachers are terrified that they will be victims of school shootings because of the availability heuristic. It is so easy to think of examples (they are very available to memory) that people believe the event is very common. It is not. And there is a downside to this. We happen to believe that there is an enormous gun violence problem in the United States. According to the Centers for Disease Control and Prevention, there were 39,773 firearm deaths in the US in 2017; 60% of those deaths were suicides. Fifteen of the deaths were in school shootings, according to the Post. When people pay attention to the school shooting risk (low), they often fail to notice the much larger risk.

And examples like this are by no means unique. The authors of this book have been teaching psychology since the 1990’s. We have been able to make the exact same arguments about the misapplication of the availability heuristic and keep them current by simply swapping out for the “fear of the day.” In the 1990’s it was children being kidnapped by strangers (it was known as “stranger danger”) despite the fact that kidnappings accounted for only 2% of the violent crimes committed against children, and only 24% of kidnappings were committed by strangers (US Department of Justice, 2007). This fear overlapped with the fear of terrorism that gripped the country after the 2001 terrorist attacks on the World Trade Center and US Pentagon and still plagues the population of the US somewhat in 2020. After a well-publicized, sensational act of violence, people are extremely likely to increase their estimates of the chances that they, too, will be victims of terror. Think about the reality, however. In October of 2001, a terrorist mailed anthrax spores to members of the US government and a number of media companies. A total of five people died as a result of this attack. The nation was nearly paralyzed by the fear of dying from the attack; in reality the probability of an individual person dying was 0.00000002.

The availability heuristic can lead you to make incorrect judgments in a school setting as well. For example, suppose you are trying to decide if you should take a class from a particular math professor. You might try to make a judgment of how good a teacher she is by recalling instances of friends and acquaintances making comments about her teaching skill. You may have some examples suggesting that she is a poor teacher readily available to memory, so on the basis of the availability heuristic you judge her a poor teacher and decide to take the class from someone else. What if, however, the instances you recalled were all from the same person, and this person happens to be a very colorful storyteller? The ease of remembering the instances might not indicate that the professor is a poor teacher after all.

Although the availability heuristic is obviously important, it is not the only judgment heuristic we use. Amos Tversky and Daniel Kahneman examined the role of heuristics in inductive reasoning in a long series of studies. Kahneman received a Nobel Prize in Economics for this research in 2002, and Tversky would have certainly received one as well if he had not died of melanoma at age 59 in 1996 (Nobel Prizes are not awarded posthumously). Kahneman and Tversky demonstrated repeatedly that people do not reason in ways that are consistent with the laws of probability. They identified several heuristic strategies that people use instead to make judgments about likelihood. The importance of this work for economics (and the reason that Kahneman was awarded the Nobel Prize) is that earlier economic theories had assumed that people do make judgments rationally, that is, in agreement with the laws of probability.

Another common heuristic that people use for making judgments is the  representativeness heuristic (Kahneman & Tversky, 1973). Suppose we describe a person to you. He is quiet and shy, has an unassuming personality, and likes to work with numbers. Is this person more likely to be an accountant or an attorney? If you said accountant, you were probably using the representativeness heuristic. Our imaginary person is judged likely to be an accountant because he resembles, or is representative of the concept of, an accountant. When research participants are asked to make judgments such as these, the only thing that seems to matter is the representativeness of the description. For example, if told that the person described is in a room that contains 70 attorneys and 30 accountants, participants will still assume that he is an accountant.
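Bayes’ rule shows how much the base rates should matter. In the toy calculation below, the 30/70 split comes from the example above, but the two likelihoods (0.9 and 0.2) are numbers we made up for illustration:

```python
# Posterior probability that the quiet, number-loving person is an
# accountant, given the room's base rates and our assumed likelihoods.
p_accountant = 0.30            # base rate: 30 accountants in the room
p_attorney = 0.70              # base rate: 70 attorneys in the room
p_desc_if_accountant = 0.90    # assumed: description fits most accountants
p_desc_if_attorney = 0.20     # assumed: description fits few attorneys

numerator = p_desc_if_accountant * p_accountant
posterior = numerator / (numerator + p_desc_if_attorney * p_attorney)
print(round(posterior, 2))  # 0.66
```

Even with likelihoods that strongly favor the accountant reading, the low base rate keeps the probability around two-thirds, a far cry from the near-certainty that the representativeness heuristic invites.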

inductive reasoning :  a type of reasoning in which we make judgments about likelihood from sets of evidence

inductively strong argument :  an inductive argument in which the beginning statements lead to a conclusion that is probably true

heuristic :  a shortcut strategy that we use to make judgments and solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

availability heuristic :  judging the frequency or likelihood of some event type according to how easily examples of the event can be called to mind (i.e., how available they are to memory)

representativeness heuristic:   judging the likelihood that something is a member of a category on the basis of how much it resembles a typical category member (i.e., how representative it is of the category)

Type 1 thinking : fast, automatic, and emotional thinking.

Type 2 thinking : slow, effortful, and logical thinking.

  • What percentage of workplace homicides are co-worker violence?

Many people get these questions wrong. The answers are 10%; stairs; skin; 6%. How close were your answers? Explain how the availability heuristic might have led you to make the incorrect judgments.

  • Can you think of some other judgments that you have made (or beliefs that you have) that might have been influenced by the availability heuristic?

7.3 Problem Solving

  • Please take a few minutes to list a number of problems that you are facing right now.
  • Now write about a problem that you recently solved.
  • What is your definition of a problem?

Mary has a problem. Her daughter, ordinarily quite eager to please, appears to delight in being the last person to do anything. Whether getting ready for school, going to piano lessons or karate class, or even going out with her friends, she seems unwilling or unable to get ready on time. Other people have different kinds of problems. For example, many students work at jobs, have numerous family commitments, and are facing a course schedule full of difficult exams, assignments, papers, and speeches. How can they find enough time to devote to their studies and still fulfill their other obligations? Speaking of students and their problems: Show that a ball thrown vertically upward with initial velocity v0 takes twice as much time to return as to reach the highest point (from Spiegel, 1981).
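That last problem, by the way, falls to a short derivation using the standard constant-acceleration formulas (a textbook physics solution, included here only as an aside):

```latex
% Take upward as positive; the ball leaves with speed v_0 under gravity g.
v(t) = v_0 - g t \;\Rightarrow\; t_{\mathrm{up}} = \frac{v_0}{g}
\qquad
y(t) = v_0 t - \tfrac{1}{2} g t^2 = 0 \;\Rightarrow\; t_{\mathrm{return}} = \frac{2 v_0}{g} = 2\, t_{\mathrm{up}}
```

The complete round trip takes exactly twice the time needed to reach the highest point, which is what the problem asks you to show.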

These are three very different situations, but we have called them all problems. What makes them all the same, despite the differences? A psychologist might define a  problem   as a situation with an initial state, a goal state, and a set of possible intermediate states. Somewhat more meaningfully, we might consider a problem a situation in which you are here, in one state (e.g., daughter is always late), you want to be there, in another state (e.g., daughter is not always late), and there is no obvious way to get from here to there. Defined this way, each of the three situations we outlined can now be seen as an example of the same general concept, a problem. At this point, you might begin to wonder what is not a problem, given such a general definition. It seems that nearly every non-routine task we engage in could qualify as a problem. As long as you realize that problems are not necessarily bad (it can be quite fun and satisfying to rise to the challenge and solve a problem), this may be a useful way to think about it.

Can we identify a set of problem-solving skills that would apply to these very different kinds of situations? That task, in a nutshell, is a major goal of this section. Let us try to begin to make sense of the wide variety of ways that problems can be solved with an important observation: the process of solving problems can be divided into two key parts. First, people have to notice, comprehend, and represent the problem properly in their minds (called  problem representation ). Second, they have to apply some kind of solution strategy to the problem. Psychologists have studied both of these key parts of the process in detail.

When you first think about the problem-solving process, you might guess that most of our difficulties would occur because we are failing in the second step, the application of strategies. Although this can be a significant difficulty much of the time, the more important source of difficulty is probably problem representation. In short, we often fail to solve a problem because we are looking at it, or thinking about it, the wrong way.

problem :  a situation in which we are in an initial state, have a desired goal state, and there is a number of possible intermediate states (i.e., there is no obvious way to get from the initial to the goal state)

problem representation :  noticing, comprehending and forming a mental conception of a problem

Defining and Mentally Representing Problems in Order to Solve Them

So, the main obstacle to solving a problem is that we do not clearly understand exactly what the problem is. Recall the problem with Mary’s daughter always being late. One way to represent, or to think about, this problem is that she is being defiant. She refuses to get ready in time. This type of representation or definition suggests a particular type of solution. Another way to think about the problem, however, is to consider the possibility that she is simply being sidetracked by interesting diversions. This different conception of what the problem is (i.e., different representation) suggests a very different solution strategy. For example, if Mary defines the problem as defiance, she may be tempted to solve the problem using some kind of coercive tactics, that is, to assert her authority as her mother and force her to listen. On the other hand, if Mary defines the problem as distraction, she may try to solve it by simply removing the distracting objects.

As you might guess, when a problem is represented one way, the solution may seem very difficult, or even impossible. Seen another way, the solution might be very easy. For example, consider the following problem (from Nasar, 1998):

Two bicyclists start 20 miles apart and head toward each other, each going at a steady rate of 10 miles per hour. At the same time, a fly that travels at a steady 15 miles per hour starts from the front wheel of the southbound bicycle and flies to the front wheel of the northbound one, then turns around and flies to the front wheel of the southbound one again, and continues in this manner until he is crushed between the two front wheels. Question: what total distance did the fly cover?

Please take a few minutes to try to solve this problem.

Most people represent this problem as a question about a fly because, well, that is how the question is asked. The solution, using this representation, is to figure out how far the fly travels on the first leg of its journey, then add this total to how far it travels on the second leg of its journey (when it turns around and returns to the first bicycle), then continue to add the smaller distance from each leg of the journey until you converge on the correct answer. You would have to be quite skilled at math to solve this problem, and you would probably need some time and pencil and paper to do it.

If you consider a different representation, however, you can solve this problem in your head. Instead of thinking about it as a question about a fly, think about it as a question about the bicycles. They are 20 miles apart, and each is traveling 10 miles per hour. How long will it take for the bicycles to reach each other? Right, one hour. The fly is traveling 15 miles per hour; therefore, it will travel a total of 15 miles back and forth in the hour before the bicycles meet. Represented one way (as a problem about a fly), the problem is quite difficult. Represented another way (as a problem about two bicycles), it is easy. Changing your representation of a problem is sometimes the best—sometimes the only—way to solve it.
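The two representations can be checked against each other. This small simulation (our own sketch; the parameter names are invented) adds up the fly’s successive legs the hard way and converges on the same answer the bicycle representation gives instantly:

```python
def fly_distance(gap=20.0, bike_speed=10.0, fly_speed=15.0, legs=60):
    """Sum the fly's back-and-forth legs until the bicycles meet."""
    total = 0.0
    for _ in range(legs):
        # On each leg, the fly and the oncoming bicycle close the gap together.
        t = gap / (fly_speed + bike_speed)
        total += fly_speed * t
        # Meanwhile, both bicycles moved toward each other for time t.
        gap -= 2 * bike_speed * t
    return total

print(round(fly_distance(), 6))  # 15.0
```

Each leg shrinks the remaining gap by a constant factor, so the distances form a geometric series, 12 + 2.4 + 0.48 + …, which converges to 15 miles: 15 miles per hour for the one hour the bicycles take to meet.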

Unfortunately, however, changing a problem’s representation is not the easiest thing in the world to do. Often, problem solvers get stuck looking at a problem one way. This is called  fixation . Most people who represent the preceding problem as a problem about a fly probably do not pause to reconsider, and consequently change, their representation. A parent who thinks her daughter is being defiant is unlikely to consider the possibility that her behavior is far less purposeful.

Problem-solving fixation was examined by a group of German psychologists called Gestalt psychologists during the 1930’s and 1940’s. Karl Duncker, for example, discovered an important type of failure to take a different perspective called  functional fixedness . Imagine being a participant in one of his experiments. You are asked to figure out how to mount two candles on a door and are given an assortment of odds and ends, including a small empty cardboard box and some thumbtacks. Perhaps you have already figured out a solution: tack the box to the door so it forms a platform, then put the candles on top of the box. Most people are able to arrive at this solution. Imagine a slight variation of the procedure, however. What if, instead of being empty, the box had matches in it? Most people given this version of the problem do not arrive at the solution given above. Why? Because it seems to people that when the box contains matches, it already has a function; it is a matchbox. People are unlikely to consider a new function for an object that already has a function. This is functional fixedness.

Mental set is a type of fixation in which the problem solver gets stuck using the same solution strategy that has been successful in the past, even though the solution may no longer be useful. It is commonly seen when students do math problems for homework. Often, several problems in a row require the reapplication of the same solution strategy. Then, without warning, the next problem in the set requires a new strategy. Many students attempt to apply the formerly successful strategy on the new problem and therefore cannot come up with a correct answer.

The thing to remember is that you cannot solve a problem unless you correctly identify what it is to begin with (initial state) and what you want the end result to be (goal state). That may mean looking at the problem from a different angle and representing it in a new way. The correct representation does not guarantee a successful solution, but it certainly puts you on the right track.

A bit more optimistically, the Gestalt psychologists discovered what may be considered the opposite of fixation, namely insight. Sometimes the solution to a problem just seems to pop into your head. Wolfgang Köhler examined insight by posing many different problems to chimpanzees, principally problems pertaining to their acquisition of out-of-reach food. In one version, a banana was placed outside of a chimpanzee’s cage and a short stick inside the cage. The stick was too short to retrieve the banana, but was long enough to retrieve a longer stick also located outside of the cage. This second stick was long enough to retrieve the banana. After trying, and failing, to reach the banana with the shorter stick, the chimpanzee would try a couple of random-seeming attempts, react with some apparent frustration or anger, then suddenly rush to the longer stick, the correct solution fully realized at this point. This sudden appearance of the solution, observed many times with many different problems, was termed insight by Köhler.

Lest you think insight pertains only to chimpanzees, Karl Duncker demonstrated in the 1930s that children also solve problems through insight. More importantly, you have probably experienced insight yourself. Think back to a time when you were trying to solve a difficult problem. After struggling for a while, you gave up. Hours later, the solution just popped into your head, perhaps when you were taking a walk, eating dinner, or lying in bed.

fixation :  when a problem solver gets stuck looking at a problem a particular way and cannot change his or her representation of it (or his or her intended solution strategy)

functional fixedness :  a specific type of fixation in which a problem solver cannot think of a new use for an object that already has a function

mental set :  a specific type of fixation in which a problem solver gets stuck using the same solution strategy that has been successful in the past

insight :  a sudden realization of a solution to a problem

Solving Problems by Trial and Error

Correctly identifying the problem and your goal for a solution is a good start, but recall the psychologist’s definition of a problem: it includes a set of possible intermediate states. Viewed this way, a problem can be solved satisfactorily only if one can find a path through some of these intermediate states to the goal. Imagine a fairly routine problem, finding a new route to school when your ordinary route is blocked (by road construction, for example). At each intersection, you may turn left, turn right, or go straight. A satisfactory solution to the problem (of getting to school) is a sequence of selections at each intersection that allows you to wind up at school.

If you had all the time in the world to get to school, you might try choosing intermediate states randomly. At one corner you turn left, the next you go straight, then you go left again, then right, then right, then straight. Unfortunately, trial and error will not necessarily get you where you want to go, and even if it does, it is not the fastest way to get there. For example, when a friend of ours was in college, he got lost on the way to a concert and attempted to find the venue by choosing streets to turn onto randomly (this was long before the use of GPS). Amazingly enough, the strategy worked, although he did end up missing two out of the three bands who played that night.
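The route example can be sketched in a few lines of code. This is purely an illustration (the grid size, start, and goal are invented for the example), not a claim about how people actually search:

```python
import random

# A hypothetical 5x5 street grid: intersections are (x, y) points.
# Pure trial and error: at each intersection, pick a direction at
# random until you stumble onto the goal (or give up).
def random_route(start, goal, size=5, max_steps=1000):
    x, y = start
    steps = 0
    while (x, y) != goal and steps < max_steps:
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        nx, ny = x + dx, y + dy
        if 0 <= nx < size and 0 <= ny < size:  # stay on the map
            x, y = nx, ny
        steps += 1
    return steps if (x, y) == goal else None

random.seed(1)  # fixed seed so the wandering is reproducible
print(random_route((0, 0), (4, 4)))  # typically far more steps than the 8 actually needed
```

With enough steps the random walk usually reaches the goal, which mirrors the concert anecdote: trial and error can work, just not efficiently.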

Trial and error is not all bad, however. B.F. Skinner, a prominent behaviorist psychologist, suggested that people often behave randomly in order to see what effect the behavior has on the environment and what subsequent effect this environmental change has on them. This seems particularly true for the very young person. Picture a child filling a household’s fish tank with toilet paper, for example. To a child trying to develop a repertoire of creative problem-solving strategies, an odd and random behavior might be just the ticket. Eventually, the exasperated parent hopes, the child will discover that many of these random behaviors do not successfully solve problems; in fact, in many cases they create problems. Thus, one would expect a decrease in this random behavior as a child matures. You should realize, however, that the opposite extreme is equally counterproductive. If a child becomes too rigid, never trying anything unexpected and new, his or her problem-solving skills can become too limited.

Effective problem solving seems to call for a happy medium between using well-founded old strategies and breaking new ground. The individual who recognizes a situation in which an old problem-solving strategy would work best, and who can also recognize a situation in which a new, untested strategy is necessary, is halfway to success.

Solving Problems with Algorithms and Heuristics

For many problems there is a possible strategy available that will guarantee a correct solution. For example, think about math problems. Math lessons often consist of step-by-step procedures that can be used to solve the problems. If you apply the strategy without error, you are guaranteed to arrive at the correct solution to the problem. This approach is called using an  algorithm , a term that denotes the step-by-step procedure that guarantees a correct solution. Because algorithms are sometimes available and come with a guarantee, you might think that most people use them frequently. Unfortunately, however, they do not. As the experience of many students who have struggled through math classes can attest, algorithms can be extremely difficult to use, even when the problem solver knows which algorithm is supposed to work in solving the problem. In problems outside of math class, we often do not even know if an algorithm is available. It is probably fair to say, then, that algorithms are rarely used when people try to solve problems.

Because algorithms are so difficult to use, people often pass up the opportunity to guarantee a correct solution in favor of a strategy that is much easier to use and yields a reasonable chance of coming up with a correct solution. These strategies are called  problem solving heuristics . Similar to what you saw in section 6.2 with reasoning heuristics, a problem solving heuristic is a shortcut strategy that people use when trying to solve problems. It usually works pretty well, but does not guarantee a correct solution to the problem. For example, one problem solving heuristic might be “always move toward the goal” (so when trying to get to school when your regular route is blocked, you would always turn in the direction you think the school is). A heuristic that people might use when doing math homework is “use the same solution strategy that you just used for the previous problem.”

By the way, we hope these last two paragraphs feel familiar to you. They seem to parallel a distinction that you recently learned. Indeed, algorithms and problem-solving heuristics are another example of the distinction between Type 1 thinking and Type 2 thinking.
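To make the contrast concrete, here is a minimal sketch on an invented route-finding problem (the grid and the blocked streets are assumptions for illustration): breadth-first search plays the role of an algorithm, and "always move toward the goal" plays the role of a heuristic.

```python
from collections import deque

# Hypothetical 5x5 grid with a line of blocked streets (road construction).
# Contrast: breadth-first search as an algorithm (exhaustive, guaranteed to
# find a route if one exists) versus "always move toward the goal" as a
# heuristic (fast, but can get stuck behind the construction).
BLOCKED = {(1, 0), (1, 1), (1, 2), (1, 3)}

def neighbors(p, size=5):
    x, y = p
    for dx, dy in [(0, 1), (0, -1), (1, 0), (-1, 0)]:
        q = (x + dx, y + dy)
        if 0 <= q[0] < size and 0 <= q[1] < size and q not in BLOCKED:
            yield q

def bfs_reachable(start, goal):  # algorithm: check every intersection
    frontier, seen = deque([start]), {start}
    while frontier:
        p = frontier.popleft()
        if p == goal:
            return True
        for q in neighbors(p):
            if q not in seen:
                seen.add(q)
                frontier.append(q)
    return False

def greedy(start, goal, max_steps=25):  # heuristic: always move toward goal
    p, steps = start, 0
    while p != goal and steps < max_steps:
        options = list(neighbors(p))
        if not options:
            return False
        # choose the open neighbor closest to the goal (straight-line thinking)
        p = min(options, key=lambda q: abs(q[0] - goal[0]) + abs(q[1] - goal[1]))
        steps += 1
    return p == goal

print(bfs_reachable((0, 0), (4, 0)))  # True: the algorithm finds the detour
print(greedy((0, 0), (4, 0)))         # False: the heuristic shuttles back and forth
```

The heuristic works fine on most routes, which is exactly why people rely on it; it only fails when the straight-line direction leads into the construction, and it never notices that it should back up and go around.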

Although it is probably not worth describing a large number of specific heuristics, two observations about heuristics are worth mentioning. First, heuristics can be very general, or they can be very specific, pertaining only to a particular type of problem. For example, “always move toward the goal” is a general strategy that you can apply to countless problem situations. On the other hand, “when you are lost without a functioning GPS, pick the most expensive car you can see and follow it” is specific to the problem of being lost. Second, not all heuristics are equally useful. One heuristic that many students know is “when in doubt, choose c for a question on a multiple-choice exam.” This is a dreadful strategy because many instructors intentionally randomize the order of answer choices. Another test-taking heuristic, somewhat more useful, is “look for the answer to one question somewhere else on the exam.”

You really should pay attention to the application of heuristics to test taking. Imagine that while reviewing your answers for a multiple-choice exam before turning it in, you come across a question for which you originally thought the answer was c. Upon reflection, you now think that the answer might be b. Should you change the answer to b, or should you stick with your first impression? Most people will apply the heuristic strategy of “stick with your first impression.” What they do not realize, of course, is that this is a very poor strategy (Lilienfeld et al., 2009). Most of the errors on exams come on questions that were answered wrong originally and were not changed (so they remain wrong). There are many fewer errors where we change a correct answer to an incorrect answer. And, of course, sometimes we change an incorrect answer to a correct answer. In fact, research has shown that it is more common to change a wrong answer to a right answer than vice versa (Bruno, 2001).

The belief in this poor test-taking strategy (stick with your first impression) is based on the  confirmation bias   (Nickerson, 1998; Wason, 1960). You first saw the confirmation bias in Module 1, but because it is so important, we will repeat the information here. People have a bias, or tendency, to notice information that confirms what they already believe. Somebody at one time told you to stick with your first impression, so when you look at the results of an exam you have taken, you will tend to notice the cases that are consistent with that belief. That is, you will notice the cases in which you originally had an answer correct and changed it to the wrong answer. You tend not to notice the other two important (and more common) cases, changing an answer from wrong to right, and leaving a wrong answer unchanged.

Because heuristics by definition do not guarantee a correct solution to a problem, mistakes are bound to occur when we employ them. A poor choice of a specific heuristic will lead to an even higher likelihood of making an error.

algorithm :  a step-by-step procedure that guarantees a correct solution to a problem

problem solving heuristic :  a shortcut strategy that we use to solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

confirmation bias :  people’s tendency to notice information that confirms what they already believe

An Effective Problem-Solving Sequence

You may be left with a big question: If algorithms are hard to use and heuristics often don’t work, how am I supposed to solve problems? Robert Sternberg (1996), as part of his theory of what makes people successfully intelligent (Module 8), described a problem-solving sequence that has been shown to work rather well:

  • Identify the existence of a problem.  In school, problem identification is often easy; problems that you encounter in math classes, for example, are conveniently labeled as problems for you. Outside of school, however, realizing that you have a problem is a key difficulty that you must get past in order to begin solving it. You must be very sensitive to the symptoms that indicate a problem.
  • Define the problem.  Suppose you realize that you have been having many headaches recently. Very likely, you would identify this as a problem. If you define the problem as “headaches,” the solution would probably be to take aspirin or ibuprofen or some other anti-inflammatory medication. If the headaches keep returning, however, you have not really solved the problem—likely because you have mistaken a symptom for the problem itself. Instead, you must find the root cause of the headaches. Stress might be the real problem. For you to successfully solve many problems it may be necessary for you to overcome your fixations and represent the problems differently. One specific strategy that you might find useful is to try to define the problem from someone else’s perspective. How would your parents, spouse, significant other, doctor, etc. define the problem? Somewhere in these different perspectives may lurk the key definition that will allow you to find an easier and permanent solution.
  • Formulate strategy.  Now it is time to begin planning exactly how the problem will be solved. Is there an algorithm or heuristic available for you to use? Remember, heuristics by their very nature guarantee that occasionally you will not be able to solve the problem. One point to keep in mind is that you should look for long-range solutions, which are more likely to address the root cause of a problem than short-range solutions.
  • Represent and organize information.  Similar to the way that the problem itself can be defined, or represented in multiple ways, information within the problem is open to different interpretations. Suppose you are studying for a big exam. You have chapters from a textbook and from a supplemental reader, along with lecture notes that all need to be studied. How should you (represent and) organize these materials? Should you separate them by type of material (text versus reader versus lecture notes), or should you separate them by topic? To solve problems effectively, you must learn to find the most useful representation and organization of information.
  • Allocate resources.  This is perhaps the simplest principle of the problem-solving sequence, but it is extremely difficult for many people. First, you must decide whether time, money, skills, effort, goodwill, or some other resource would help to solve the problem. Then, you must make the hard choice of deciding which resources to use, realizing that you cannot devote maximum resources to every problem. Very often, the solution to a problem is simply to change how resources are allocated (for example, spending more time studying in order to improve grades).
  • Monitor and evaluate solutions.  Pay attention to the solution strategy while you are applying it. If it is not working, you may be able to select another strategy. Another fact you should realize about problem solving is that it never does end. Solving one problem frequently brings up new ones. Good monitoring and evaluation of your problem solutions can help you to anticipate and get a jump on solving the inevitable new problems that will arise.

Please note that this is an effective problem-solving sequence, not the effective problem-solving sequence. Just as you can become fixated and end up representing the problem incorrectly or trying an inefficient solution, you can become stuck applying the problem-solving sequence in an inflexible way. Clearly there are problem situations that can be solved without using these skills in this order.

Additionally, many real-world problems may require that you go back and redefine a problem several times as the situation changes (Sternberg et al. 2000). For example, consider the problem with Mary’s daughter one last time. At first, Mary did represent the problem as one of defiance. When her early strategy of pleading and threatening punishment was unsuccessful, Mary began to observe her daughter more carefully. She noticed that, indeed, her daughter’s attention would be drawn by an irresistible distraction or book. Armed with this re-representation of the problem, she began a new solution strategy: reminding her daughter every few minutes to stay on task, and telling her that if she was ready before it was time to leave, she could return to the book or other distraction at that time. Fortunately, this strategy was successful, so Mary did not have to go back and redefine the problem again.

Pick one or two of the problems that you listed when you first started studying this section and try to work out the steps of Sternberg’s problem solving sequence for each one.

Introduction to Psychology Copyright © 2020 by Ken Gray; Elizabeth Arnott-Hill; and Or'Shaundra Benson is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

The Process of Problem Solving

In a 2013 article published in the Journal of Cognitive Psychology, Ngar Yin Louis Lee (Chinese University of Hong Kong) and APS William James Fellow Philip N. Johnson-Laird (Princeton University) examined the ways people develop strategies to solve related problems. In a series of three experiments, the researchers asked participants to solve a series of matchstick problems.

In matchstick problems, participants are presented with an array of joined squares. Each square in the array is composed of separate pieces. Participants are asked to remove a certain number of pieces from the array while still maintaining a specific number of intact squares. Matchstick problems are considered to be fairly sophisticated, as there is generally more than one solution, several different tactics can be used to complete the task, and the types of tactics that are appropriate can change depending on the configuration of the array.

Louis Lee and Johnson-Laird began by examining what influences the tactics people use when they are first confronted with the matchstick problem. They found that initial problem-solving tactics were constrained by perceptual features of the array, with participants solving symmetrical problems and problems with salient solutions faster. Participants frequently used tactics that involved symmetry and salience even when other solutions that did not involve these features existed.

To examine how problem solving develops over time, the researchers had participants solve a series of matchstick problems while verbalizing their problem-solving thought process. The findings from this second experiment showed that people tend to go through two different stages when solving a series of problems.

People begin their problem-solving process in a generative manner during which they explore various tactics — some successful and some not. Then they use their experience to narrow down their choices of tactics, focusing on those that are the most successful. The point at which people begin to rely on this newfound tactical knowledge to create their strategic moves indicates a shift into a more evaluative stage of problem solving.

In the third and last experiment, participants completed a set of matchstick problems that could be solved using similar tactics and then solved several problems that required the use of novel tactics.  The researchers found that participants often had trouble leaving their set of successful tactics behind and shifting to new strategies.

From the three studies, the researchers concluded that when people tackle a problem, their initial moves may be constrained by perceptual components of the problem. As they try out different tactics, they home in on and settle on the ones that are most efficient; however, this acquired knowledge can in turn constrain solvers’ generation of moves, making it difficult to switch to new tactics when required.

These findings help expand our understanding of the role of reasoning and deduction in problem solving and of the processes involved in the shift from less to more effective problem-solving strategies.

Reference: Louis Lee, N. Y., & Johnson-Laird, P. N. (2013). Strategic changes in problem solving. Journal of Cognitive Psychology, 25, 165–173. doi:10.1080/20445911.2012.719021

Complex cognition: the science of human reasoning, problem-solving, and decision-making

  • Published: 23 March 2010
  • Volume 11, pages 99–102 (2010)

Markus Knauff & Ann G. Wolf

Climate change, globalization, peace policy, and financial market crises: we are often faced with very complex problems. In order to tackle these complex problems, the people responsible must first come to a mutual understanding. An additional challenge is that the parties involved typically have different (often conflicting) interests and attach different emotions and wishes to the problems. These factors certainly do not ease the quest for a solution.

Needless to say, the big problems of our time are not easy to solve. What is less clear, however, is what caused these problems in the first place. Conflicts of interest between social groups, the economic and social system, greed: one can think of many factors that may be responsible for the large-scale problems we currently confront.

The present “Special Corner: complex cognition” deals with questions in this regard that have often received little consideration. Under the headline “complex cognition”, we summarize mental activities such as thinking, reasoning, problem-solving, and decision-making that typically rely on the combination and interaction of more elementary processes such as perception, learning, memory, emotion, etc. (cf. Sternberg and Ben-Zeev 2001). However, even though complex cognition relies on these elementary functions, the scope of complex cognition research goes beyond the isolated analysis of such elementary mental processes. Two aspects are essential for “complex cognition”: the first is the interaction of different mental activities such as perception, memory, learning, reasoning, and emotion; the second takes into account the complexity of the situation with which an agent is confronted. Based on these two aspects, the term “complex cognition” can be defined in the following way:

Complex psychological processes: We talk about “complex cognition” when thinking, problem-solving, or decision-making draws on other cognitive processes such as “perception”, “working memory”, “long-term memory”, or “executive processes”, or when the cognitive processes are in close connection with other processes such as “emotion” and “motivation”. The complexity also results from the interaction of a multitude of processes that occur simultaneously or at different points in time and that can be realized in different cognitive and/or neuronal structures.

Complex conditions: We also talk about “complex cognition” when the conditions in which a person finds himself or herself, and in which conclusions need to be drawn, a problem needs to be solved, or decisions need to be made, are themselves complex. The complexity of the conditions or constraints can have different causes. The situation structure itself can be difficult to “see”, or the action alternatives difficult “to put into effect”. The conditions may comprise many different variables; these variables can exhibit a high level of interdependence and cross-connection; and the original conditions can change as time passes (e.g. Dörner and Wearing 1995; Osman 2010). It can also be the case that the problem is embedded in a larger social context and can be solved only under certain specifications (norms, data, legislation, culture, etc.), or that the problem can be solved only in interaction with other agents, be they other persons or technical systems.

Taken together, these two aspects yield the following view of what should be understood as “complex cognition”:

As “complex cognition” we define all mental processes that are used by individuals for deriving new information out of given information, with the intention to solve problems, make decisions, and plan actions. The crucial characteristic of “complex cognition” is that it takes place under complex conditions in which a multitude of cognitive processes interact with one another or with other noncognitive processes.

The “Special Corner: complex cognition” deals with complex cognition from many different perspectives. The typical questions of all contributions are: Does the design of the human mind provide the thinking skills necessary to solve the truly complex problems we are faced with? Where do the boundaries of our thinking skills lie? How do people arrive at conclusions? What makes a problem a complex problem? How can we improve our ability to solve problems effectively and make sound judgements?

It is surely too much to expect the Special Corner to answer these questions; if it were that easy, we would not still be searching for answers. It is, however, our intention with the current collection of articles to bring such questions into focus to a larger extent than has been done so far.

An important starting point is the fact that people’s ability to solve the most complex of problems and to ponder the most complex of issues is often immense; humankind would not otherwise be where it is now. Yet, on the other hand, it has become clearer in the past few years that people often drift away from what one would identify as “rational” (Kahneman 2003). People hardly ever adhere to what the norms of logic, the probability calculus, or mathematical decision theory prescribe. For example, most people (and organizations) typically accept more losses for a potential high gain than they would if they took the rules of probability theory into account. Similarly, they draw conclusions from received information in a way that does not follow the rules of logic. When people, for example, accept the rule “If it rains, then the street is wet”, they most often conclude that when the street is wet, it must have rained. That, however, is incorrect from a logical perspective: perhaps a cleaning car just drove by. In psychology, two main views are traditionally put forward to explain such deviations from normative guidelines. One scientific stream is interested in how the deviations from normative models can be explained (Evans 2005; Johnson-Laird 2008; Knauff 2007; Reason 1990). According to this line of research, the deviations are caused by the limitations of the human cognitive system. The other psychological stream disputes that the deviations should be regarded as mistakes at all (Gigerenzer 2008). On this view, the deviations have a high value, because they are adjusted to the information structure of the environment (Gigerenzer et al. 1999). They have probably developed during evolution because they could ensure survival better than, for example, strict adherence to formal logic (Hertwig and Herzog 2009).
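The rain example can be checked mechanically. A small sketch (purely illustrative) enumerates the truth table for “if rain, then wet” and shows the single counterexample to the inference from a wet street back to rain:

```python
from itertools import product

# Truth table for the rule "if it rains, then the street is wet".
# The reverse inference (wet street, therefore rain) fails exactly when
# the street is wet for some other reason.
counterexamples = [
    (rain, wet)
    for rain, wet in product([False, True], repeat=2)
    if ((not rain) or wet)   # the rule itself holds in this row
    and wet and not rain     # yet the street is wet without rain
]
print(counterexamples)  # [(False, True)]: e.g. a cleaning car just drove by
```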
We, the editors of the special corner, are very pleased that we can offer an impression of this debate with the contributions from Marewski, Gaissmaier, and Gigerenzer and the commentaries to this contribution from Evans and Over. Added to this is a reply from Marewski, Gaissmaier, and Gigerenzer to the commentary from Evans and Over.

Another topic in the area of complex cognition can best be illustrated by means of climate protection. To be successful in this area, the responsible actors have to consider a multitude of ecological, biological, geological, political, and economic factors; the basic conditions are constantly changing; and the intervention methods are not clear. Because the necessary information is not readily available, the person dealing with the problem is forced to obtain the relevant information from other sources. Furthermore, intervention in the complex variable structure of the climate can trigger processes whose impact was not intended. Finally, the system will not “wait” for the actors to intervene but will change by itself over time. The special corner is also concerned with thinking and problem-solving in such complex situations. The article by Funke gives an overview of the current state of research on this topic from the viewpoint of the author, covering several research areas that have received little international acknowledgement (but see, for example, Osman 2010).

Although most contributions to the Special Corner come from psychology, the contribution by Ragni and Löffler illustrates that computer science can add valuably to the understanding of complex cognition. In general, computer science, which investigates the computational processes central to all research approaches, can be placed within a “computational theory of cognition” framework. This is especially true for the development of computational theories of complex cognitive processes. In many modern knowledge domains, simulation and modelling have become a major part of the methods inventory: simulations help forecast the weather and climate change, govern traffic flow, and explain physical processes. Although modelling is well established in these areas, it has so far been applied only sparingly to human thinking (but see, e.g., Anderson 1990; Gray 2007). Yet precisely in the area of complex cognition, cognitive modelling offers empirical research an additional methodological route to describing and explaining complex cognitive processes. While the validity of psychological theories can be tested with empirical research, cognitive models, with their internal coherence, make it possible to test consistency and completeness (e.g. Schmid 2008). They also lead to new hypotheses that can in turn be tested experimentally. The contribution by Ragni and Löffler demonstrates the usefulness of simulation and modelling in psychology with an instructive example: finding the optimal route.

A further issue in the area of complex cognition is that many problems are solvable only under certain social conditions (norms, values, laws, culture) or only in interaction with other actors (cf. Beller 2008). The article on deontic reasoning by Beller addresses this topic. Deontic reasoning is thinking about whether actions are forbidden or allowed, obligatory or not obligatory. Beller proposes that social norms, which impose constraints on individual actions, constitute the fundamental concept of deontic thinking, and that people reason from such norms flexibly according to deontic core principles. The review paper shows how knowing what is allowed or forbidden in a given situation influences how people arrive at conclusions.

The article by Waldmann, Meder, von Sydow, and Hagmayer addresses the important topic of causal reasoning. More specifically, the authors explore the interaction between category induction and causal induction in causal model learning. The paper is a good example of how experimental work in psychology can combine research traditions that typically operate in relative isolation. It goes beyond a divide-and-conquer approach and shows that causal knowledge plays an important role in learning, categorization, perception, decision-making, problem-solving, and text comprehension. In each of these fields, separate theories have been developed to investigate the role of causal knowledge. The first author of the paper is internationally well known for his work on the role of causality in other cognitive functions, in particular categorization and learning (e.g. Lagnado et al. 2007; Waldmann et al. 1995). In a number of experimental studies, Waldmann and his colleagues have shown that, when learning about causal relations, people do not simply form associations between causes and effects but make use of abstract prior assumptions about the underlying causal structure and functional form (Waldmann 2007).

We, the guest editors, are very pleased to have the opportunity with this Special Corner to make the topic of “complex cognition” accessible to the interdisciplinary readership of Cognitive Processing. We predict a bright future for this topic. It is highly relevant to basic research in a multitude of disciplines, for example psychology, computer science, and neuroscience, and it forms a good foundation for interdisciplinary cooperation.

A further important reason for the positive development of the area is that its relevance goes beyond fundamental research. Its results can, for example, also contribute to a better understanding of the possibilities and limits of human thinking, problem-solving, and decision-making in politics, corporations, and the economy. In the long term, it might even yield practical guidance on how to avoid “mistakes” and help us better understand the global challenges of our time: climate change, globalization, financial market crises, and so on.

We thank all the authors for their insightful and inspiring contributions, the many reviewers for their help, the editor-in-chief, Marta Olivetti Belardinelli, for giving us the opportunity to address this topic, and the editorial manager, Thomas Hünefeldt, for his support in accomplishing the Special Corner. We wish the readers of the Special Corner much pleasure in reading the contributions!

Anderson JR (1990) The adaptive character of thought. Erlbaum, Hillsdale


Beller S (2008) Deontic norms, deontic reasoning, and deontic conditionals. Think Reason 14:305–341


Dörner D, Wearing A (1995) Complex problem solving: toward a (computer-simulated) theory. In: Frensch PA, Funke J (eds) Complex problem solving: the European perspective. Lawrence Erlbaum Associates, Hillsdale, pp 65–99

Evans JSBT (2005) Deductive reasoning. In: Holyoak KJ, Morrison RG (eds) The Cambridge handbook of thinking and reasoning. Cambridge University Press, Cambridge, pp 169–184

Gigerenzer G (2008) Rationality for mortals: how people cope with uncertainty. Oxford University Press, Oxford

Gigerenzer G, Todd PM, The ABC Research Group (1999) Simple heuristics that make us smart. Oxford University Press, New York

Gray WD (2007) Integrated models of cognitive systems. Oxford University Press, Oxford

Hertwig R, Herzog SM (2009) Fast and frugal heuristics: tools of social rationality. Soc Cogn 27:661–698

Johnson-Laird PN (2008) Mental models and deductive reasoning. In: Rips L, Adler J (eds) Reasoning: studies in human inference and its foundations. Cambridge University Press, Cambridge, pp 206–222

Kahneman D (2003) A perspective on judgment and choice: mapping bounded rationality. Am Psychol 58:697–720


Knauff M (2007) How our brains reason logically. Topoi 26:19–36

Lagnado DA, Waldmann MR, Hagmayer Y, Sloman SA (2007) Beyond covariation: cues to causal structure. In: Gopnik A, Schulz L (eds) Causal learning: psychology, philosophy, and computation. Oxford University Press, Oxford, pp 154–172

Osman M (2010) Controlling uncertainty: a review of human behavior in complex dynamic environments. Psychol Bull 136(1):65–86

Reason J (1990) Human error. Cambridge University Press, Cambridge

Schmid U (2008) Cognition and AI. KI 08/1, Themenheft “Kognition’’, pp 5–7

Sternberg RJ, Ben-Zeev T (2001) Complex cognition: the psychology of human thought. Oxford University Press, New York

Waldmann MR (2007) Combining versus analyzing multiple causes: how domain assumptions and task context affect integration rules. Cogn Sci 31:233–256

Waldmann MR, Holyoak KJ, Fratianne A (1995) Causal models and the acquisition of category structure. J Exp Psychol Gen 124:181–206


Author information

Markus Knauff & Ann G. Wolf, University of Giessen, Giessen, Germany


Corresponding author

Correspondence to Markus Knauff.


About this article

Knauff, M., Wolf, A.G. Complex cognition: the science of human reasoning, problem-solving, and decision-making. Cogn Process 11, 99–102 (2010). https://doi.org/10.1007/s10339-010-0362-z


Received: 10 March 2010

Accepted: 10 March 2010

Published: 23 March 2010

Issue Date: May 2010


Thinking and Intelligence

Introduction to Thinking and Problem-Solving

What you’ll learn to do: describe cognition and problem-solving strategies.

A man sitting down in "The Thinker" pose.

Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. This is only one facet of the complex processes involved in cognition. Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

Learning Objectives

  • Distinguish between concepts and prototypes
  • Explain the difference between natural and artificial concepts
  • Describe problem solving strategies, including algorithms and heuristics
  • Explain some common roadblocks to effective problem solving


  • Modification, adaptation, and original content. Provided by : Lumen Learning. License : CC BY: Attribution
  • What Is Cognition?. Authored by : OpenStax College. Located at : https://openstax.org/books/psychology-2e/pages/7-1-what-is-cognition . License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction
  • A Thinking Man Image. Authored by : Wesley Nitsckie. Located at : https://www.flickr.com/photos/nitsckie/5507777269 . License : CC BY-SA: Attribution-ShareAlike


7.3 Problem-Solving

Learning Objectives

By the end of this section, you will be able to:

  • Describe problem solving strategies
  • Define algorithm and heuristic
  • Explain some common roadblocks to effective problem solving

   People face problems every day—usually, multiple problems throughout the day. Sometimes these problems are straightforward: To double a recipe for pizza dough, for example, all that is required is that each ingredient in the recipe be doubled. Sometimes, however, the problems we encounter are more complex. For example, say you have a work deadline, and you must mail a printed copy of a report to your supervisor by the end of the business day. The report is time-sensitive and must be sent overnight. You finished the report last night, but your printer will not work today. What should you do? First, you need to identify the problem and then apply a strategy for solving the problem.

The study of human and animal problem-solving processes has provided much insight into our conscious experience and has led to advancements in computer science and artificial intelligence. Much of cognitive science today is, in essence, the study of how we consciously and unconsciously make decisions and solve problems. For instance, when confronted with a large amount of information, how do we decide on the most efficient way to sort and analyze it in order to find what we are looking for, as in the visual search paradigms of cognitive psychology? When a piece of machinery stops working properly, how do we organize our efforts to identify the cause of the problem? How do we sequence the necessary procedures and focus attention on what is important so that we solve problems efficiently? In this section we will discuss some of these issues and examine processes related to human, animal, and computer problem solving.

PROBLEM-SOLVING STRATEGIES

When people are presented with a problem, whether it is a complex mathematical problem or a broken printer, how do they solve it? Before finding a solution, the problem must first be clearly identified. After that, one of many problem-solving strategies can be applied, hopefully resulting in a solution.

Problems themselves can be classified into two categories: ill-defined and well-defined (Schacter, 2009). Ill-defined problems lack clear goals, solution paths, or expected solutions, whereas well-defined problems have specific goals, clearly defined solution paths, and clear expected solutions. Problem solving often incorporates pragmatics (logical reasoning) and semantics (interpretation of the meanings behind the problem), and in many cases it also requires abstract thinking and creativity in order to find novel solutions. Within psychology, problem solving refers to a motivational drive for reaching a definite “goal” from a present situation that is either not moving toward that goal, is distant from it, or requires more complex logical analysis for finding a missing description of conditions or steps toward that goal. Processes related to problem solving include problem finding, also known as problem analysis; problem shaping, where the problem is organized; generating alternative strategies; implementing attempted solutions; and verifying the selected solution. Various methods of studying problem solving exist within the field of psychology, including introspection, behavior analysis and behaviorism, simulation, computer modeling, and experimentation.

A problem-solving strategy is a plan of action used to find a solution. Different strategies have different action plans associated with them (table below). For example, a well-known strategy is trial and error. The old adage, “If at first you don’t succeed, try, try again” describes trial and error. In terms of your broken printer, you could try checking the ink levels, and if that doesn’t work, you could check to make sure the paper tray isn’t jammed. Or maybe the printer isn’t actually connected to your laptop. When using trial and error, you would continue to try different solutions until you solved your problem. Although trial and error is not typically one of the most time-efficient strategies, it is a commonly used one.

   Another type of strategy is an algorithm. An algorithm is a problem-solving formula that provides you with step-by-step instructions used to achieve a desired outcome (Kahneman, 2011). You can think of an algorithm as a recipe with highly detailed instructions that produce the same result every time they are performed. Algorithms are used frequently in our everyday lives, especially in computer science. When you run a search on the Internet, search engines like Google use algorithms to decide which entries will appear first in your list of results. Facebook also uses algorithms to decide which posts to display on your newsfeed. Can you identify other situations in which algorithms are used?
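The defining property of an algorithm, that identical steps yield identical results every time, can be sketched in a few lines of Python. This is a minimal illustration using the recipe example from earlier in the section; the ingredient names and quantities are invented for illustration:

```python
# An algorithm as a fixed, step-by-step procedure: given the same
# recipe, it always produces the same doubled recipe.
def double_recipe(ingredients):
    """Return a new recipe with every quantity doubled."""
    return {name: qty * 2 for name, qty in ingredients.items()}

dough = {"flour_cups": 2, "water_cups": 0.75, "yeast_tsp": 1}
print(double_recipe(dough))  # {'flour_cups': 4, 'water_cups': 1.5, 'yeast_tsp': 2}
```

Unlike a heuristic, nothing here depends on judgment: every ingredient is processed the same way, so the outcome is guaranteed.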

A heuristic is another type of problem solving strategy. While an algorithm must be followed exactly to produce a correct result, a heuristic is a general problem-solving framework (Tversky & Kahneman, 1974). You can think of these as mental shortcuts that are used to solve problems. A “rule of thumb” is an example of a heuristic. Such a rule saves the person time and energy when making a decision, but despite its time-saving characteristics, it is not always the best method for making a rational decision. Different types of heuristics are used in different types of situations, but the impulse to use a heuristic occurs when one of five conditions is met (Pratkanis, 1989):

  • When one is faced with too much information
  • When the time to make a decision is limited
  • When the decision to be made is unimportant
  • When there is access to very little information to use in making the decision
  • When an appropriate heuristic happens to come to mind in the same moment

Working backwards is a useful heuristic in which you begin solving the problem by focusing on the end result. Consider this example: You live in Washington, D.C. and have been invited to a wedding at 4 PM on Saturday in Philadelphia. Knowing that Interstate 95 tends to back up any day of the week, you need to plan your route and time your departure accordingly. If you want to be at the wedding service by 3:30 PM, and it takes 2.5 hours to get to Philadelphia without traffic, what time should you leave your house? You use the working backwards heuristic to plan the events of your day on a regular basis, probably without even thinking about it.
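The wedding-trip calculation above is mechanical enough to express in code: start from the goal state and subtract each preceding step. In this sketch the calendar date and the 30-minute traffic cushion are assumptions added for illustration:

```python
from datetime import datetime, timedelta

# Working backwards: begin at the goal (arrive 3:30 PM) and subtract
# each earlier step to find the latest acceptable departure time.
arrival = datetime(2024, 6, 1, 15, 30)   # hypothetical Saturday, 3:30 PM
drive = timedelta(hours=2, minutes=30)   # travel time without traffic
buffer = timedelta(minutes=30)           # assumed cushion for I-95 congestion

departure = arrival - drive - buffer
print(departure.strftime("%I:%M %p"))    # 12:30 PM
```

Each subtraction mirrors one backward step in the heuristic: last step first, first step last.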

Another useful heuristic is the practice of accomplishing a large goal or task by breaking it into a series of smaller steps. Students often use this common method to complete a large research project or long essay for school. For example, students typically brainstorm, develop a thesis or main topic, research the chosen topic, organize their information into an outline, write a rough draft, revise and edit the rough draft, develop a final draft, organize the references list, and proofread their work before turning in the project. The large task becomes less overwhelming when it is broken down into a series of small steps.

Further problem solving strategies have been identified (listed below) that incorporate flexible and creative thinking in order to reach solutions efficiently.

Additional Problem Solving Strategies :

  • Abstraction – solving the problem within a model of the situation before applying it to reality.
  • Analogy – using a solution that solved a similar problem.
  • Brainstorming – collecting and analyzing a large number of candidate solutions, especially within a group of people, then combining and developing them until an optimal solution is reached.
  • Divide and conquer – breaking down a large, complex problem into smaller, more manageable problems.
  • Hypothesis testing – a method used in experimentation in which an assumption about what will happen in response to manipulating an independent variable is made, and the effects of the manipulation are analyzed and compared with the original hypothesis.
  • Lateral thinking – approaching problems indirectly and creatively by viewing the problem in a new and unusual light.
  • Means-ends analysis – repeatedly comparing the current state with the goal and choosing actions that reduce the difference, one smaller step at a time.
  • Method of focal objects – combining seemingly non-matching characteristics of different objects or procedures to create something new that brings you closer to the goal.
  • Morphological analysis – analyzing the outputs and interactions of the many pieces that together make up a whole system.
  • Proof – trying to prove that a problem cannot be solved; the point where the proof fails becomes the starting point for solving it.
  • Reduction – transforming the problem into a similar problem for which a solution exists.
  • Research – using existing knowledge or solutions to similar problems to solve the problem.
  • Root cause analysis – trying to identify the cause of the problem.

The strategies listed above offer a brief summary of the methods we use in working toward solutions and also demonstrate how the mind works when faced with barriers that prevent goals from being reached.
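As a concrete illustration of one entry in the list above, divide and conquer is exactly the strategy behind merge sort: a list too long to sort at a glance is split into halves that are easy to sort, and the solved halves are merged back together. A minimal Python sketch:

```python
def merge_sort(items):
    """Divide and conquer: split the problem, solve the halves, merge."""
    if len(items) <= 1:
        return items                      # trivially solved subproblem
    mid = len(items) // 2
    left = merge_sort(items[:mid])        # conquer the left half
    right = merge_sort(items[mid:])       # conquer the right half
    merged = []
    while left and right:                 # combine the two solved halves
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

The same decomposition pattern underlies how students break a research project into brainstorming, outlining, drafting, and revising.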

One example of means-ends analysis can be found in the Tower of Hanoi paradigm. This paradigm can also be modeled as a word problem, as demonstrated by the Missionary-Cannibal Problem:

Missionary-Cannibal Problem

Three missionaries and three cannibals are on one side of a river and need to cross to the other side. The only means of crossing is a boat, and the boat can only hold two people at a time. Your goal is to devise a set of moves that will transport all six people across the river, bearing in mind the following constraint: the cannibals can never outnumber the missionaries in any location where missionaries are present. Remember that someone will have to row the boat back across each time.

Hint: At one point in your solution, you will have to send more people back to the original side than you just sent to the destination.
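For readers who want to check their answer, the problem can also be solved mechanically with a breadth-first search over states, a common technique in problem-solving research. This sketch assumes the standard reading of the constraint (cannibals may not outnumber missionaries on a bank where missionaries are present):

```python
from collections import deque

def solve():
    # State: (missionaries on left bank, cannibals on left bank, boat on left?)
    start, goal = (3, 3, True), (0, 0, False)

    def safe(m, c):
        # Each bank is safe if it has no missionaries or they are not outnumbered.
        return (m == 0 or m >= c) and ((3 - m) == 0 or (3 - m) >= (3 - c))

    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (m, c, left), path = frontier.popleft()
        if (m, c, left) == goal:
            return path                       # shortest sequence of states
        for dm, dc in ((1, 0), (2, 0), (0, 1), (0, 2), (1, 1)):
            nm, nc = (m - dm, c - dc) if left else (m + dm, c + dc)
            if 0 <= nm <= 3 and 0 <= nc <= 3 and safe(nm, nc):
                state = (nm, nc, not left)
                if state not in seen:
                    seen.add(state)
                    frontier.append((state, path + [state]))

print(len(solve()) - 1)  # 11 crossings in the shortest solution
```

Breadth-first search guarantees the shortest solution, which for this puzzle takes eleven one-way crossings.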

The actual Tower of Hanoi problem consists of three rods sitting vertically on a base and a number of disks of different sizes that can slide onto any rod. The puzzle starts with the disks in a neat stack on one rod, in ascending order of size with the smallest at the top, making a conical shape. The objective of the puzzle is to move the entire stack to another rod while obeying the following rules:

  1. Only one disk can be moved at a time.
  2. Each move consists of taking the upper disk from one of the stacks and placing it on top of another stack or on an empty rod.
  3. No disk may be placed on top of a smaller disk.
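The rules above lend themselves to a classic means-ends analysis: to move a stack of n disks, first move the n-1 smaller disks out of the way, then move the largest disk, then re-stack the smaller disks on top. A minimal recursive sketch in Python (rod names are arbitrary labels):

```python
def hanoi(n, source, target, spare, moves=None):
    """Means-ends analysis: reduce the n-disk goal to two
    (n-1)-disk subgoals plus a single legal move."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)  # clear the way
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, target, source, moves)  # re-stack on top of it
    return moves

print(len(hanoi(3, "A", "C", "B")))  # 7 moves: the minimum, 2**3 - 1
```

For n disks the recursion always produces 2**n - 1 moves, the provable minimum, which is why the puzzle's difficulty grows so quickly with each added disk.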

Figure 7.02. Steps for solving the Tower of Hanoi in the minimum number of moves when there are 3 disks.

Figure 7.03. Graphical representation of nodes (circles) and moves (lines) of the Tower of Hanoi.

The Tower of Hanoi is frequently used in psychology to study problem solving and procedure analysis. A variation known as the Tower of London has been developed and has become an important tool in the neuropsychological diagnosis of executive function disorders and their treatment.

GESTALT PSYCHOLOGY AND PROBLEM SOLVING

As you may recall from the sensation and perception chapter, Gestalt psychology describes whole patterns, forms, and configurations of perception and cognition, such as closure, good continuation, and figure-ground. In addition to studying patterns of perception, Wolfgang Kohler, a German Gestalt psychologist, traveled to the Spanish island of Tenerife in order to study animal behavior and problem solving in anthropoid apes.

As an interesting side note to Kohler’s studies of chimpanzee problem solving, Dr. Ronald Ley, professor of psychology at the State University of New York, provides evidence in his book A Whisper of Espionage (1990) that while collecting the data on Tenerife in the Canary Islands between 1914 and 1920 for what would later become his book The Mentality of Apes (1925), Kohler was also an active spy for the German government, alerting Germany to ships sailing around the Canary Islands. Ley suggests that his investigations in England, Germany, and elsewhere in Europe confirm that Kohler served the German military by building, maintaining, and operating a concealed radio, a strategic outpost in the Canary Islands that could monitor naval activity approaching the North African coast.

While trapped on the island over the course of World War I, Kohler applied Gestalt principles to animal perception in order to understand how animals solve problems. He recognized that the apes on the island also perceive relations between stimuli and the environment in Gestalt patterns, understanding these patterns as wholes rather than as the pieces that make up a whole. Kohler based his theories of animal intelligence on the ability to understand relations between stimuli, and he spent much of his time on the island investigating what he described as insight, the sudden perception of useful or proper relations. In order to study insight in animals, Kohler would present problems to chimpanzees by hanging bananas or some other food so that it was suspended higher than the apes could reach. Within the room, Kohler would arrange a variety of boxes, sticks, or other tools that the chimpanzees could combine or organize in ways that would allow them to obtain the food (Kohler & Winter, 1925).

While observing the chimpanzees, Kohler noticed one chimp that was more efficient at solving problems than some of the others. The chimp, named Sultan, was able to use long poles to reach through bars and organize objects in specific patterns to obtain food or other desirables that were originally out of reach. In order to study insight in these chimps, Kohler would remove objects from the room to systematically make the food more difficult to obtain. As the story goes, after many of the objects Sultan was used to using were removed, he sat down and sulked for a while, and then suddenly got up and went over to two poles lying on the ground. Without hesitation, Sultan put one pole inside the end of the other, creating a longer pole that he could use to obtain the food: an ideal example of what Kohler described as insight. In another situation, Sultan discovered how to stand on a box to reach a banana suspended from the rafters, again illustrating his perception of relations and the importance of insight in problem solving.

Grande (another chimp in the group studied by Kohler) builds a three-box structure to reach the bananas, while Sultan watches from the ground.  Insight , sometimes referred to as an “Ah-ha” experience, was the term Kohler used for the sudden perception of useful relations among objects during problem solving (Kohler, 1927; Radvansky & Ashcraft, 2013).

SOLVING PUZZLES

   Problem-solving abilities can improve with practice. Many people challenge themselves every day with puzzles and other mental exercises to sharpen their problem-solving skills. Sudoku puzzles appear daily in most newspapers. Typically, a sudoku puzzle is a 9×9 grid. The simple sudoku below (see figure) is a 4×4 grid. To solve the puzzle, fill in the empty boxes with a single digit: 1, 2, 3, or 4. Here are the rules: The numbers must total 10 in each bolded box, each row, and each column; however, each digit can only appear once in a bolded box, row, and column. Time yourself as you solve this puzzle and compare your time with a classmate.

How long did it take you to solve this sudoku puzzle? (You can see the answer at the end of this section.)
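If you would like to check your answer mechanically, a small backtracking solver can work through the same kind of 4×4 grid. The starting grid below is an assumption reconstructed from the answer key at the end of this section; note that once each digit 1 through 4 appears exactly once in every row, column, and bolded box, each unit automatically totals 10:

```python
# Backtracking solver for a 4x4 sudoku; 0 marks an empty cell.
def valid(grid, r, c, d):
    """Can digit d go at (r, c) without repeating in its row, column, or box?"""
    if d in grid[r] or d in (grid[i][c] for i in range(4)):
        return False
    br, bc = 2 * (r // 2), 2 * (c // 2)  # top-left corner of the 2x2 box
    return all(grid[br + i][bc + j] != d for i in range(2) for j in range(2))

def solve(grid):
    """Fill empty cells one at a time, backtracking on dead ends."""
    for r in range(4):
        for c in range(4):
            if grid[r][c] == 0:
                for d in range(1, 5):
                    if valid(grid, r, c, d):
                        grid[r][c] = d
                        if solve(grid):
                            return True
                        grid[r][c] = 0       # undo and try the next digit
                return False                  # no digit fits: backtrack
    return True

puzzle = [[3, 0, 0, 2],   # assumed given (blue) digits
          [0, 4, 1, 0],
          [0, 3, 2, 0],
          [4, 0, 0, 1]]
solve(puzzle)
```

The solver mirrors trial and error with memory: it tries a digit, pursues the consequences, and retreats as soon as a rule is violated.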

   Here is another popular type of puzzle (figure below) that challenges your spatial reasoning skills. Connect all nine dots with four connecting straight lines without lifting your pencil from the paper:

Did you figure it out? (The answer is at the end of this section.) Once you understand how to crack this puzzle, you won’t forget.

Take a look at the “Puzzling Scales” logic puzzle below. Sam Loyd, a well-known puzzle master, created and refined countless puzzles throughout his lifetime (Cyclopedia of Puzzles, n.d.).

A puzzle involving a scale is shown. At the top of the figure it reads: “Sam Loyd’s Puzzling Scales.” The first row of the puzzle shows a balanced scale with 3 blocks and a top on the left and 12 marbles on the right. Below this row it reads: “Since the scales now balance.” The next row of the puzzle shows a balanced scale with just the top on the left, and 1 block and 8 marbles on the right. Below this row it reads: “And balance when arranged this way.” The third row shows an unbalanced scale with the top on the left side, which is much lower than the right side. The right side is empty. Below this row it reads: “Then how many marbles will it require to balance with that top?”

What steps did you take to solve this puzzle? You can read the solution at the end of this section.

PITFALLS TO PROBLEM SOLVING

Not all problems are successfully solved, however. What challenges stop us from successfully solving a problem? Albert Einstein once said, “Insanity is doing the same thing over and over again and expecting a different result.” Imagine a person in a room that has four doorways. One doorway that has always been open in the past is now locked. The person, accustomed to exiting the room by that particular doorway, keeps trying to get out through the same doorway even though the other three doorways are open. The person is stuck, but she just needs to go to another doorway instead of trying to get out through the locked one. A mental set is the tendency to persist in approaching a problem in a way that has worked in the past but is clearly not working now.

Functional fixedness is a type of mental set where you cannot perceive an object being used for something other than what it was designed for. During the Apollo 13 mission to the moon, NASA engineers at Mission Control had to overcome functional fixedness to save the lives of the astronauts aboard the spacecraft. An explosion in a module of the spacecraft damaged multiple systems. The astronauts were in danger of being poisoned by rising levels of carbon dioxide because of problems with the carbon dioxide filters. The engineers found a way for the astronauts to use spare plastic bags, tape, and air hoses to create a makeshift air filter, which saved the lives of the astronauts.

Researchers have investigated whether functional fixedness is affected by culture. In one experiment, individuals from the Shuar group in Ecuador were asked to use an object for a purpose other than that for which the object was originally intended. For example, the participants were told a story about a bear and a rabbit that were separated by a river and asked to select among various objects, including a spoon, a cup, erasers, and so on, to help the animals. The spoon was the only object long enough to span the imaginary river, but if the spoon was presented in a way that reflected its normal usage, it took participants longer to choose the spoon to solve the problem (German & Barrett, 2005). The researchers wanted to know if exposure to highly specialized tools, as occurs with individuals in industrialized nations, affects their ability to transcend functional fixedness. It was determined that functional fixedness is experienced in both industrialized and nonindustrialized cultures (German & Barrett, 2005).

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. For example, let’s say you and three friends wanted to rent a house and had a combined target budget of $1,600. The realtor shows you only very run-down houses for $1,600 and then shows you a very nice house for $2,000. Might you ask each person to pay more in rent to get the $2,000 home? Why would the realtor show you the run-down houses and the nice house? The realtor may be challenging your anchoring bias. An anchoring bias occurs when you focus on one piece of information when making a decision or solving a problem. In this case, you’re so focused on the amount of money you are willing to spend that you may not recognize what kinds of houses are available at that price point.

The confirmation bias is the tendency to focus on information that confirms your existing beliefs. For example, if you think that your professor is not very nice, you notice all of the instances of rude behavior exhibited by the professor while ignoring the countless pleasant interactions he is involved in on a daily basis. Hindsight bias leads you to believe that the event you just experienced was predictable, even though it really wasn’t. In other words, you knew all along that things would turn out the way they did. Representative bias describes a faulty way of thinking, in which you unintentionally stereotype someone or something; for example, you may assume that your professors spend their free time reading books and engaging in intellectual conversation, because the idea of them spending their time playing volleyball or visiting an amusement park does not fit in with your stereotypes of professors.

Finally, the availability heuristic is a heuristic in which you make a decision based on an example, information, or recent experience that is readily available to you, even though it may not be the best example to inform your decision. Biases tend to “preserve that which is already established—to maintain our preexisting knowledge, beliefs, attitudes, and hypotheses” (Aronson, 1995; Kahneman, 2011). These biases are summarized in the table below.

Were you able to determine how many marbles are needed to balance the scales in the figure below? You need nine. Were you able to solve the problems in the figures above? Here are the answers.

The first puzzle is a Sudoku grid of 16 squares (4 rows of 4 squares). Half of the numbers were supplied to start the puzzle and are colored blue, and half have been filled in as the puzzle’s solution and are colored red. The numbers in each row of the grid, left to right, are as follows. Row 1: blue 3, red 1, red 4, blue 2. Row 2: red 2, blue 4, blue 1, red 3. Row 3: red 1, blue 3, blue 2, red 4. Row 4: blue 4, red 2, red 3, blue 1.

The second puzzle consists of 9 dots arranged in 3 rows of 3 inside of a square. The solution, four straight lines made without lifting the pencil, is shown as a red line with arrows indicating the direction of movement. In order to solve the puzzle, the lines must extend beyond the borders of the box. The four connecting lines are drawn as follows. Line 1 begins at the top left dot, proceeds through the middle and right dots of the top row, and extends to the right beyond the border of the square. Line 2 extends from the end of line 1, through the right dot of the horizontally centered row, through the middle dot of the bottom row, and beyond the square’s border, ending in the space beneath the left dot of the bottom row. Line 3 extends from the end of line 2 upwards through the left dots of the bottom, middle, and top rows. Line 4 extends from the end of line 3 through the middle dot in the middle row and ends at the right dot of the bottom row.
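As a quick check of the Sudoku solution just described, the short Python sketch below (an illustration added here, not part of the original module) verifies that every row, column, and 2×2 box of the solved grid contains the digits 1 through 4 exactly once:

```python
# The solved 4x4 grid described in the text (blue and red entries combined).
grid = [
    [3, 1, 4, 2],
    [2, 4, 1, 3],
    [1, 3, 2, 4],
    [4, 2, 3, 1],
]

def is_valid(g):
    """Check rows, columns, and 2x2 boxes each hold the digits 1-4."""
    target = {1, 2, 3, 4}
    rows_ok = all(set(row) == target for row in g)
    cols_ok = all({g[r][c] for r in range(4)} == target for c in range(4))
    boxes_ok = all(
        {g[r + dr][c + dc] for dr in range(2) for dc in range(2)} == target
        for r in (0, 2) for c in (0, 2)
    )
    return rows_ok and cols_ok and boxes_ok

print(is_valid(grid))  # True
```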

   Many different strategies exist for solving problems. Typical strategies include trial and error, applying algorithms, and using heuristics. To solve a large, complicated problem, it often helps to break the problem into smaller steps that can be accomplished individually, leading to an overall solution. Roadblocks to problem solving include a mental set, functional fixedness, and various biases that can cloud decision-making skills.
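The "break a large problem into smaller steps" strategy can be illustrated with the Tower of Hanoi puzzle mentioned in the review questions below. The Python sketch here is an illustrative recursive solution (added for this section, not part of the original module): moving n disks reduces to two instances of moving n − 1 disks plus one direct move.

```python
# Tower of Hanoi: the large problem (move n disks from source to target)
# is broken into smaller subproblems (move n-1 disks), mirroring the
# "divide into smaller steps" strategy described above.
def hanoi(n, source, spare, target, moves=None):
    if moves is None:
        moves = []
    if n == 1:
        moves.append((source, target))          # base case: one direct move
    else:
        hanoi(n - 1, source, target, spare, moves)  # park n-1 disks on spare
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, source, target, moves)  # bring n-1 disks over
    return moves

moves = hanoi(3, "A", "B", "C")
print(len(moves))  # 7 moves, the minimum (2**3 - 1)
```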

References:

Openstax Psychology text by Kathryn Dumper, William Jenkins, Arlene Lacombe, Marilyn Lovett and Marion Perlmutter licensed under CC BY v4.0. https://openstax.org/details/books/psychology

Review Questions:

1. A specific formula for solving a problem is called ________.

a. an algorithm

b. a heuristic

c. a mental set

d. trial and error

2. Solving the Tower of Hanoi problem tends to utilize a ________ strategy of problem solving.

a. divide and conquer

b. means-end analysis

d. experiment

3. A mental shortcut in the form of a general problem-solving framework is called ________.

4. Which type of bias involves becoming fixated on a single trait of a problem?

a. anchoring bias

b. confirmation bias

c. representative bias

d. availability bias

5. Which type of bias involves relying on a false stereotype to make a decision?

6. Wolfgang Kohler analyzed behavior of chimpanzees by applying Gestalt principles to describe ________.

a. social adjustment

b. student loan payment options

c. emotional learning

d. insight learning

7. ________ is a type of mental set where you cannot perceive an object being used for something other than what it was designed for.

a. functional fixedness

c. working memory

Critical Thinking Questions:

1. What is functional fixedness and how can overcoming it help you solve problems?

2. How does an algorithm save you time and energy when solving a problem?

Personal Application Question:

1. Which type of bias do you recognize in your own decision-making processes? How has this bias affected how you’ve made decisions in the past, and how can you use your awareness of it to improve your decision-making skills in the future?

anchoring bias

availability heuristic

confirmation bias

functional fixedness

hindsight bias

problem-solving strategy

representative bias

trial and error

working backwards

Glossary:

algorithm:  problem-solving strategy characterized by a specific set of instructions

anchoring bias:  faulty heuristic in which you fixate on a single aspect of a problem to find a solution

availability heuristic:  faulty heuristic in which you make a decision based on information readily available to you

confirmation bias:  faulty heuristic in which you focus on information that confirms your beliefs

functional fixedness:  inability to see an object as useful for anything other than the use for which it was intended

heuristic:  mental shortcut that saves time when solving a problem

hindsight bias:  belief that the event just experienced was predictable, even though it really wasn’t

mental set:  continually using an old solution to a problem without results

problem-solving strategy:  method for solving problems

representative bias:  faulty heuristic in which you stereotype someone or something without a valid basis for your judgment

trial and error:  problem-solving strategy in which multiple solutions are attempted until the correct one is found

working backwards:  heuristic in which you begin to solve a problem by focusing on the end result


The Psychology of Thinking

  • By: John Paul Minda
  • Publisher: SAGE Publications Ltd
  • Publication year: 2015
  • Online pub date: February 06, 2019
  • Discipline: Psychology
  • Subject: Thinking, Reasoning & Problem-solving , Cognitive Psychology (general) , Consumer Behavior
  • DOI: https://doi.org/10.4135/9781473920262
  • Print ISBN: 9781446272473
  • Online ISBN: 9781473920262

Subject index

How do we define thinking? Is it simply memory, perception and motor activity or perhaps something more complex such as reasoning and decision making? This book argues that thinking is an intricate mix of all these things and a very specific coordination of cognitive resources. Divided into three key sections, there are chapters on the organization of human thought, general reasoning and thinking and behavioural outcomes of thinking. These three overarching themes provide a broad theoretical framework with which to explore wider issues in cognition and cognitive psychology and there are chapters on motivation and language plus a strong focus on problem solving, reasoning and decision making - all of which are central to a solid understanding of this field. The book also explores the cognitive processes behind perception and memory, how we might differentiate expertise from skilled, competent performance and the interaction between language, culture and thought.

Front Matter

  • Section 1: THE ORGANIZATION OF HUMAN THOUGHT
  • Chapter 1: The Psychology of Thinking
  • Chapter 2: The Psychology of Similarity
  • Chapter 3: Knowledge and Memory
  • Chapter 4: Concepts and Categories
  • Chapter 5: Language and Thought
  • Section 2: THINKING AND REASONING
  • Chapter 6: Inference and Induction
  • Chapter 7: Deductive Reasoning
  • Chapter 8: Context, Motivation, and Mood
  • Section 3: THINKING IN ACTION: DECISION-MAKING, PROBLEM-SOLVING, AND EXPERTISE
  • Chapter 9: Decision-Making
  • Chapter 10: Problem-Solving
  • Chapter 11: Expertise and Expert Thinking

Back Matter


Reasoning & Problem Solving

Judith Ellen Fan

Tobias Gerstenberg

Noah Goodman

Hyowon Gweon

Mark Lepper

Jay McClelland

Barbara Tversky

PhD Students

Adani Abutto

Sean Anderson

Catherine Garton

Satchel Grant

Jerome Han

Yuxuan (Effie) Li

Andrew (Joo Hun) Nam

Eric Neumann

Kate Petrova

Ben Prystawski

Sarah Wu

Peter Guandi Zhu

Research Staff

Kylie Yorke


The Oxford Handbook of Cognitive Psychology

40 Reasoning

Jonathan St. B. T. Evans, School of Psychology, University of Plymouth, Plymouth, UK

Published: 03 June 2013

This chapter covers the traditional study of deductive reasoning together with a change in the paradigm that has occurred in recent years. The traditional method involves asking logically untrained participants to assume the truth of premises and draw logically necessary conclusions. Contrary to initial expectations, most people perform quite poorly on these tasks, making many logical errors and exhibiting systematic biases. Moreover, reasoning turns out to be highly dependent on content and context. While this work produced a wealth of psychological findings and theories, authors have begun to change their methodological and theoretical approach in order to focus on how people reason with uncertain and personal beliefs. Many now consider Bayesianism to be a more appropriate normative and computational model of reasoning than classical logic. Dual-process theories of reasoning have adapted to this paradigm shift and feature prominently in the current literature.

The psychology of reasoning is usually taken to refer to the tradition of studying deduction. The standard paradigm for this field, which has been with us since the start of the 20th century, is based upon formal logic: Participants are tested for their ability, without formal training, to judge correctly the validity of arguments. There are, of course, other kinds of human inference to be studied, especially inductive and causal reasoning, which are the topics of Chapters 30 and 46 in this volume. Even the deductive reasoning tradition has, however, diversified in recent years to such an extent that many of the researchers in this field have moved away from the traditional paradigm. Many now think of reasoning as probabilistic or pragmatic and have adapted their methods of study as well as their theories to accommodate this change of perspective. Dual-process theories distinguishing fast, intuitive processes from slow, reflective ones have also become popular. The standard paradigm is still used, however, and often adopted when the intention is to engage participants in effortful and difficult reasoning.

The move away from the standard deduction paradigm means that the distinction between this tradition and that of studying inductive reasoning is no longer conceptually clear (but see Evans & Over, in press). However, the literatures for these traditions remain largely separate, and so I will review work in both the traditional and new paradigms of the psychology of reasoning within this chapter. The simplest way to present this to the reader is to describe first the traditional paradigm, with its aims, methods, theories, and findings. I will then show how the paradigm has been shifting in the past 15 years or so and discuss changes in methods and theories as well as new findings that have emerged as a result.

The Traditional Paradigm: Deductive Reasoning

The psychology of deductive reasoning has a relatively long history with studies published well before World War II (Wilkins, 1928 ; Woodworth & Sells, 1935 ). Indeed, these early studies established one of the key methods still in use—syllogistic reasoning—as well as introducing the idea that logical errors are caused by cognitive biases. In more recent times the field has been active since the 1960s (see Wason & Johnson-Laird, 1972 ) and has gradually expanded up to the present day. In common with the study of decision making, and in contrast with most other fields of cognitive psychology, the study of reasoning has long been associated with an interest in normative rationality. Indeed, the principal aims of the standard paradigm were (a) to establish by experimental observation the extent to which people are competent in logical reasoning and (b) to describe theoretically how naïve participants are able to perform deductive reasoning. As much logical error was observed, an additional objective became (c) to describe the cognitive biases that interfere with the process of logical reasoning.

The qualification to (b) that theories should focus on naïve participants is of key importance. It is standard practice to avoid using participants with training in logic, and few studies in the traditional paradigm provide instruction on key logical concepts. This reflects a long philosophical tradition in which it is proposed that logic is the foundation of rational thinking (Henle, 1962), as well as the once very influential claim of Jean Piaget that ordinary people develop sophisticated logical reasoning at the stage of formal operations (Inhelder & Piaget, 1958). Hence, the ability to reason logically should not depend upon any kind of training. The modern psychology of reasoning (from the 1960s onward) was thus founded upon both normative logicism (the belief that logic is the standard for rational thought) and the search for descriptive logicism (the belief that naïve participants reason using logic).

Highly influential in the development of the modern field was the British psychologist Peter Wason, who invented several ingenious tasks for studying reasoning and published extensively on the topic during the 1960s and 1970s. It was principally Wason who founded objective (c)—the study of cognitive biases in reasoning. As I have detailed elsewhere (Evans, 2002 ), Wason accepted normative logicism while strongly challenging the descriptive logicism of Piaget. In other words, he accepted that people should be logical in their reasoning, while declaring that they were frequently illogical. This suggests that people are irrational, as Wason indeed asserted. However, such arguments were strongly disputed by those convinced that people must be inherently rational and logical (Cohen, 1981 ; Henle, 1962 ). While the debate about rationality in reasoning has persisted to the present day, Wason’s contribution in establishing evidence of cognitive biases is equally enduring.

The principal method for studying deductive reasoning is to ask people to evaluate logical arguments. For example, a syllogism (a simple logical system devised by Aristotle) consists of two premises and a conclusion, which link together three terms, such as:

No A are B. Some C are B. Therefore, some A are not C. (1)

In many studies participants have been shown abstract syllogisms like this and asked to decide whether the conclusion necessarily follows from the premises. To my knowledge, however, the only study to have presented every possible syllogism for evaluation is that of Evans, Handley, Harper, and Johnson-Laird (1999). There are many possible syllogisms, as there are four types of statement used (all, no, some, some not), which can appear in either premise or in the conclusion, as well as four different arrangements of the terms A, B, and C. The great majority of these are logically invalid, meaning that their conclusions do not necessarily follow. However, the predominant tendency is for participants to say that conclusions do, in fact, follow, so that they endorse many fallacies (see Evans, Newstead, & Byrne, 1993). For example, argument 1 is a fallacy, but 63% of participants said that its conclusion necessarily followed from the premises in the study of Evans et al. (1999). Syllogistic arguments can also be presented with semantically rich content about which people have prior beliefs, for example:

No millionaires are hard workers. Some rich people are hard workers. Therefore, some millionaires are not rich people. (2)

Such content may have a dramatic effect on responding. Although argument 2 has the same logical form as 1, only 10% of participants endorsed its conclusion in the study of Evans, Barston, and Pollard ( 1983 ). With this particular content, the fallacy becomes transparent.
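The semantic notion of validity at work here (an argument is valid only if no situation makes its premises true and its conclusion false) can be checked mechanically. The following Python sketch, my own illustration rather than a model from the reasoning literature, brute-forces small set assignments of the terms A, B, and C:

```python
from itertools import product

def interpret(quantifier, x, y):
    """Truth of a categorical statement relating sets x and y."""
    if quantifier == "all":
        return x <= y            # All X are Y
    if quantifier == "no":
        return not (x & y)       # No X are Y
    if quantifier == "some":
        return bool(x & y)       # Some X are Y
    if quantifier == "some_not":
        return bool(x - y)       # Some X are not Y

def syllogism_is_valid(premise1, premise2, conclusion, universe_size=4):
    """Valid iff no assignment of A, B, C makes both premises true and the
    conclusion false; a small universe suffices for a counterexample search."""
    for bits in product(range(8), repeat=universe_size):
        terms = {
            "A": {i for i, b in enumerate(bits) if b & 1},
            "B": {i for i, b in enumerate(bits) if b & 2},
            "C": {i for i, b in enumerate(bits) if b & 4},
        }
        def holds(stmt):
            q, x, y = stmt
            return interpret(q, terms[x], terms[y])
        if holds(premise1) and holds(premise2) and not holds(conclusion):
            return False         # counterexample found: invalid
    return True

# Argument (1) is a fallacy: e.g., A empty, B = C = {0} defeats it.
print(syllogism_is_valid(("no", "A", "B"), ("some", "C", "B"),
                         ("some_not", "A", "C")))   # False
# The correct conclusion from the same premises is valid:
print(syllogism_is_valid(("no", "A", "B"), ("some", "C", "B"),
                         ("some_not", "C", "A")))   # True
```

Note that this sketch implements the standard logic assumed in the deduction paradigm, with no existential import for “all” and “no” statements.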

The syllogistic method can be varied by presenting only premises and asking participants to draw their own conclusions (e.g., Johnson-Laird & Bara, 1984 ); we can term this “the production method,” as opposed to the standard evaluation method. It turns out that this change can substantially affect the patterns of reasoning observed (Morley, Evans, & Handley, 2004 ). Evaluation and production methods can also be applied to other types of logical argument, such as those involved in relational and propositional reasoning. A commonly used method to study deduction is that of conditional inference (see Table 40.1 ). As with syllogisms, these inferences can be presented using the evaluation or production method and with abstract or realistic materials. In the table, I illustrate the evaluation method with abstract materials in the examples. In the deduction paradigm, standard logic is applied so that two of these inferences (MP, MT) are deemed valid and two others (DA, AC) invalid. It is commonly observed that MP is endorsed more frequently than MT, and that both fallacies are often endorsed. As with syllogistic reasoning, therefore, logical accuracy is quite poor for naïve participants.
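The logical status of the four conditional inference forms can be confirmed by exhaustive truth-table search over the material conditional of standard logic. This is a minimal Python sketch of my own for illustration; the abbreviations MP, DA, AC, and MT follow the text:

```python
from itertools import product

def valid(premises, conclusion):
    """An inference is valid iff every truth assignment that makes all
    premises true also makes the conclusion true (material conditional)."""
    for p, q in product([True, False], repeat=2):
        env = {
            "p": p, "q": q,
            "not p": not p, "not q": not q,
            "if p then q": (not p) or q,
        }
        if all(env[f] for f in premises) and not env[conclusion]:
            return False
    return True

forms = {
    "MP": (["if p then q", "p"], "q"),
    "DA": (["if p then q", "not p"], "not q"),
    "AC": (["if p then q", "q"], "p"),
    "MT": (["if p then q", "not q"], "not p"),
}
for name, (premises, conclusion) in forms.items():
    print(name, "valid" if valid(premises, conclusion) else "fallacy")
# MP valid, DA fallacy, AC fallacy, MT valid
```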

The other main method used to study deductive competence in this tradition is the Wason selection task (Wason, 1966), although this problem requires hypothesis testing as well as logical reasoning. In a typical abstract form, participants are told that a statement applies to four cards, each of which has a letter on one side and a number on the other side. The statement is as follows:

If a card has an A on one side, then it has a 3 on the other side.

The four cards are shown, but obviously with only the upper side of each visible.

The instruction is then to decide which cards and only which cards need to be turned over in order to decide whether the statement is true or false. The logically correct answer is to choose the A and 7, although only 10%–20% of participants manage to do so, making this an exceptionally hard problem. Typical responses are to choose A and 3, or A alone. Finding a confirming combination on any given card (A and 3) will not prove the statement true of the whole set. Only a falsifying combination (A and not-3) would be decisive. The only two cards that could reveal such a combination are A and 7 (not a 3).
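The falsification logic of the task can be stated compactly: a card is worth turning only if its hidden side could reveal an A paired with a number other than 3. The following is a small Python sketch of my own; the second letter card, shown here as “D”, is an assumption, since the text names only the A, 3, and 7 cards:

```python
LETTERS = {"A", "D"}   # "D" is an assumed filler card, not named in the text
NUMBERS = {"3", "7"}

def worth_turning(visible):
    """A card must be turned iff its hidden side could complete the
    falsifying combination: an A together with a number that is not 3."""
    if visible in LETTERS:
        # Hidden side is a number; only the A card could pair with a non-3.
        return visible == "A" and any(n != "3" for n in NUMBERS)
    # Hidden side is a letter; only a non-3 number could pair with an A.
    return visible != "3" and "A" in LETTERS

cards = ["A", "D", "3", "7"]
print([c for c in cards if worth_turning(c)])   # ['A', '7']
```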

The selection task is also studied in another form, which is both deontic (concerned with rules and regulations) and realistic. An example is the drinking age rule (Griggs & Cox, 1983), an adapted version of which is the following: You are a police officer watching people drinking in a bar, in order to check whether the following rule is being obeyed:

If a person is drinking beer, then that person must be over 18 years of age.

The following cards represent four drinkers and show on one side the beverage being drunk, and on the other side the drinker’s age:

Beer Coke 20 years of age 16 years of age

By contrast with the abstract Wason task, this one is very easy to solve. Most people choose to investigate the beer drinker and the underage drinker, who are indeed the only ones who could be breaking the rule.

There are many specific theories and models that have been proposed to explain performance on particular deductive reasoning tasks. In this section, I consider only theories framed at the level of the paradigm as a whole. During the 1970s and 1980s two major frameworks emerged to address aim (b): How can naïve participants engage in deductive reasoning? The first of these is known as the mental logic approach and has primarily been applied to conditional reasoning and other forms of propositional inference (Braine & O’Brien, 1998; Rips, 1994). Some methods used by logicians, such as the truth table analysis found in standard logic textbooks, are most unlikely to be the basis for human reasoning. Hence, philosophers have suggested that people might have “natural” logics based upon sets of inference rules, an idea first introduced to psychologists by Braine (1978). The various theories of this type have in common the idea that some forms of deductive reasoning are quick, direct, and largely error free, while others are slow, indirect, and error prone. The former occur where people have direct rules of inference stored in their minds, such as Modus Ponens, which is endorsed almost 100% of the time when presented using the standard method (Evans et al., 1993). The much greater difficulty of the valid Modus Tollens (MT) inference (endorsed about 60% of the time by university students) is attributed to the fact that there is no direct rule for it in the mental logic. Instead, these theorists suggest that MT requires difficult and indirect reductio ad absurdum reasoning. Consider the following:

If there is an A on the card, then there is a 4 on the card. There is not a 4 on the card. Therefore, there is not an A on the card.

According to mental logicians, people have no rule of inference in their minds corresponding to this. However, they can solve the problem by a thought experiment of the following kind. Imagine that there is an A on the card. What follows? There must be a 4. But there is not a 4. Hence, the supposition that there is an A must be false. So there cannot be an A on the card, or else we would have a contradiction. This more difficult reasoning is error prone, hence the lower endorsement rate of MT.

An important alternative to this account emerged when Johnson-Laird developed his mental model theory of deduction (Johnson-Laird, 1983; Chapter 41, this volume; Johnson-Laird & Byrne, 1991). Indeed, this theory dominated the deduction paradigm up until the mid-1990s and the development of the new paradigm, and it is still influential today. In this account people represent premises and conclusions as mental models. These models are (primarily) semantic, meaning that they represent possible situations in the real world. Hence, they are also truth verifiable, so that a particular mental model is either true or false in the actual world (for a theory based on epistemic mental models, which represent beliefs about the world, see Evans, 2007). The basic mechanism for reasoning with mental models is the following:

1. Formulate a mental model of the premises.

2. Check whether the conclusion is true in the model (or generate a conclusion consistent with the model).

3. Test the putative conclusion by searching for counterexamples to it; that is, seek a mental model that is consistent with the premises but not with the conclusion.

Put more simply, the theory proposed that deductive competence is founded not on a set of inference rules but on a simple semantic principle: An inference is valid if there is no counterexample to it. The theory was originally applied to syllogistic reasoning (Johnson-Laird & Bara, 1984) and later to conditional reasoning (see Johnson-Laird & Byrne, 2002, for the most recent version), as well as to a number of other deduction tasks. Unlike the mental logic theory, which had to add on a variety of explanations for logical errors and biases, the possibility of error was built in from the start in the mental models account. For example, it was assumed that reasoning would be limited by working memory capacity and that reasoning with multiple mental models would therefore be error prone. The theory also provided an alternative account of the MP/MT difference to that of the mental logic theory. It was assumed that a conditional, if p then q, was initially represented with only one explicit model: p & q. However, it was also implicit in the representation that there might be models consistent with not-p. In the case of MP, the minor premise p eliminates the implicit model, so q follows immediately. In the case of MT, the minor premise not-q leads to no conclusion unless the participant manages to “flesh out” the implicit mental model not-p & not-q. Reasoning with MT is hence more difficult.

In the 1980s, a major debate appeared between mental logic and mental model theorists with numerous claims and counterclaims published in the literature. As later commentators observed, however, the argument was entirely founded within the traditional, deduction paradigm (Evans & Over, 1996 ; Oaksford & Chater, 1995 ). Both sides agreed that deductive competence was the key to rational reasoning and that it was of paramount importance to understand the mechanisms by which people achieved it. However strong the disagreement, it was confined to the details of the method by which deductive reasoning was achieved. What this debate also obscured was the extent to which observed reasoning in these tasks was not logical at all: There was extensive evidence emerging that reasoning was both highly content dependent and subject to a whole range of cognitive biases.

Working in the tradition started by Peter Wason, I was more concerned to explain the biases observed in reasoning than its supposed logicality (aim (c)). I hence developed the heuristic-analytic (H-A) theory of reasoning. In its original form (Evans, 1989), the theory explained errors and biases as occurring at a heuristic stage that preceded an analytic stage of reasoning. Heuristic processing led to selective representations of problem content, both by selective attention to problem features and by retrieval of relevant knowledge from memory. The idea was that biases would occur if logically relevant information was selected out, or logically irrelevant information selected in, at the heuristic stage. The mechanism of analytic reasoning was unspecified, although I was more sympathetic to the mental model than to the mental logic theory. The H-A theory has recently been radically revised (Evans, 2006, 2007) and forms part of the new paradigm described below. A key difference is that cognitive biases are now attributed as much to analytic as to heuristic processes. This is because people are inclined to accept the first possibility they consider without systematic consideration of alternatives. One inspiration for this is the observation that the search-for-counterexamples stage specified in the mental model theory often fails to materialize (Evans et al., 1999).

The H-A theory falls within a broader category of dual-process theories and accounts that have been around in the psychology of reasoning since the 1970s (Wason & Evans, 1975). These arose originally within the standard paradigm as an attempt to distinguish the processes that underlie logical reasoning from those responsible for cognitive biases, often portrayed as a kind of conflict (e.g., Evans et al., 1983). I will, however, defer discussion of dual-process theory more generally to the later part of the chapter, when we look at the shifting paradigm of more recent research.

Study of content-rich versions of the Wason selection task became very popular during the 1980s and 1990s, involving several authors who did not usually work within the deduction paradigm. As a result, a number of theories were introduced that emphasized domain-specific and content-based mechanisms, in contrast with the traditional emphasis on general-purpose reasoning systems. Cheng and Holyoak (1985) introduced the idea that deontic contexts activate pragmatic reasoning schemas that have been acquired for dealing with situations involving permission and obligation. More controversially, evolutionary psychologists argued that people have innate, modular reasoning systems that apply in particular contexts such as social exchange or hazard avoidance, and they reported experiments on the selection task purporting to support these claims (Cosmides, 1989; Fiddick, Cosmides, & Tooby, 2000; Gigerenzer & Hug, 1992). An alternative view was that domain-general pragmatic procedures produce domain-specific effects when applied to particular content (Sperber, Cara, & Girotto, 1995). All of these accounts offered explanations of why some content-rich versions of the selection task are easy to solve, but they were mutually incompatible and led to several published exchanges between rival camps. However, the collective effect of such content-based theories was clearly a factor in undermining the traditions of the deduction paradigm.

The three main tasks studied in the traditional paradigm—syllogistic reasoning, conditional inference, and the Wason selection task—have yielded a wealth of interesting and important findings about human reasoning. None of this is denied by critics of the deduction paradigm: It is simply that these findings seem themselves to undermine its basis. First, there is the finding that logical error rates are very high indeed on these tasks, even though the main participant population tested (university students) is above average in general intelligence. If people were designed to be logical reasoners, how can this be so? The difficulty is exacerbated by the fact that reasoning is shown on these tasks to be highly content and context dependent. But again, logical reasoning requires one to focus on the form of arguments, disregarding their content. These findings also undermined the normative justification for the deduction paradigm. If people, in fact, reason in a pragmatic and probabilistic manner, as the data suggest, then perhaps logic is not as important for real-world reasoning as first thought.

As there are far too many findings to discuss here, I have summarized some of the major ones in Table 40.2. I will, however, discuss some highlights, focusing mostly on the effects of problem content and context. Consider first the finding that realistic versions of the selection task are much easier to solve than the original abstract form. It was originally assumed that this reflected improved logical reasoning (Wason & Johnson-Laird, 1972), as befits thinking within the traditional, logicist paradigm. In the light of the large amount of research on the selection task that has taken place since (see Evans & Over, 2004, for a recent review), this conclusion is impossible to sustain. Early doubts were raised by the observation that there was no transfer of performance from easy versions like the drinking age problem to subsequently performed abstract tasks (Cox & Griggs, 1982). If the former induces a logical insight, why should it then disappear? It soon became clear that pragmatic cues, based on prior experience, direct people’s attention to the relevant cards (e.g., Evans, 1996). Some authors have even suggested that some versions of the deontic selection task require no reasoning at all for their solution (Sperber & Girotto, 2002). However, studies of individual differences show a moderate relation of cognitive ability to success on the deontic selection task, suggesting that some kind of reasoning is required. In addition, only participants of very high IQ seem able to solve the abstract selection task (Newstead, Handley, Harley, Wright, & Farrelly, 2004; Stanovich & West, 1998).

The intense interest in the deontic (and content-rich) selection task between the mid-1980s and 1990s led to the discovery of some interesting new phenomena. In particular, a number of different authors showed that manipulating the scenarios that framed the decision in different ways had a marked effect on card choices. Notably, changing the perspective of the person required to check compliance with the rule could result in an opposite pattern of card choices (Cosmides, 1989; Gigerenzer & Hug, 1992; Manktelow & Over, 1991; Politzer & Nguyen-Xuan, 1992). While the results were similar, the explanations were not. Perspective shifts were variously attributed to pragmatic mechanisms, decision-theoretic choices, and innate Darwinian algorithms. To my knowledge, these arguments have never been satisfactorily resolved. However, findings like these certainly helped to undermine the traditional view that performance on reasoning tasks was based on the general logical form of the arguments presented.

There has also been a debate about whether most participants reason at all on the abstract version of the selection task, not including the 10% or so who manage to solve it. Evans and Over ( 2004 ) suggested that card choices were entirely caused by heuristics which prompted attention selectively to the cards usually chosen, a conclusion apparently supported by evidence that people attend only to the cards that they select (Ball, Lucas, Miles, & Gale, 2003 ; Evans, 1996 ). However, a recent reanalysis of these studies shows that while people attend selectively to cards which “match” the lexical content of the conditional statement, their subsequent decision to choose these cards is strongly affected by their logical status. Together with other evidence (Evans & Ball, 2010 ; Feeney & Handley, 2000 ; Handley, Feeney, & Harper, 2002 ; Lucas & Ball, 2005 ), it now seems clear that nonsolvers of the abstract selection task are indeed reasoning about hidden values on the cards. But this also leads us to another important conclusion: We cannot use the normative correctness of participants’ responding as diagnostic evidence of their engagement in reasoning. The detachment of the definition of analytic reasoning from logical performance is an important aspect of the new paradigm, discussed later.

Content-rich versions of the selection task are usually used to demonstrate facilitation of performance. Given the very low base rate of correct choices on the abstract version, it is hard to see that they could do otherwise. However, the effect of content in another major paradigm, syllogistic reasoning, appears to be quite the opposite—a major cause of bias in reasoning. The belief bias effect was first discovered by Wilkins ( 1928 ), but the modern history dates from the paper of Evans et al. ( 1983 ), who established the key phenomena with all relevant controls. The standard method involves presentation of four types of syllogism which differ in their validity as well as the believability of their conclusions: valid-believable, valid-unbelievable, invalid-believable, invalid-unbelievable. Despite clear instructions to reason logically, drawing only necessary conclusions, Evans et al. observed a large belief bias effect, with many more conclusions endorsed with believable than unbelievable conclusions. However, there was an equally large validity effect (valid conclusions preferred) and a belief by logic interaction: more belief bias on invalid arguments. These findings have been replicated many times since and interpreted within the mental model framework (Newstead, Pollard, Evans, & Allen, 1992 ; Oakhill, Johnson-Laird, & Garnham, 1989 ). Recently, the belief bias effect has been modeled in detail by Klauer, Musch, and Naumer ( 2000 ).

The role of content and context has also been studied within the conditional inference paradigm. In contrast with syllogistic reasoning, it is belief in the major premise (the conditional statement) rather than in the conclusion that seems to be the main factor. People are reluctant to draw inferences from conditional statements that they disbelieve, as many studies have shown. This conditional belief effect is, however, less marked when people are given strong deductive reasoning instructions than when they are simply asked to judge what follows (George, 1995; Stevenson & Over, 1995; see also Evans, Handley, Neilens, Bacon, & Over, 2010). This is one of many findings that led to a greater interest in dual-process accounts in the recent literature, as it seems that there may be an effortful form of reasoning that can be adopted which is more “logical” and less pragmatic in nature. Like belief bias, this conditional belief effect has been studied with some of the recently developed methodologies that I discuss in the second part of the chapter.

To summarize, research with the standard, deduction paradigm led to the discovery of many systematic logical errors and biases, both in abstract reasoning problems and those with semantically rich content. The use of this paradigm led to the labeling of any systematic deviation from the logically correct choice as a “bias,” for example, matching bias and belief bias. However, during the 1990s authors became increasingly uncomfortable with the idea that participants must be poor reasoners, given the real-world intelligence of the human species (e.g. Evans & Over, 1996 ; Oaksford & Chater, 1994 ) and hence questioned the assumption that illogical means irrational. This was one of the foundations for the major paradigm shift that I now discuss.

The Shifting Paradigm

Although some of my colleagues like to talk of the “new” paradigm in the psychology of reasoning, I think that is perhaps a little premature, at least in a strict Kuhnian sense. While there has been much movement away from both the theoretical and methodological constraints of the traditional deductive paradigm, there is as yet no really clear consensus on the alternative. In theoretical terms, there is now much interest in probabilistic and Bayesian models of reasoning, the role of pragmatics in human inference, and the case for dual-process and dual-system theories. The reason that it is difficult to call this a new paradigm is that there is also wide disagreement about the objectives to be followed. Nevertheless, I will follow the same structure as that adopted in the description of the traditional paradigm.

Many contemporary reasoning researchers have dispensed with the aims of the traditional deductive paradigm, which has received much criticism within the reasoning community (e.g., Evans, 2002; Oaksford & Chater, 1998). In particular, there is a lot less interest in whether people reason logically and a lot less belief that it matters. The major aims pursued in contemporary research seem to be the following: (a) to develop normative systems for human reasoning other than standard logic, and (b) to develop an accurate descriptive account of the actual processes that underlie human reasoning. In pursuit of (b), many have explored the dual-process approach. The reason that the paradigm is not yet integrated is that the authors who are vigorously pursuing aim (a), such as Oaksford and Chater (2007), seem less interested in developing a descriptive account, while those such as myself who are very interested in aim (b) do not necessarily see an important role for normative theory at all (Elqayam & Evans, 2011; Evans, 2007).

Reasoning researchers no longer rely simply on asking people to assume premises and draw necessary conclusions from logical arguments, or on traditional variants such as the Wason selection task. Where people are asked to draw inferences from premises, this may be done pragmatically, without the usual logical instructions and perhaps with an opportunity for people to rate the degree of believability of the conclusions rather than their validity. Participants may be asked to judge the persuasiveness of arguments or simply to assess the believability of conditional statements, and so on. In addition, a number of auxiliary methods have been developed in recent years to attempt to elucidate the nature of the underlying cognitive processes. In common with all other areas of cognitive psychology, neural imaging studies have been employed to trace the activity of different brain areas during reasoning. A number of methods have also been developed for the study of dual processes in reasoning, including instructional manipulations intended to induce more or less effortful reasoning, the use of speeded tasks and concurrent working memory loads to disrupt such reasoning, and the correlation of reasoning performance with individual differences in cognitive ability and cognitive style. I will illustrate the application of each of these methods by discussing sample studies in due course.

The 1990s was a period of major theoretical change in the psychology of reasoning. First, there was a revival of interest in dual-process theory, following the parallel publications of Evans and Over ( 1996 ) and Sloman ( 1996 ), and incorporation of dual-process theory into the major individual differences research program of Stanovich and West (Stanovich, 1999 , 2011 ; Stanovich & West, 2000b ). Such theories distinguish between two kinds of process: fast, automatic, and high capacity (type 1) and slow, effortful, and low capacity (type 2). Stronger forms of dual-process theory attribute these to two distinct cognitive systems that evolved separately (Epstein, 1994 ; Evans, 2010 ; Evans & Over, 1996 ; Reber, 1993 ; Stanovich, 1999 ), now usually known as System 1 and 2. Dual-process theories are not confined to the idea that there are two methods of reasoning, but they may also propose two forms of knowledge, two ways of deciding, and a basic distinction between explicit and implicit forms of social knowledge.

Within the standard paradigm, dual-process theories were originally developed to distinguish logical reasoning (type 2) from cognitive biases (type 1). (Attributing type 1 and type 2 processes to Systems 1 and 2, respectively, is popular but problematic; see Evans, 2008; Keren & Schul, 2009.) This way of thinking has, however, radically changed. It is still considered that slow, effortful type 2 reasoning is generally necessary for the solution of logical reasoning problems, but it is by no means sufficient. While some authors like to describe type 2 reasoning as “rule based,” it is not some form of mental logic, as contemporary theories attribute a much wider role to System 2 than simply solving deduction problems. It is more common now to think of type 2 reasoning as that which requires central working memory resources (Evans, 2008), explaining its slow, sequential, limited-capacity nature as well as its correlation with individual differences in cognitive ability. However, nothing in this definition implies that type 2 processes necessarily conform to logic or any other normative system. Type 2 processes can be the cause of biases (Evans, 2007; Stanovich, 2011), and type 1 processes can cue normatively correct answers when the pragmatic cues align with the logical structure (e.g., Stanovich & West, 1998). Nor is it any longer assumed that only type 1 processes are sensitive to context and prior belief while type 2 processes are abstract. It now appears that both kinds of processing can be influenced by beliefs, albeit in different ways (Verschueren, Schaeken, & d’Ydewalle, 2005; Weidenfeld, Oberauer, & Hornig, 2005).

The 1990s also saw the emergence of the rational analysis program of Oaksford and Chater ( 1998 ). They attacked logicism and argued for alternative accounts of several reasoning tasks that recast participants’ choices as rational. Examples include the proposal that card choices on the selection task are designed to maximize information about categories (Oaksford & Chater, 1994 ), the probability heuristics model of syllogistic reasoning (Chater & Oaksford, 1999 ), and the proposal that conditional inferences align with the conditional probability of the conclusion given the minor premise (Oaksford, Chater, & Larkin, 2000 ). Recently, these authors have integrated these various accounts within a general Bayesian framework (Oaksford & Chater, 2007 ). It is this program that primarily addresses objective (a) of the new paradigm, as they insist that it is important to replace logic with an alternative normative standard.

My colleagues and I, by contrast, have primarily pursued the objective of finding an accurate description of human reasoning, and one that integrates it more broadly with decision making. The new framework incorporates a form of dual-process theory and is known as hypothetical thinking theory or HTT (Evans, 2007 ; Evans, Over, & Handley, 2003 ). The general theory proposes that people conduct mental simulations to examine possibilities in both reasoning and decision making, constrained by three principles: (1) singularity (only one possibility or mental model is considered at a time), (2) relevance (the model considered is that which is most plausible or pragmatically cued by the context), and (3) satisficing (the current model is accepted as the basis for inference or action unless there is good reason to give it up). The process model for HTT combines type 1 and 2 processes within a category of dual-process models described as “default-interventionist” (Evans, 2008 ). That is, rapid type 1 processes propose default intuitive responses that may or may not be intervened upon by type 2 processing (similar processing assumptions are made by Stanovich, 2011 , and by Kahneman & Frederick, 2002 ).

A particular instantiation of HTT is the suppositional theory of conditionals (Evans & Over, 2004 ; Evans, Handley, & Over, 2003 , 2005 ), which is strongly indicative of the paradigm shift in the psychology of reasoning. Whereas the mental logic and mental model accounts of conditional inference involve modified versions of the standard propositional calculus of logic, the suppositional theory is inspired by probability logic and the proposal of non-truth functional conditionals by philosophical logicians (Adams, 1998 ; Edgington, 1995 ). However, our purpose was to develop a psychologically plausible theory of ordinary conditionals, rather than to propose an alternative normative system. Inspired by philosophical discussions of the Ramsey test (Edgington, 1995 ), we suggested that ordinary conditionals are used by speakers to provoke mental simulations or thought experiments in the listener. The suppositional conditional is not the material conditional of standard logic and cannot be expressed by the disjunction “either not-p or q.” Consider the following conditional:

If the US economy recovers, then Obama will be re-elected. (3)

If this were a material conditional, as in logic textbooks, then it would mean the same as:

Either the US economy does not recover or Obama will be re-elected. (4)

Statement 4 is clearly true if the economy does not recover or if Obama is re-elected. Surely, we would not say the same about 3! According to the suppositional theory of conditionals, deciding the truth of an ordinary conditional like 3 requires a mental simulation based on a supposition. We have to imagine a world in which the US economy does recover and use our knowledge of economics and politics to decide the likelihood that Obama will be re-elected. If that probability is high, we will find the conditional statement believable. If we think that there is no chance of the economy recovering in time, we would consider 3 to be irrelevant, rather than true.
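The contrast between the two readings can be made concrete in a short sketch. The material conditional is a truth function, true whenever the antecedent is false, while the suppositional reading evaluates the conditional probability P(q|p) within an imagined simulation. The joint probabilities below are invented purely for illustration; nothing in the chapter supplies these numbers.

```python
# Toy contrast between the material conditional of standard logic and a
# probabilistic (suppositional) reading of "if p then q".

def material_conditional(p: bool, q: bool) -> bool:
    """'if p then q' read as material implication: equivalent to (not p) or q."""
    return (not p) or q

# Material implication is true whenever the antecedent is false, which is
# what makes statement 4 true simply if the economy does not recover.
assert material_conditional(False, False) == True
assert material_conditional(False, True) == True
assert material_conditional(True, False) == False
assert material_conditional(True, True) == True

# The suppositional reading instead evaluates P(q | p). Here we posit a
# made-up joint distribution over (economy_recovers, obama_reelected).
joint = {
    (True, True): 0.30,    # recovers and re-elected
    (True, False): 0.10,   # recovers but not re-elected
    (False, True): 0.15,   # no recovery, re-elected anyway
    (False, False): 0.45,  # no recovery, not re-elected
}

def conditional_probability(joint, p_value=True):
    """P(q | p): probability of q restricted to the worlds where p holds."""
    p_mass = sum(prob for (p, _), prob in joint.items() if p == p_value)
    pq_mass = sum(prob for (p, q), prob in joint.items() if p == p_value and q)
    return pq_mass / p_mass

# Believability of statement 3 on the suppositional account:
# P(q|p) = 0.30 / 0.40 = 0.75; the two not-p rows play no role at all.
print(round(conditional_probability(joint), 2))  # 0.75
```

Note that, on this reading, the 0.60 of probability mass assigned to not-p worlds is simply irrelevant to the conditional, which mirrors the "defective truth table" discussed later in the chapter.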

One of the new methods to have been applied to the psychology of reasoning in recent years is neural imaging, using functional magnetic resonance imaging (fMRI) and other techniques. Anyone hoping to find an organ in the brain that functions as a mental logic across the various tasks would be disappointed by the findings. In fact, a wide variety of brain regions have been shown to be active in different experiments on reasoning, with no common area consistently present across studies (Goel, 2008 ). However, several findings are broadly supportive of the dual-process theory. It appears that the brain detects conflict when type 1 and 2 processing would lead to different answers, signaled by activation of the anterior cingulate cortex. Only when the default response is overridden by reasoning, however, does a different region, in the right lateral prefrontal cortex, “light up” (De Neys, Vartanian, & Goel, 2008 ; Goel & Dolan, 2003 ; Tsujii & Watanabe, 2009 ). It should be noted, however, that research using these methods is in its infancy as compared with the extensive application of neuroscience methods in other fields of cognitive psychology.

In recent years, interest in the selection task has waned. The study of conditional inference has been the dominant method, with particular interest in the pragmatic factors that influence reasoning with and interpretation of statements with realistic content. There has been particular interest in causal conditionals, in which p represents a cause and q an effect, as well as in a range of “utility” conditionals (Bonnefon, 2009 ), which include those used as advice, inducement, and persuasion. In the case of causal conditionals, there is established work (Cummins, Lubart, Alksnis, & Rist, 1991 ; Thompson, 1994 , 2000 ) showing that beliefs that participants hold about the relations between p and q strongly influence their tendency to endorse all four forms of conditional inference shown in Table 40.1 , even if strict deductive reasoning instructions are employed. One way that this can happen, consistent with the mental model theory, is that people retrieve counterexamples from memory which may lead them, by explicit reasoning, to inhibit an inference they might otherwise make. For example, the Modus Ponens inference might be resisted for a conditional such as “If the ignition key is turned, then the engine will start” because the participant recalls a case where she had a flat battery in her car (a type 2 process). On the other hand, the inference will probably be endorsed if the participant goes on associative strength or conditional probability (type 1) as engines do start most times that the key is turned. There is now evidence that beliefs can influence these inferences in each of these ways (Verschueren et al., 2005 ; Weidenfeld et al., 2005 ).
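The four conditional inference forms referred to as Table 40.1 (the table itself is not reproduced in this chunk) are Modus Ponens, Denial of the Antecedent, Affirmation of the Consequent, and Modus Tollens. Their logical status under the material conditional can be checked mechanically by enumerating truth assignments; this is a generic illustration of the standard validity test, not a reconstruction of any experiment:

```python
from itertools import product

def implies(p, q):
    """The material conditional: 'if p then q' is false only when p and not q."""
    return (not p) or q

def is_valid(minor, conclusion):
    """An inference is deductively valid if no assignment of truth values makes
    'if p then q' and the minor premise true while the conclusion is false."""
    return all(
        conclusion(p, q)
        for p, q in product([True, False], repeat=2)
        if implies(p, q) and minor(p, q)
    )

# The four forms, each given by its minor premise and its conclusion.
forms = {
    "Modus Ponens (p; therefore q)":              (lambda p, q: p,     lambda p, q: q),
    "Denial of the Antecedent (not-p; so not-q)": (lambda p, q: not p, lambda p, q: not q),
    "Affirmation of the Consequent (q; so p)":    (lambda p, q: q,     lambda p, q: p),
    "Modus Tollens (not-q; therefore not-p)":     (lambda p, q: not q, lambda p, q: not p),
}

for name, (minor, conclusion) in forms.items():
    status = "valid" if is_valid(minor, conclusion) else "fallacy"
    print(f"{name}: {status}")
```

The check confirms that only Modus Ponens and Modus Tollens are valid; the other two are fallacies, which is why belief-based endorsement of all four forms, as in the causal conditional studies above, cannot reflect purely deductive reasoning.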

When people reason in the type 2 manner using counterexamples, it appears that the validity of the inference is relevant. A recent finding is that participants of higher working memory capacity are more able to retrieve counterexamples to all four kinds of conditional inference than those of lower capacity, but that they inhibit these when they would result in the withholding of a valid, rather than a fallacious, inference (De Neys, Schaeken, & d’Ydewalle, 2005a , 2005b ). The reason is that higher ability participants make more effort to reason deductively in compliance with the instructions set (Newstead et al., 2004 ). However, the inhibition of beliefs in causal conditional reasoning by higher ability participants occurs only when they are given strict deductive rather than pragmatic reasoning instructions (Evans et al., 2010 ). This is consistent with contemporary dual-process theory (e.g., Stanovich, 2011 ): the ability to engage in effective reasoning in compliance with instructions requires both a sufficient level of cognitive ability and a disposition to reason in an analytic and effortful manner.

Recent studies of utility conditionals illustrate the diversification of methods within new paradigm studies of reasoning. Thompson, Evans, and Handley ( 2005 ) looked at conditional persuasions and dissuasions, also known as consequential conditionals. These are statements used in argumentation to persuade people of a point of view, rather than directly to cause an action. Such conditionals are often used by politicians, for example, “If the UK builds more nuclear power stations, then carbon emissions will be reduced.” The assumption here is that reducing carbon emissions is a good thing, owing to fears of global warming, and that therefore building nuclear power stations is a good thing, too. Opponents of nuclear power would draw our attention to other, negative consequences, such as the problems of disposing of radioactive waste. Thompson et al. studied such statements using a wide variety of methods, not just conditional inference. Participants were, for example, quite easily able to infer the motives of the speaker, and to judge the inferences they intended the listener to draw, even when their own actual inferences were quite different. In studies such as these, conditionals are studied as illocutionary acts of communication, intended to influence the beliefs and actions of the listener. This is far removed from older studies that tested people’s ability to understand the logic of conditional statements, based on material implication.

A major class of utility conditionals is inducements (promises, threats) and advice (tips, warnings). One finding is that people draw more conditional inferences, of all four kinds (Table 40.1 ), with inducements than with advice statements (Newstead, Ellis, Evans, & Dennis, 1997 ). This was later shown to be due to the fact that, with the former, participants believe the speaker to be more in control of the consequence (Evans & Twyman-Musgrove, 1998 ). From the viewpoint of the suppositional conditional theory, this means that P(q|p) is higher. So this is another case where the degree of belief in a conditional statement influences the extent to which people draw inferences from it. If instead we ask people to rate the extent to which conditional advice is useful and likely to be heeded, or the extent to which conditional inducements will be effective, we find again that P(q|p) is a significant factor (Evans, Neilens, Handley, & Over, 2008 ). However, such judgements are also influenced by the costs and benefits associated with both actions and consequences. For example, advice is judged to be good when the cost of p is low, the benefit of q is high, and the likelihood that p leads to q is high.

In direct support of the suppositional and other probabilistic theories of conditionals, recent studies have shown that most people do indeed judge the probability of a conditional statement, if p then q, on the basis of the conditional probability, P(q|p), whether the statements are abstract (Evans, Handley, & Over, 2003 ; Oberauer & Wilhelm, 2003 ) or realistic (Evans et al., 2010 ; Over, Hadjichristidis, Evans, Handley, & Sloman, 2007 ). However, the results are not quite straightforward. A sizeable minority instead assign the probability of the conditional statement as the conjunctive probability, P(p&q). It turns out that this is related to cognitive ability: participants of higher IQ are more suppositional. They are both more likely to give P(q|p) as the probability of if p then q, and also to assign defective truth tables in which not-p cases are described as irrelevant (Evans, Handley, Neilens, & Over, 2007 ; Evans et al., 2010 ; Oberauer, Geiger, Fischer, & Weidenfeld, 2007 ). Barrouillet, Gauffroy, and Lecas ( 2008 ) have linked these findings with developmental trends, in which children are progressively more likely to interpret conditionals in terms of both conditional probability and defective truth tables as they grow older. This apparently conflicts with the account of the defective truth table given by Johnson-Laird and Byrne ( 2002 ). They suggest that people overlook the relevance of not-p cases because they fail to flesh out these implicit possibilities. But this should be more likely to occur in those with lower rather than higher working memory capacity. Recently, Johnson-Laird ( 2010 ) has offered an explanation of the conditional probability finding and other results in the suppositional conditional research programme within his own framework. The suppositional theory has also been critiqued, on the basis of modeling conditional inference rates, by Oberauer ( 2006 ).
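The two response patterns can be illustrated with a small worked example in the style of the abstract frequency tasks used in this literature, where participants judge the probability that a conditional is true of a card drawn from a pack. The card frequencies below are invented for illustration, not taken from any of the cited studies:

```python
from fractions import Fraction

# A hypothetical pack of cards: counts of each (p, q) combination.
counts = {
    (True, True): 4,    # p and q
    (True, False): 1,   # p and not-q
    (False, True): 2,   # not-p and q
    (False, False): 3,  # not-p and not-q
}
total = sum(counts.values())  # 10 cards

# The conjunctive response: P(p & q) over the whole pack.
p_and_q = Fraction(counts[(True, True)], total)

# The suppositional response: P(q | p), restricting attention to the p cards
# and treating the not-p cards as irrelevant (the defective truth table).
p_cards = counts[(True, True)] + counts[(True, False)]
q_given_p = Fraction(counts[(True, True)], p_cards)

print(p_and_q)    # 2/5 -- the minority, conjunctive response
print(q_given_p)  # 4/5 -- the suppositional response
```

The gap between the two answers (2/5 versus 4/5 here) is entirely driven by how the not-p cards are treated, which is why the conjunctive pattern and the defective truth table go together as diagnostics of interpretation.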

While I have focused here on findings with conditional inference, it should be noted that the new paradigm has also freed reasoning researchers to work on any task which is informative about the cognitive processes that interest them. It is now common for reasoning researchers to make direct theoretical links with tasks on decision making and statistical inference, often studying these alongside traditional reasoning tasks (for examples, see De Neys & Glumicic, 2008 ; Evans, 2007 ; Stanovich & West, 2008b ). Some have branched out into the study of informal reasoning and argumentation, which further blurs the distinction between this tradition and the study of inductive inference (Hahn & Oaksford, 2007 ; Stanovich & West, 2007 , 2008a ).

As has been mentioned several times, contemporary researchers quite commonly use measures of individual differences, either as the main factor of interest or as an additional variable. Some researchers use SAT scores, some IQ tests, and others measures of working memory capacity, but it makes little difference, as all are highly correlated with one another (Colom, Rebollo, Palacios, Juan-Espinosa, & Kyllonen, 2004 ). The study of individual differences in cognitive ability and cognitive style was introduced by Stanovich and West (Stanovich, 1999 ; Stanovich & West, 2000a ) during the 1990s, initially with the conclusion that higher ability participants were more able to find normatively correct solutions to reasoning and decision problems. This perhaps led to the unfortunate conclusion that type 1 processes are responsible for biases and type 2 processes for abstract, normative reasoning. However, we know that both kinds of processing can cause biases (Evans, 2007 ), and it is now clear that higher cognitive ability fails to protect people from a range of cognitive biases (Stanovich & West, 2008a , 2008b ). There are also some paradoxical findings, as yet unexplained. For example, higher ability participants are better at Modus Ponens reasoning (considered rapid and automatic) and no better (or possibly worse) at solving Modus Tollens (a difficult and effortful inference) (Evans et al., 2007 ; Newstead et al., 2004 ). From the perspective of dual-process theory, this seems to be the wrong way around.

Conclusions

While the psychology of reasoning has its roots in the study of deduction, it is quite clear that the paradigm has been shifting significantly away from interest in logical reasoning in the past 15 years or so. What we inherit from the study of deductive reasoning is substantial, if paradoxical: an understanding that human reasoning is not, in fact, well adapted to dealing with abstract, logical problems. Only participants of high cognitive ability fare at all well with these, and even they are still subject to a range of cognitive biases. Studies within the traditional paradigm showed that reasoning was highly dependent upon content and context (see Table 40.2 ). While such findings were originally interpreted as showing that people’s efforts to reason deductively were subject to the biasing influence of knowledge, that perspective has shifted with the paradigm. The new perspective is that people have reasoning systems designed to deal with the pragmatically rich and uncertain nature of the real world, rather than with logical abstractions. Thus, reasoning on the traditional tasks is more “rational” than was previously assumed. At the same time, there is little sense in confining the study of human reasoning to tasks that require people to disregard their beliefs and draw necessary conclusions from arbitrary premises.

While dual-process theories developed during the period of traditional study, their interpretation has been updated with the paradigm shift. Originally, type 1 processing was seen as contextually dependent and the cause of cognitive biases, while type 2 processing was seen as abstract and logical, responsible for normative solutions. This is clearly wrong, as both kinds of processing may be influenced by context, either may lead to biases, and either may also lead to normatively correct solutions. The only sensible basis for a dual-process theory now seems to be that type 1 processes are rapid and automatic, leading to default judgements and inferences, while type 2 processes are slow and effortful, capable of intervening upon these default responses (Evans & Stanovich, in press). Most researchers implicitly or explicitly define type 2 processing by the involvement of working memory. From this assumption, the main methodologies for studying dual processes arise: type 2 processes are expected selectively to correlate with individual differences in cognitive ability and selectively to be disrupted by concurrent working memory loads or speeded tasks. There is also current interest in dispositional and situational factors that might encourage type 2 processing, including cognitive style, instructional set, and feelings of confidence (Shynkaruk & Thompson, 2006 ; Stanovich, 2011 ; Thompson, 2009 ; Thompson, Prowse Turner, & Pennycook, 2011 ).

While the shift away from the traditional deductive paradigm has liberated reasoning researchers in both their theories and their methods, it is not entirely clear what the new paradigm is that replaces it. While some authors think it is important to construct an alternative normative model to that of logic, in order to define human reasoning as right or wrong, others see this as unnecessary or undesirable. The new methods, which involve studying reasoning in a pragmatic and probabilistic way, as well as making direct judgements of belief, probability, or argument strength, clearly blur the boundaries with other distinct traditions, such as the study of inductive reasoning, argumentation, or persuasion. At the same time, things are not yet fully joined up, as cross-references to these alternative traditions are few and far between. However, explicit linkage with studies of statistical reasoning and decision making is now common and much to be welcomed.

Future Directions

While the paradigm for the psychology of reasoning has clearly shifted away from logic and the traditional deduction paradigm, the rationality debate that engulfed the field in the 1980s is still to be resolved. Do we need an alternative normative framework for the psychology of reasoning, or should we simply focus on descriptive accounts?

The description of reasoning in the new paradigm appears to blur the distinction between induction and deduction. Should inheritors of the deductive tradition make more effort to integrate their work with paradigms (such as category-based induction, rule learning, and causal inference) traditionally studied in separate literature on inductive reasoning?

Dual-process theories that are conceptually similar to those in the psychology of reasoning have been investigated for many years within social psychology, in largely separated literatures (see Evans, 2008 ). Such theories are also attracting interest from other cognitive scientists and philosophers of mind (see the recent collection of papers edited by Evans & Frankish, 2009 ). There is much work to be done in integrating these literatures and placing dual-process theory within a cognitive architecture for the mind as a whole.

Neural imaging and neuropsychological methods have been quite sparingly applied in the psychology of reasoning so far, compared with their extensive use in cognitive psychology generally. We can expect to see a large expansion in this enterprise in the near future.

Research within the past two decades has indicated that the ability to engage in effective analytic reasoning is related to cognitive ability, rational thinking dispositions, and a range of situational variables. The practical implications of these findings, for example, in education, are yet to be worked out in detail.

Acknowledgments

I would like to thank Shira Elqayam for a detailed critical reading of an earlier draft of this chapter.

Adams E. ( 1998 ). A primer of probability logic. Stanford, CA : CSLI Publications.

Ball L. J. , Lucas E. J. , Miles J. N. V. , & Gale A. G. ( 2003 ). Inspection times and the selection task: What do eye-movements reveal about relevance effects? Quarterly Journal of Experimental Psychology, 56A, 1053–1077.

Barrouillet P. , Gauffroy C. , & Lecas J. F. ( 2008 ). Mental models and the suppositional account of conditionals. Psychological Review, 115, 760–772.

Bonnefon J. F. ( 2009 ). A theory of utility conditionals: Paralogical reasoning from decision-theoretic leakage. Psychological Review, 116, 888–907.

Braine M. D. S. ( 1978 ). On the relation between the natural logic of reasoning and standard logic.   Psychological Review, 85, 1–21.

Braine M. D. S. , & O’Brien, D. P. (Eds.). ( 1998 ). Mental logic. Mahwah, NJ : Erlbaum.

Byrne R. M. J. ( 1989 ). Suppressing valid inferences with conditionals.   Cognition, 31, 61–83.

Byrne R. M. J. , Espino O. , & Santamaria C. ( 1999 ). Counterexamples and the suppression of inferences. Journal of Memory and Language, 40, 347–373.

Chater N., & Oaksford, M. ( 1999 ). The probability heuristics model of syllogistic reasoning. Cognitive Psychology, 38, 191–258.

Cheng P. W., & Holyoak, K. J. ( 1985 ). Pragmatic reasoning schemas.   Cognitive Psychology, 17, 391–416.

Cohen L. J. ( 1981 ). Can human irrationality be experimentally demonstrated? Behavioral and Brain Sciences, 4, 317–370.

Colom R. , Rebollo I. , Palacios A. , Juan-Espinosa M. , & Kyllonen P. C. ( 2004 ). Working memory is (almost) perfectly predicted by g.   Intelligence, 32, 277–296.

Cosmides L. ( 1989 ). The logic of social exchange: Has natural selection shaped how humans reason? Cognition, 31, 187–276.

Cox J. R., & Griggs, R. A. ( 1982 ). The effects of experience on performance in Wason’s selection task. Memory and Cognition, 10, 496–502.

Cummins D. D. , Lubart T. , Alksnis O. , & Rist R. ( 1991 ). Conditional reasoning and causation. Memory and Cognition, 19, 274–282.

De Neys W., & Glumicic, T. ( 2008 ). Conflict monitoring in dual process theories of thinking. Cognition, 106, 1248–1299.

De Neys W. , Schaeken W. , & d’Ydewalle G. ( 2005 a). Working memory and counterexample retrieval for causal conditionals. Thinking and Reasoning, 11, 123–150.

De Neys W. , Schaeken W. , & d’Ydewalle G. ( 2005 b). Working memory and everyday conditional reasoning: Retrieval and inhibition of stored counterexamples. Thinking and Reasoning, 11, 349–381.

De Neys W. , Vartanian O. , & Goel V. ( 2008 ). Smarter than we think: When our brains detect that we are biased. Psychological Science, 19, 483–489.

Edgington D. ( 1995 ). On conditionals.   Mind, 104, 235–329.

Elqayam, S. , & Evans, J. St. B. T. ( 2011 ). Subtracting ‘ought’ from ‘is’: Descriptivism versus normativism in the study of human thinking. Behavioral and Brain Sciences, 34, 233–290.

Epstein S. ( 1994 ). Integration of the cognitive and psychodynamic unconscious. American Psychologist, 49, 709–724.

Evans J. St. B. T. ( 1989 ). Bias in human reasoning: Causes and consequences. Hove, UK: Erlbaum.

Evans J. St. B. T. ( 1996 ). Deciding before you think: Relevance and reasoning in the selection task. British Journal of Psychology, 87, 223–240.

Evans J. St. B. T. ( 1998 ). Matching bias in conditional reasoning: Do we understand it after 25 years? Thinking and Reasoning, 4, 45–82.

Evans J. St. B. T. ( 2002 ). Logic and human reasoning: An assessment of the deduction paradigm. Psychological Bulletin, 128, 978–996.

Evans J. St. B. T. ( 2006 ). The heuristic-analytic theory of reasoning: Extension and evaluation. Psychonomic Bulletin and Review, 13, 378–395.

Evans J. St. B. T. ( 2007 ). Hypothetical thinking: Dual processes in reasoning and judgement. Hove, England: Psychology Press.

Evans J. St. B. T. ( 2008 ). Dual-processing accounts of reasoning, judgment and social cognition. Annual Review of Psychology, 59, 255–278.

Evans J. St. B. T. ( 2010 ). Thinking twice: Two minds in one brain. Oxford, England : Oxford University Press.

Evans J. St. B. T., & Ball, L. J. ( 2010 ). Do people reason on the Wason selection task? A new look at the data of Ball et al. (2003). Quarterly Journal of Experimental Psychology, 63, 434–441.

Evans J. St. B. T. , Barston J. L. , & Pollard P. ( 1983 ). On the conflict between logic and belief in syllogistic reasoning. Memory and Cognition, 11, 295–306.

Evans J. St. B. T. , & Frankish, K. (Eds.). ( 2009 ). In two minds: Dual processes and beyond. Oxford, England : Oxford University Press.

Evans J. St. B. T. , Handley S. , Neilens H. , Bacon A. M. , & Over D. E. ( 2010 ). The influence of cognitive ability and instructional set on causal conditional inference. Quarterly Journal of Experimental Psychology, 63, 892–909.

Evans J. St. B. T. , Handley S. , Neilens H. , & Over D. E. ( 2007 ). Thinking about conditionals: A study of individual differences. Memory and Cognition, 35, 1772–1784.

Evans J. St. B. T. , Handley S. J. , Harper C. , & Johnson-Laird P. N. ( 1999 ). Reasoning about necessity and possibility: A test of the mental model theory of deduction. Journal of Experimental Psychology: Learning, Memory, and Cognition, 25, 1495–1513.

Evans J. St. B. T. , Handley S. J. , & Over D. E. ( 2003 ). Conditionals and conditional probability. Journal of Experimental Psychology: Learning, Memory, and Cognition, 29, 321–355.

Evans J. St. B. T. , Neilens H. , Handley S. , & Over D. E. ( 2008 ). When can we say “if”?   Cognition, 108, 100–116.

Evans J. St. B. T. , Newstead S. E. , & Byrne R. M. J. ( 1993 ). Human reasoning: The psychology of deduction. Hove, England: Erlbaum.

Evans J. St. B. T. , & Over, D. E. ( 1996 ). Rationality and reasoning. Hove, England : Psychology Press.

Evans J. St. B. T. , & Over, D. E. ( 2004 ). If. Oxford, England : Oxford University Press.

Evans J. St. B. T. , Over D. E. , & Handley S. J. ( 2003 ). A theory of hypothetical thinking. In D. Hardman & L. Maachi (Eds.), Thinking: Psychological perspectives on reasoning, judgement and decision making (pp. 3–22). Chichester, England : Wiley.

Evans J. St. B. T. , Over D. E. , & Handley S. J. ( 2005 ). Supposition, extensionality and conditionals: A critique of Johnson-Laird & Byrne (2002). Psychological Review, 112, 1040–1052.

Evans, J. St. B. T. , & Stanovich, K. E. ( in press ). Dual process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science.

Evans J. St. B. T., & Twyman-Musgrove, J. ( 1998 ). Conditional reasoning with inducements and advice.   Cognition, 69, B11–B16.

Feeney A. , & Handley, S. J. ( 2000 ). The suppression of q card selections: Evidence for deductive inference in Wason’s selection task. Quarterly Journal of Experimental Psychology, 53A, 1224–1243.

Fiddick L. , Cosmides L. , & Tooby J. ( 2000 ). No interpretation without representation: The role of domain-specific representations and inferences in the Wason selection task. Cognition, 77, 1–79.

George C. ( 1995 ). The endorsement of the premises: Assumption-based or belief-based reasoning. British Journal of Psychology, 86, 93–111.

Gigerenzer G., & Hug, K. ( 1992 ). Domain-specific reasoning: Social contracts, cheating and perspective change. Cognition, 43, 127–171.

Goel V. ( 2008 ). Anatomy of deductive reasoning.   Trends in Cognitive Sciences, 11, 435–441.

Goel V., & Dolan, R. J. ( 2003 ). Explaining modulation of reasoning by belief.   Cognition, 87, B11–B22.

Griggs R. A., & Cox, J. R. ( 1983 ). The effects of problem content and negation on Wason’s selection task. Quarterly Journal of Experimental Psychology, 35A, 519–533.

Hahn U., & Oaksford, M. ( 2007 ). The rationality of informal argumentation: A Bayesian approach to reasoning fallacies. Psychological Review, 114, 704–732.

Handley S. J. , Feeney A. , & Harper C. ( 2002 ). Alternative antecedents, probabilities and the suppression of fallacies on Wason’s selection task. Quarterly Journal of Experimental Psychology, 55A, 799–813.

Henle M. ( 1962 ). On the relation between logic and thinking.   Psychological Review, 69, 366–378.

Inhelder B. , & Piaget, J. ( 1958 ). The growth of logical thinking. New York : Basic Books.

Johnson-Laird P. N. ( 1983 ). Mental models. Cambridge, England : Cambridge University Press.

Johnson-Laird P. N. ( 2010 ). The truth about conditionals. In K. I. Manktelow , D. E. Over , & S. Elqayam (Eds.), The science of reason: A Festschrift for Jonathan St B T Evans (pp. 119–144). Hove, England: Psychology Press.

Johnson-Laird P. N., & Bara, B. G. ( 1984 ). Syllogistic inference.   Cognition, 16, 1–61.

Johnson-Laird P. N. , & Byrne, R. M. J. ( 1991 ). Deduction. Hove, England & London : Erlbaum.

Johnson-Laird P. N. , & Byrne, R. M. J. ( 2002 ). Conditionals: A theory of meaning, pragmatics and inference. Psychological Review, 109, 646–678.

Kahneman, D. , & Frederick, S. ( 2002 ). Representativeness revisited: Attribute substitution in intuitive judgement. In T. Gilovich , D. Griffin , & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 49–81). Cambridge: Cambridge University Press.

Keren G. , & Schul, Y. ( 2009 ). Two is not always better than one: A critical evaluation of two-system theories. Perspectives on Psychological Science, 4, 533–550.



Cognitive Psychology and Cognitive Neuroscience (Wikibooks)


Cognitive psychology is the scientific study of mental processes such as attention, language use, memory, perception, problem solving, creativity, and reasoning. Cognitive neuroscience is the scientific field concerned with the biological processes that underlie cognition, with a specific focus on the neural circuits in the brain involved in mental processes. It addresses the question of how cognitive activities are affected or controlled by neural circuits in the brain. Cognitive neuroscience is a branch of both neuroscience and psychology, overlapping with disciplines such as behavioral neuroscience, cognitive psychology, physiological psychology, and affective neuroscience. Cognitive neuroscience relies upon theories in cognitive science coupled with evidence from neurobiology and computational modeling.

Overview of the Problem-Solving Mental Process

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Rachel Goldman, PhD, FTOS, is a licensed psychologist, clinical assistant professor, speaker, and wellness expert specializing in eating behaviors, stress management, and health behavior change.


  • Identify the Problem
  • Define the Problem
  • Form a Strategy
  • Organize Information
  • Allocate Resources
  • Monitor Progress
  • Evaluate the Results

Frequently Asked Questions

Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the issue.

The best strategy for solving a problem depends largely on the unique situation. In some cases, people are better off learning everything they can about the issue and then using factual knowledge to come up with a solution. In other instances, creativity and insight are the best options.

It is not necessary to follow the problem-solving steps sequentially; it is common to skip steps, or to go back through steps multiple times, until the desired solution is reached.

In order to correctly solve a problem, it is often important to follow a series of steps. Researchers sometimes refer to this as the problem-solving cycle. While this cycle is portrayed sequentially, people rarely follow a rigid series of steps to find a solution.

The following steps, which include developing strategies and organizing knowledge, illustrate one version of that cycle.

1. Identifying the Problem

While it may seem like an obvious step, identifying the problem is not always as simple as it sounds. In some cases, people might mistakenly identify the wrong source of a problem, which will make attempts to solve it inefficient or even useless.

Some strategies that you might use to figure out the source of a problem include:

  • Asking questions about the problem
  • Breaking the problem down into smaller pieces
  • Looking at the problem from different perspectives
  • Conducting research to figure out what relationships exist between different variables

2. Defining the Problem

After the problem has been identified, it is important to fully define it so that it can be solved. You can define a problem by operationally defining each of its aspects and setting goals for which aspects you will address.

At this point, you should focus on figuring out which aspects of the problems are facts and which are opinions. State the problem clearly and identify the scope of the solution.

3. Forming a Strategy

After the problem has been defined, it is time to start brainstorming potential solutions. This step usually involves generating as many ideas as possible without judging their quality. Once several possibilities have been generated, they can be evaluated and narrowed down.

The next step is to develop a strategy to solve the problem. The approach used will vary depending upon the situation and the individual's unique preferences. Common problem-solving strategies include heuristics and algorithms.

  • Heuristics are mental shortcuts that are often based on solutions that have worked in the past. They can work well if the problem is similar to something you have encountered before and are often the best choice if you need a fast solution.
  • Algorithms are step-by-step strategies that are guaranteed to produce a correct result. While this approach is great for accuracy, it can also consume time and resources.

Heuristics are often best used when time is of the essence, while algorithms are a better choice when a decision needs to be as accurate as possible.
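The trade-off can be made concrete with the classic coin-change puzzle. The following is a minimal sketch (our illustration, not from the article); the coin set [1, 3, 4] is chosen deliberately because the greedy heuristic happens to fail on it:

```python
def greedy_change(coins, amount):
    """Heuristic: repeatedly take the largest coin that still fits.
    Fast, but not guaranteed to use the fewest coins."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result if amount == 0 else None

def optimal_change(coins, amount):
    """Algorithm: dynamic programming over every amount up to the
    target. Slower, but guaranteed to find the fewest coins."""
    best = {0: []}
    for a in range(1, amount + 1):
        candidates = [best[a - c] + [c] for c in coins if a - c in best]
        if candidates:
            best[a] = min(candidates, key=len)
    return best.get(amount)

print(greedy_change([1, 3, 4], 6))   # [4, 1, 1] -- three coins
print(optimal_change([1, 3, 4], 6))  # [3, 3]    -- two coins
```

The heuristic answers almost instantly but settles for three coins; the algorithm examines every intermediate amount and finds the two-coin solution, trading time and resources for guaranteed accuracy.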

4. Organizing Information

Before coming up with a solution, you need to first organize the available information. What do you know about the problem? What do you not know? The more information that is available the better prepared you will be to come up with an accurate solution.

When approaching a problem, it is important to make sure that you have all the data you need. Making a decision without adequate information can lead to biased or inaccurate results.

5. Allocating Resources

Of course, we don't always have unlimited money, time, and other resources to solve a problem. Before you begin to solve a problem, you need to determine how high priority it is.

If it is an important problem, it is probably worth allocating more resources to solving it. If, however, it is a fairly unimportant problem, then you do not want to spend too much of your available resources on coming up with a solution.

At this stage, it is important to consider all of the factors that might affect the problem at hand. This includes looking at the available resources, deadlines that need to be met, and any possible risks involved in each solution. After careful evaluation, a decision can be made about which solution to pursue.

6. Monitoring Progress

After selecting a problem-solving strategy, it is time to put the plan into action and see if it works. This step might involve trying out different solutions to see which one is the most effective.

It is also important to monitor the situation after implementing a solution to ensure that the problem has been solved and that no new problems have arisen as a result of the proposed solution.

Effective problem-solvers tend to monitor their progress as they work towards a solution. If they are not making good progress toward reaching their goal, they will reevaluate their approach or look for new strategies.

7. Evaluating the Results

After a solution has been reached, it is important to evaluate the results to determine if it is the best possible solution to the problem. This evaluation might be immediate, such as checking the results of a math problem to ensure the answer is correct, or it can be delayed, such as evaluating the success of a therapy program after several months of treatment.

Once a problem has been solved, it is important to take some time to reflect on the process that was used and evaluate the results. This will help you to improve your problem-solving skills and become more efficient at solving future problems.
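The seven steps above form a cycle rather than a fixed sequence. As a toy sketch (the step names come from this article, but the control flow is a hypothetical illustration, not a model from the research literature), looping back after a failed monitoring step might look like this:

```python
# Toy sketch of the problem-solving cycle described above. The
# "strategies_that_work" argument is a stand-in for real monitoring.

def solve(strategies_that_work, max_attempts=3):
    """Run the cycle: identify -> define -> strategize -> organize ->
    allocate -> monitor, looping back to form a new strategy whenever
    monitoring shows the current one is failing."""
    for attempt in range(1, max_attempts + 1):
        strategy = f"strategy-{attempt}"              # 3. form a strategy
        succeeded = strategy in strategies_that_work  # 6. monitor progress
        if succeeded:
            return f"solved with {strategy}"          # 7. evaluate the results
        # Monitoring failed: loop back and reevaluate rather than press on.
    return "unsolved: reframe the problem or seek outside help"

print(solve({"strategy-2"}))  # strategy-1 fails, the cycle repeats, strategy-2 works
```

The point of the sketch is simply that "monitor" is a branch point: a failed check sends the solver back to an earlier step instead of forward to evaluation.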

A Word From Verywell

It is important to remember that there are many different problem-solving processes with different steps, and this is just one example. Problem-solving in real-world situations requires a great deal of resourcefulness, flexibility, resilience, and continuous interaction with the environment.


You can become a better problem solver by:

  • Practicing brainstorming and coming up with multiple potential solutions to problems
  • Being open-minded and considering all possible options before making a decision
  • Breaking down problems into smaller, more manageable pieces
  • Asking for help when needed
  • Researching different problem-solving techniques and trying out new ones
  • Learning from mistakes and using them as opportunities to grow

It's important to communicate openly and honestly with your partner about what's going on. Try to see things from their perspective as well as your own. Work together to find a resolution that works for both of you. Be willing to compromise and accept that there may not be a perfect solution.

Take breaks if things are getting too heated, and come back to the problem when you feel calm and collected. Don't try to fix every problem on your own—consider asking a therapist or counselor for help and insight.

If you've tried everything and there doesn't seem to be a way to fix the problem, you may have to learn to accept it. This can be difficult, but try to focus on the positive aspects of your life and remember that every situation is temporary. Don't dwell on what's going wrong—instead, think about what's going right. Find support by talking to friends or family. Seek professional help if you're having trouble coping.




The Psychology of Thinking: Reasoning, Decision-Making and Problem-Solving

  • John Paul Minda - University of Western Ontario, Canada

The Psychology of Thinking is an engaging, interesting and easy-to-follow guide to the essential concepts behind our reasoning, decision-making and problem-solving. Clearly structured into three sections, this book will:

  • Introduce your students to organisation of thought including memory, language and concepts;
  • Expand their understanding of reasoning including inference and induction as well as motivation and the impact of mood;
  • Improve their thinking in action, focusing on decision-making and problem-solving.

Suitable for any course in which students need to develop their judgement and decision-making skills, this book uses clever examples of real-world situations to help them understand and apply the theories discussed to their everyday thinking.


Whilst this book is primarily aimed at students of psychology, it is a really useful general resource for anyone interested in thinking and reasoning. Lawyers who understand the psychological context of thinking and reasoning will be able to distinguish good from bad thinking whether in the context of litigation or general legal advice. This book will help. Whilst the whole book is useful, I will direct students to particular chapters, especially in Part 2. This is a good, accessible book that can be used in many different courses and beyond.


Creative thinking is one of the civil engineer's critical skills, and until now I had not discovered a textbook that explained the many facets of thinking in such a succinct and clear style. I may well be recommending this for other courses I deliver.

Some excellent chapters around memory, language and thought which will be useful to students on our postgraduate programmes.

The new edition provides an enhanced pedagogical structure, helping to further engage students with the content and aid their understanding of the concepts. First, there are two new boxed in-text features, 'Theory in the real world' and 'Examples in practice', both designed to help students understand how the theory discussed applies both to their everyday lives and to their practice. The new edition also features brand new 'Questions to think about' at the end of each chapter, challenging students to further their understanding and consolidate their learning, as well as 'Objectives' at the start of each chapter to focus and frame students' learning. Fully updated throughout to include more globally relevant content, this new edition is suitable for students regardless of where they are studying.

