Logo for College of DuPage Digital Press

Want to create or adapt books like this? Learn more about how Pressbooks supports open publishing practices.

7 Module 7: Thinking, Reasoning, and Problem-Solving

This module is about how a solid working knowledge of psychological principles can help you to think more effectively, so you can succeed in school and life. You might be inclined to believe that—because you have been thinking for as long as you can remember, because you are able to figure out the solution to many problems, because you feel capable of using logic to argue a point, because you can evaluate whether the things you read and hear make sense—you do not need any special training in thinking. But this, of course, is one of the key barriers to helping people think better. If you do not believe that there is anything wrong, why try to fix it?

The human brain is indeed a remarkable thinking machine, capable of amazing, complex, creative, logical thoughts. Why, then, are we telling you that you need to learn how to think? Mainly because one major lesson from cognitive psychology is that these capabilities of the human brain are relatively infrequently realized. Many psychologists believe that people are essentially “cognitive misers.” It is not that we are lazy, but that we have a tendency to expend the least amount of mental effort necessary. Although you may not realize it, it actually takes a great deal of energy to think. Careful, deliberative reasoning and critical thinking are very difficult. Because we seem to be successful without going to the trouble of using these skills well, it feels unnecessary to develop them. As you shall see, however, there are many pitfalls in the cognitive processes described in this module. When people do not devote extra effort to learning and improving reasoning, problem solving, and critical thinking skills, they make many errors.

As is true for memory, if you develop the cognitive skills presented in this module, you will be more successful in school. It is important that you realize, however, that these skills will help you far beyond school, even more so than a good memory will. Although it is somewhat useful to have a good memory, ten years from now no potential employer will care how many questions you got right on multiple choice exams during college. All of them will, however, recognize whether you are a logical, analytical, critical thinker. With these thinking skills, you will be an effective, persuasive communicator and an excellent problem solver.

The module begins by describing different kinds of thought and knowledge, especially conceptual knowledge and critical thinking. An understanding of these differences will be valuable as you progress through school and encounter different assignments that require you to tap into different kinds of knowledge. The second section covers deductive and inductive reasoning, which are processes we use to construct and evaluate strong arguments. They are essential skills to have whenever you are trying to persuade someone (including yourself) of some point, or to respond to someone’s efforts to persuade you. The module ends with a section about problem solving. A solid understanding of the key processes involved in problem solving will help you to handle many daily challenges.

7.1. Different kinds of thought

7.2. Reasoning and Judgment

7.3. Problem Solving

READING WITH PURPOSE

Remember and understand.

By reading and studying Module 7, you should be able to remember and describe:

  • Concepts and inferences (7.1)
  • Procedural knowledge (7.1)
  • Metacognition (7.1)
  • Characteristics of critical thinking:  skepticism; identify biases, distortions, omissions, and assumptions; reasoning and problem solving skills  (7.1)
  • Reasoning:  deductive reasoning, deductively valid argument, inductive reasoning, inductively strong argument, availability heuristic, representativeness heuristic  (7.2)
  • Fixation:  functional fixedness, mental set  (7.3)
  • Algorithms, heuristics, and the role of confirmation bias (7.3)
  • Effective problem solving sequence (7.3)

By reading and thinking about how the concepts in Module 6 apply to real life, you should be able to:

  • Identify which type of knowledge a piece of information is (7.1)
  • Recognize examples of deductive and inductive reasoning (7.2)
  • Recognize judgments that have probably been influenced by the availability heuristic (7.2)
  • Recognize examples of problem solving heuristics and algorithms (7.3)

Analyze, Evaluate, and Create

By reading and thinking about Module 6, participating in classroom activities, and completing out-of-class assignments, you should be able to:

  • Use the principles of critical thinking to evaluate information (7.1)
  • Explain whether examples of reasoning arguments are deductively valid or inductively strong (7.2)
  • Outline how you could try to solve a problem from your life using the effective problem solving sequence (7.3)

7.1. Different kinds of thought and knowledge

  • Take a few minutes to write down everything that you know about dogs.
  • Do you believe that:
  • Psychic ability exists?
  • Hypnosis is an altered state of consciousness?
  • Magnet therapy is effective for relieving pain?
  • Aerobic exercise is an effective treatment for depression?
  • UFO’s from outer space have visited earth?

On what do you base your belief or disbelief for the questions above?

Of course, we all know what is meant by the words  think  and  knowledge . You probably also realize that they are not unitary concepts; there are different kinds of thought and knowledge. In this section, let us look at some of these differences. If you are familiar with these different kinds of thought and pay attention to them in your classes, it will help you to focus on the right goals, learn more effectively, and succeed in school. Different assignments and requirements in school call on you to use different kinds of knowledge or thought, so it will be very helpful for you to learn to recognize them (Anderson, et al. 2001).

Factual and conceptual knowledge

Module 5 introduced the idea of declarative memory, which is composed of facts and episodes. If you have ever played a trivia game or watched Jeopardy on TV, you realize that the human brain is able to hold an extraordinary number of facts. Likewise, you realize that each of us has an enormous store of episodes, essentially facts about events that happened in our own lives. It may be difficult to keep that in mind when we are struggling to retrieve one of those facts while taking an exam, however. Part of the problem is that, in contradiction to the advice from Module 5, many students continue to try to memorize course material as a series of unrelated facts (picture a history student simply trying to memorize history as a set of unrelated dates without any coherent story tying them together). Facts in the real world are not random and unorganized, however. It is the way that they are organized that constitutes a second key kind of knowledge, conceptual.

Concepts are nothing more than our mental representations of categories of things in the world. For example, think about dogs. When you do this, you might remember specific facts about dogs, such as they have fur and they bark. You may also recall dogs that you have encountered and picture them in your mind. All of this information (and more) makes up your concept of dog. You can have concepts of simple categories (e.g., triangle), complex categories (e.g., small dogs that sleep all day, eat out of the garbage, and bark at leaves), kinds of people (e.g., psychology professors), events (e.g., birthday parties), and abstract ideas (e.g., justice). Gregory Murphy (2002) refers to concepts as the “glue that holds our mental life together” (p. 1). Very simply, summarizing the world by using concepts is one of the most important cognitive tasks that we do. Our conceptual knowledge  is  our knowledge about the world. Individual concepts are related to each other to form a rich interconnected network of knowledge. For example, think about how the following concepts might be related to each other: dog, pet, play, Frisbee, chew toy, shoe. Or, of more obvious use to you now, how these concepts are related: working memory, long-term memory, declarative memory, procedural memory, and rehearsal? Because our minds have a natural tendency to organize information conceptually, when students try to remember course material as isolated facts, they are working against their strengths.

One last important point about concepts is that they allow you to instantly know a great deal of information about something. For example, if someone hands you a small red object and says, “here is an apple,” they do not have to tell you, “it is something you can eat.” You already know that you can eat it because it is true by virtue of the fact that the object is an apple; this is called drawing an  inference , assuming that something is true on the basis of your previous knowledge (for example, of category membership or of how the world works) or logical reasoning.

Procedural knowledge

Physical skills, such as tying your shoes, doing a cartwheel, and driving a car (or doing all three at the same time, but don’t try this at home) are certainly a kind of knowledge. They are procedural knowledge, the same idea as procedural memory that you saw in Module 5. Mental skills, such as reading, debating, and planning a psychology experiment, are procedural knowledge, as well. In short, procedural knowledge is the knowledge how to do something (Cohen & Eichenbaum, 1993).

Metacognitive knowledge

Floyd used to think that he had a great memory. Now, he has a better memory. Why? Because he finally realized that his memory was not as great as he once thought it was. Because Floyd eventually learned that he often forgets where he put things, he finally developed the habit of putting things in the same place. (Unfortunately, he did not learn this lesson before losing at least 5 watches and a wedding ring.) Because he finally realized that he often forgets to do things, he finally started using the To Do list app on his phone. And so on. Floyd’s insights about the real limitations of his memory have allowed him to remember things that he used to forget.

All of us have knowledge about the way our own minds work. You may know that you have a good memory for people’s names and a poor memory for math formulas. Someone else might realize that they have difficulty remembering to do things, like stopping at the store on the way home. Others still know that they tend to overlook details. This knowledge about our own thinking is actually quite important; it is called metacognitive knowledge, or  metacognition . Like other kinds of thinking skills, it is subject to error. For example, in unpublished research, one of the authors surveyed about 120 General Psychology students on the first day of the term. Among other questions, the students were asked them to predict their grade in the class and report their current Grade Point Average. Two-thirds of the students predicted that their grade in the course would be higher than their GPA. (The reality is that at our college, students tend to earn lower grades in psychology than their overall GPA.) Another example: Students routinely report that they thought they had done well on an exam, only to discover, to their dismay, that they were wrong (more on that important problem in a moment). Both errors reveal a breakdown in metacognition.

The Dunning-Kruger Effect

In general, most college students probably do not study enough. For example, using data from the National Survey of Student Engagement, Fosnacht, McCormack, and Lerma (2018) reported that first-year students at 4-year colleges in the U.S. averaged less than 14 hours per week preparing for classes. The typical suggestion is that you should spend two hours outside of class for every hour in class, or 24 – 30 hours per week for a full-time student. Clearly, students in general are nowhere near that recommended mark. Many observers, including some faculty, believe that this shortfall is a result of students being too busy or lazy. Now, it may be true that many students are too busy, with work and family obligations, for example. Others, are not particularly motivated in school, and therefore might correctly be labeled lazy. A third possible explanation, however, is that some students might not think they need to spend this much time. And this is a matter of metacognition. Consider the scenario that we mentioned above, students thinking they had done well on an exam only to discover that they did not. Justin Kruger and David Dunning examined scenarios very much like this in 1999. Kruger and Dunning gave research participants tests measuring humor, logic, and grammar. Then, they asked the participants to assess their own abilities and test performance in these areas. They found that participants in general tended to overestimate their abilities, already a problem with metacognition. Importantly, the participants who scored the lowest overestimated their abilities the most. Specifically, students who scored in the bottom quarter (averaging in the 12th percentile) thought they had scored in the 62nd percentile. This has become known as the  Dunning-Kruger effect . Many individual faculty members have replicated these results with their own student on their course exams, including the authors of this book. Think about it. Some students who just took an exam and performed poorly believe that they did well before seeing their score. It seems very likely that these are the very same students who stopped studying the night before because they thought they were “done.” Quite simply, it is not just that they did not know the material. They did not know that they did not know the material. That is poor metacognition.

In order to develop good metacognitive skills, you should continually monitor your thinking and seek frequent feedback on the accuracy of your thinking (Medina, Castleberry, & Persky 2017). For example, in classes get in the habit of predicting your exam grades. As soon as possible after taking an exam, try to find out which questions you missed and try to figure out why. If you do this soon enough, you may be able to recall the way it felt when you originally answered the question. Did you feel confident that you had answered the question correctly? Then you have just discovered an opportunity to improve your metacognition. Be on the lookout for that feeling and respond with caution.

concept :  a mental representation of a category of things in the world

Dunning-Kruger effect : individuals who are less competent tend to overestimate their abilities more than individuals who are more competent do

inference : an assumption about the truth of something that is not stated. Inferences come from our prior knowledge and experience, and from logical reasoning

metacognition :  knowledge about one’s own cognitive processes; thinking about your thinking

Critical thinking

One particular kind of knowledge or thinking skill that is related to metacognition is  critical thinking (Chew, 2020). You may have noticed that critical thinking is an objective in many college courses, and thus it could be a legitimate topic to cover in nearly any college course. It is particularly appropriate in psychology, however. As the science of (behavior and) mental processes, psychology is obviously well suited to be the discipline through which you should be introduced to this important way of thinking.

More importantly, there is a particular need to use critical thinking in psychology. We are all, in a way, experts in human behavior and mental processes, having engaged in them literally since birth. Thus, perhaps more than in any other class, students typically approach psychology with very clear ideas and opinions about its subject matter. That is, students already “know” a lot about psychology. The problem is, “it ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so” (Ward, quoted in Gilovich 1991). Indeed, many of students’ preconceptions about psychology are just plain wrong. Randolph Smith (2002) wrote a book about critical thinking in psychology called  Challenging Your Preconceptions,  highlighting this fact. On the other hand, many of students’ preconceptions about psychology are just plain right! But wait, how do you know which of your preconceptions are right and which are wrong? And when you come across a research finding or theory in this class that contradicts your preconceptions, what will you do? Will you stick to your original idea, discounting the information from the class? Will you immediately change your mind? Critical thinking can help us sort through this confusing mess.

But what is critical thinking? The goal of critical thinking is simple to state (but extraordinarily difficult to achieve): it is to be right, to draw the correct conclusions, to believe in things that are true and to disbelieve things that are false. We will provide two definitions of critical thinking (or, if you like, one large definition with two distinct parts). First, a more conceptual one: Critical thinking is thinking like a scientist in your everyday life (Schmaltz, Jansen, & Wenckowski, 2017).  Our second definition is more operational; it is simply a list of skills that are essential to be a critical thinker. Critical thinking entails solid reasoning and problem solving skills; skepticism; and an ability to identify biases, distortions, omissions, and assumptions. Excellent deductive and inductive reasoning, and problem solving skills contribute to critical thinking. So, you can consider the subject matter of sections 7.2 and 7.3 to be part of critical thinking. Because we will be devoting considerable time to these concepts in the rest of the module, let us begin with a discussion about the other aspects of critical thinking.

Let’s address that first part of the definition. Scientists form hypotheses, or predictions about some possible future observations. Then, they collect data, or information (think of this as making those future observations). They do their best to make unbiased observations using reliable techniques that have been verified by others. Then, and only then, they draw a conclusion about what those observations mean. Oh, and do not forget the most important part. “Conclusion” is probably not the most appropriate word because this conclusion is only tentative. A scientist is always prepared that someone else might come along and produce new observations that would require a new conclusion be drawn. Wow! If you like to be right, you could do a lot worse than using a process like this.

A Critical Thinker’s Toolkit 

Now for the second part of the definition. Good critical thinkers (and scientists) rely on a variety of tools to evaluate information. Perhaps the most recognizable tool for critical thinking is  skepticism (and this term provides the clearest link to the thinking like a scientist definition, as you are about to see). Some people intend it as an insult when they call someone a skeptic. But if someone calls you a skeptic, if they are using the term correctly, you should consider it a great compliment. Simply put, skepticism is a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided. People from Missouri should recognize this principle, as Missouri is known as the Show-Me State. As a skeptic, you are not inclined to believe something just because someone said so, because someone else believes it, or because it sounds reasonable. You must be persuaded by high quality evidence.

Of course, if that evidence is produced, you have a responsibility as a skeptic to change your belief. Failure to change a belief in the face of good evidence is not skepticism; skepticism has open mindedness at its core. M. Neil Browne and Stuart Keeley (2018) use the term weak sense critical thinking to describe critical thinking behaviors that are used only to strengthen a prior belief. Strong sense critical thinking, on the other hand, has as its goal reaching the best conclusion. Sometimes that means strengthening your prior belief, but sometimes it means changing your belief to accommodate the better evidence.

Many times, a failure to think critically or weak sense critical thinking is related to a  bias , an inclination, tendency, leaning, or prejudice. Everybody has biases, but many people are unaware of them. Awareness of your own biases gives you the opportunity to control or counteract them. Unfortunately, however, many people are happy to let their biases creep into their attempts to persuade others; indeed, it is a key part of their persuasive strategy. To see how these biases influence messages, just look at the different descriptions and explanations of the same events given by people of different ages or income brackets, or conservative versus liberal commentators, or by commentators from different parts of the world. Of course, to be successful, these people who are consciously using their biases must disguise them. Even undisguised biases can be difficult to identify, so disguised ones can be nearly impossible.

Here are some common sources of biases:

  • Personal values and beliefs.  Some people believe that human beings are basically driven to seek power and that they are typically in competition with one another over scarce resources. These beliefs are similar to the world-view that political scientists call “realism.” Other people believe that human beings prefer to cooperate and that, given the chance, they will do so. These beliefs are similar to the world-view known as “idealism.” For many people, these deeply held beliefs can influence, or bias, their interpretations of such wide ranging situations as the behavior of nations and their leaders or the behavior of the driver in the car ahead of you. For example, if your worldview is that people are typically in competition and someone cuts you off on the highway, you may assume that the driver did it purposely to get ahead of you. Other types of beliefs about the way the world is or the way the world should be, for example, political beliefs, can similarly become a significant source of bias.
  • Racism, sexism, ageism and other forms of prejudice and bigotry.  These are, sadly, a common source of bias in many people. They are essentially a special kind of “belief about the way the world is.” These beliefs—for example, that women do not make effective leaders—lead people to ignore contradictory evidence (examples of effective women leaders, or research that disputes the belief) and to interpret ambiguous evidence in a way consistent with the belief.
  • Self-interest.  When particular people benefit from things turning out a certain way, they can sometimes be very susceptible to letting that interest bias them. For example, a company that will earn a profit if they sell their product may have a bias in the way that they give information about their product. A union that will benefit if its members get a generous contract might have a bias in the way it presents information about salaries at competing organizations. (Note that our inclusion of examples describing both companies and unions is an explicit attempt to control for our own personal biases). Home buyers are often dismayed to discover that they purchased their dream house from someone whose self-interest led them to lie about flooding problems in the basement or back yard. This principle, the biasing power of self-interest, is likely what led to the famous phrase  Caveat Emptor  (let the buyer beware) .  

Knowing that these types of biases exist will help you evaluate evidence more critically. Do not forget, though, that people are not always keen to let you discover the sources of biases in their arguments. For example, companies or political organizations can sometimes disguise their support of a research study by contracting with a university professor, who comes complete with a seemingly unbiased institutional affiliation, to conduct the study.

People’s biases, conscious or unconscious, can lead them to make omissions, distortions, and assumptions that undermine our ability to correctly evaluate evidence. It is essential that you look for these elements. Always ask, what is missing, what is not as it appears, and what is being assumed here? For example, consider this (fictional) chart from an ad reporting customer satisfaction at 4 local health clubs.

human thinking and problem solving

Clearly, from the results of the chart, one would be tempted to give Club C a try, as customer satisfaction is much higher than for the other 3 clubs.

There are so many distortions and omissions in this chart, however, that it is actually quite meaningless. First, how was satisfaction measured? Do the bars represent responses to a survey? If so, how were the questions asked? Most importantly, where is the missing scale for the chart? Although the differences look quite large, are they really?

Well, here is the same chart, with a different scale, this time labeled:

human thinking and problem solving

Club C is not so impressive any more, is it? In fact, all of the health clubs have customer satisfaction ratings (whatever that means) between 85% and 88%. In the first chart, the entire scale of the graph included only the percentages between 83 and 89. This “judicious” choice of scale—some would call it a distortion—and omission of that scale from the chart make the tiny differences among the clubs seem important, however.

Also, in order to be a critical thinker, you need to learn to pay attention to the assumptions that underlie a message. Let us briefly illustrate the role of assumptions by touching on some people’s beliefs about the criminal justice system in the US. Some believe that a major problem with our judicial system is that many criminals go free because of legal technicalities. Others believe that a major problem is that many innocent people are convicted of crimes. The simple fact is, both types of errors occur. A person’s conclusion about which flaw in our judicial system is the greater tragedy is based on an assumption about which of these is the more serious error (letting the guilty go free or convicting the innocent). This type of assumption is called a value assumption (Browne and Keeley, 2018). It reflects the differences in values that people develop, differences that may lead us to disregard valid evidence that does not fit in with our particular values.

Oh, by the way, some students probably noticed this, but the seven tips for evaluating information that we shared in Module 1 are related to this. Actually, they are part of this section. The tips are, to a very large degree, set of ideas you can use to help you identify biases, distortions, omissions, and assumptions. If you do not remember this section, we strongly recommend you take a few minutes to review it.

skepticism :  a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided

bias : an inclination, tendency, leaning, or prejudice

  • Which of your beliefs (or disbeliefs) from the Activate exercise for this section were derived from a process of critical thinking? If some of your beliefs were not based on critical thinking, are you willing to reassess these beliefs? If the answer is no, why do you think that is? If the answer is yes, what concrete steps will you take?

7.2 Reasoning and Judgment

  • What percentage of kidnappings are committed by strangers?
  • Which area of the house is riskiest: kitchen, bathroom, or stairs?
  • What is the most common cancer in the US?
  • What percentage of workplace homicides are committed by co-workers?

An essential set of procedural thinking skills is  reasoning , the ability to generate and evaluate solid conclusions from a set of statements or evidence. You should note that these conclusions (when they are generated instead of being evaluated) are one key type of inference that we described in Section 7.1. There are two main types of reasoning, deductive and inductive.

Deductive reasoning

Suppose your teacher tells you that if you get an A on the final exam in a course, you will get an A for the whole course. Then, you get an A on the final exam. What will your final course grade be? Most people can see instantly that you can conclude with certainty that you will get an A for the course. This is a type of reasoning called  deductive reasoning , which is defined as reasoning in which a conclusion is guaranteed to be true as long as the statements leading to it are true. The three statements can be listed as an  argument , with two beginning statements and a conclusion:

Statement 1: If you get an A on the final exam, you will get an A for the course

Statement 2: You get an A on the final exam

Conclusion: You will get an A for the course

This particular arrangement, in which true beginning statements lead to a guaranteed true conclusion, is known as a  deductively valid argument . Although deductive reasoning is often the subject of abstract, brain-teasing, puzzle-like word problems, it is actually an extremely important type of everyday reasoning. It is just hard to recognize sometimes. For example, imagine that you are looking for your car keys and you realize that they are either in the kitchen drawer or in your book bag. After looking in the kitchen drawer, you instantly know that they must be in your book bag. That conclusion results from a simple deductive reasoning argument. In addition, solid deductive reasoning skills are necessary for you to succeed in the sciences, philosophy, math, computer programming, and any endeavor involving the use of logic to persuade others to your point of view or to evaluate others’ arguments.

Cognitive psychologists, and before them philosophers, have been quite interested in deductive reasoning, not so much for its practical applications, but for the insights it can offer them about the ways that human beings think. One of the early ideas to emerge from the examination of deductive reasoning is that people learn (or develop) mental versions of rules that allow them to solve these types of reasoning problems (Braine, 1978; Braine, Reiser, & Rumain, 1984). The best way to see this point of view is to realize that there are different possible rules, and some of them are very simple. For example, consider this rule of logic:

therefore q

Logical rules are often presented abstractly, as letters, in order to imply that they can be used in very many specific situations. Here is a concrete version of the of the same rule:

I’ll either have pizza or a hamburger for dinner tonight (p or q)

I won’t have pizza (not p)

Therefore, I’ll have a hamburger (therefore q)

This kind of reasoning seems so natural, so easy, that it is quite plausible that we would use a version of this rule in our daily lives. At least, it seems more plausible than some of the alternative possibilities—for example, that we need to have experience with the specific situation (pizza or hamburger, in this case) in order to solve this type of problem easily. So perhaps there is a form of natural logic (Rips, 1990) that contains very simple versions of logical rules. When we are faced with a reasoning problem that maps onto one of these rules, we use the rule.

But be very careful; things are not always as easy as they seem. Even these simple rules are not so simple. For example, consider the following rule. Many people fail to realize that this rule is just as valid as the pizza or hamburger rule above.

if p, then q

therefore, not p

Concrete version:

If I eat dinner, then I will have dessert

I did not have dessert

Therefore, I did not eat dinner

The simple fact is, it can be very difficult for people to apply rules of deductive logic correctly; as a result, they make many errors when trying to do so. Is this a deductively valid argument or not?

Students who like school study a lot

Students who study a lot get good grades

Jane does not like school

Therefore, Jane does not get good grades

Many people are surprised to discover that this is not a logically valid argument; the conclusion is not guaranteed to be true from the beginning statements. Although the first statement says that students who like school study a lot, it does NOT say that students who do not like school do not study a lot. In other words, it may very well be possible to study a lot without liking school. Even people who sometimes get problems like this right might not be using the rules of deductive reasoning. Instead, they might just be making judgments for examples they know, in this case, remembering instances of people who get good grades despite not liking school.

Making deductive reasoning even more difficult is the fact that there are two important properties that an argument may have. One, it can be valid or invalid (meaning that the conclusion does or does not follow logically from the statements leading up to it). Two, an argument (or more correctly, its conclusion) can be true or false. Here is an example of an argument that is logically valid, but has a false conclusion (at least we think it is false).

Either you are eleven feet tall or the Grand Canyon was created by a spaceship crashing into the earth.

You are not eleven feet tall

Therefore the Grand Canyon was created by a spaceship crashing into the earth

This argument has the exact same form as the pizza or hamburger argument above, making it is deductively valid. The conclusion is so false, however, that it is absurd (of course, the reason the conclusion is false is that the first statement is false). When people are judging arguments, they tend to not observe the difference between deductive validity and the empirical truth of statements or conclusions. If the elements of an argument happen to be true, people are likely to judge the argument logically valid; if the elements are false, they will very likely judge it invalid (Markovits & Bouffard-Bouchard, 1992; Moshman & Franks, 1986). Thus, it seems a stretch to say that people are using these logical rules to judge the validity of arguments. Many psychologists believe that most people actually have very limited deductive reasoning skills (Johnson-Laird, 1999). They argue that when faced with a problem for which deductive logic is required, people resort to some simpler technique, such as matching terms that appear in the statements and the conclusion (Evans, 1982). This might not seem like a problem, but what if reasoners believe that the elements are true and they happen to be wrong; they will would believe that they are using a form of reasoning that guarantees they are correct and yet be wrong.

deductive reasoning :  a type of reasoning in which the conclusion is guaranteed to be true any time the statements leading up to it are true

argument :  a set of statements in which the beginning statements lead to a conclusion

deductively valid argument :  an argument for which true beginning statements guarantee that the conclusion is true

Inductive reasoning and judgment

Every day, you make many judgments about the likelihood of one thing or another. Whether you realize it or not, you are practicing  inductive reasoning   on a daily basis. In inductive reasoning arguments, a conclusion is likely whenever the statements preceding it are true. The first thing to notice about inductive reasoning is that, by definition, you can never be sure about your conclusion; you can only estimate how likely the conclusion is. Inductive reasoning may lead you to focus on Memory Encoding and Recoding when you study for the exam, but it is possible the instructor will ask more questions about Memory Retrieval instead. Unlike deductive reasoning, the conclusions you reach through inductive reasoning are only probable, not certain. That is why scientists consider inductive reasoning weaker than deductive reasoning. But imagine how hard it would be for us to function if we could not act unless we were certain about the outcome.

Inductive reasoning can be represented as logical arguments consisting of statements and a conclusion, just as deductive reasoning can be. In an inductive argument, you are given some statements and a conclusion (or you are given some statements and must draw a conclusion). An argument is  inductively strong   if the conclusion would be very probable whenever the statements are true. So, for example, here is an inductively strong argument:

  • Statement #1: The forecaster on Channel 2 said it is going to rain today.
  • Statement #2: The forecaster on Channel 5 said it is going to rain today.
  • Statement #3: It is very cloudy and humid.
  • Statement #4: You just heard thunder.
  • Conclusion (or judgment): It is going to rain today.

Think of the statements as evidence, on the basis of which you will draw a conclusion. So, based on the evidence presented in the four statements, it is very likely that it will rain today. Will it definitely rain today? Certainly not. We can all think of times that the weather forecaster was wrong.

A true story: Some years ago psychology student was watching a baseball playoff game between the St. Louis Cardinals and the Los Angeles Dodgers. A graphic on the screen had just informed the audience that the Cardinal at bat, (Hall of Fame shortstop) Ozzie Smith, a switch hitter batting left-handed for this plate appearance, had never, in nearly 3000 career at-bats, hit a home run left-handed. The student, who had just learned about inductive reasoning in his psychology class, turned to his companion (a Cardinals fan) and smugly said, “It is an inductively strong argument that Ozzie Smith will not hit a home run.” He turned back to face the television just in time to watch the ball sail over the right field fence for a home run. Although the student felt foolish at the time, he was not wrong. It was an inductively strong argument; 3000 at-bats is an awful lot of evidence suggesting that the Wizard of Ozz (as he was known) would not be hitting one out of the park (think of each at-bat without a home run as a statement in an inductive argument). Sadly (for the die-hard Cubs fan and Cardinals-hating student), despite the strength of the argument, the conclusion was wrong.

Given the possibility that we might draw an incorrect conclusion even with an inductively strong argument, we really want to be sure that we do, in fact, make inductively strong arguments. If we judge something probable, it had better be probable. If we judge something nearly impossible, it had better not happen. Think of inductive reasoning, then, as making reasonably accurate judgments of the probability of some conclusion given a set of evidence.

We base many decisions in our lives on inductive reasoning. For example:

Statement #1: Psychology is not my best subject

Statement #2: My psychology instructor has a reputation for giving difficult exams

Statement #3: My first psychology exam was much harder than I expected

Judgment: The next exam will probably be very difficult.

Decision: I will study tonight instead of watching Netflix.

Some other examples of judgments that people commonly make in a school context include judgments of the likelihood that:

  • A particular class will be interesting/useful/difficult
  • You will be able to finish writing a paper by next week if you go out tonight
  • Your laptop’s battery will last through the next trip to the library
  • You will not miss anything important if you skip class tomorrow
  • Your instructor will not notice if you skip class tomorrow
  • You will be able to find a book that you will need for a paper
  • There will be an essay question about Memory Encoding on the next exam

Tversky and Kahneman (1983) recognized that there are two general ways that we might make these judgments; they termed them extensional (i.e., following the laws of probability) and intuitive (i.e., using shortcuts or heuristics, see below). We will use a similar distinction between Type 1 and Type 2 thinking, as described by Keith Stanovich and his colleagues (Evans and Stanovich, 2013; Stanovich and West, 2000). Type 1 thinking is fast, automatic, effortful, and emotional. In fact, it is hardly fair to call it reasoning at all, as judgments just seem to pop into one’s head. Type 2 thinking , on the other hand, is slow, effortful, and logical. So obviously, it is more likely to lead to a correct judgment, or an optimal decision. The problem is, we tend to over-rely on Type 1. Now, we are not saying that Type 2 is the right way to go for every decision or judgment we make. It seems a bit much, for example, to engage in a step-by-step logical reasoning procedure to decide whether we will have chicken or fish for dinner tonight.

Many bad decisions in some very important contexts, however, can be traced back to poor judgments of the likelihood of certain risks or outcomes that result from the use of Type 1 when a more logical reasoning process would have been more appropriate. For example:

Statement #1: It is late at night.

Statement #2: Albert has been drinking beer for the past five hours at a party.

Statement #3: Albert is not exactly sure where he is or how far away home is.

Judgment: Albert will have no difficulty walking home.

Decision: He walks home alone.

As you can see in this example, the three statements backing up the judgment do not really support it. In other words, this argument is not inductively strong because it is based on judgments that ignore the laws of probability. What are the chances that someone facing these conditions will be able to walk home alone easily? And one need not be drunk to make poor decisions based on judgments that just pop into our heads.

The truth is that many of our probability judgments do not come very close to what the laws of probability say they should be. Think about it. In order for us to reason in accordance with these laws, we would need to know the laws of probability, which would allow us to calculate the relationship between particular pieces of evidence and the probability of some outcome (i.e., how much likelihood should change given a piece of evidence), and we would have to do these heavy math calculations in our heads. After all, that is what Type 2 requires. Needless to say, even if we were motivated, we often do not even know how to apply Type 2 reasoning in many cases.

So what do we do when we don’t have the knowledge, skills, or time required to make the correct mathematical judgment? Do we hold off and wait until we can get better evidence? Do we read up on probability and fire up our calculator app so we can compute the correct probability? Of course not. We rely on Type 1 thinking. We “wing it.” That is, we come up with a likelihood estimate using some means at our disposal. Psychologists use the term heuristic to describe the type of “winging it” we are talking about. A  heuristic   is a shortcut strategy that we use to make some judgment or solve some problem (see Section 7.3). Heuristics are easy and quick, think of them as the basic procedures that are characteristic of Type 1.  They can absolutely lead to reasonably good judgments and decisions in some situations (like choosing between chicken and fish for dinner). They are, however, far from foolproof. There are, in fact, quite a lot of situations in which heuristics can lead us to make incorrect judgments, and in many cases the decisions based on those judgments can have serious consequences.

Let us return to the activity that begins this section. You were asked to judge the likelihood (or frequency) of certain events and risks. You were free to come up with your own evidence (or statements) to make these judgments. This is where a heuristic crops up. As a judgment shortcut, we tend to generate specific examples of those very events to help us decide their likelihood or frequency. For example, if we are asked to judge how common, frequent, or likely a particular type of cancer is, many of our statements would be examples of specific cancer cases:

Statement #1: Andy Kaufman (comedian) had lung cancer.

Statement #2: Colin Powell (US Secretary of State) had prostate cancer.

Statement #3: Bob Marley (musician) had skin and brain cancer

Statement #4: Sandra Day O’Connor (Supreme Court Justice) had breast cancer.

Statement #5: Fred Rogers (children’s entertainer) had stomach cancer.

Statement #6: Robin Roberts (news anchor) had breast cancer.

Statement #7: Bette Davis (actress) had breast cancer.

Judgment: Breast cancer is the most common type.

Your own experience or memory may also tell you that breast cancer is the most common type. But it is not (although it is common). Actually, skin cancer is the most common type in the US. We make the same types of misjudgments all the time because we do not generate the examples or evidence according to their actual frequencies or probabilities. Instead, we have a tendency (or bias) to search for the examples in memory; if they are easy to retrieve, we assume that they are common. To rephrase this in the language of the heuristic, events seem more likely to the extent that they are available to memory. This bias has been termed the  availability heuristic   (Kahneman and Tversky, 1974).

The fact that we use the availability heuristic does not automatically mean that our judgment is wrong. The reason we use heuristics in the first place is that they work fairly well in many cases (and, of course that they are easy to use). So, the easiest examples to think of sometimes are the most common ones. Is it more likely that a member of the U.S. Senate is a man or a woman? Most people have a much easier time generating examples of male senators. And as it turns out, the U.S. Senate has many more men than women (74 to 26 in 2020). In this case, then, the availability heuristic would lead you to make the correct judgment; it is far more likely that a senator would be a man.

In many other cases, however, the availability heuristic will lead us astray. This is because events can be memorable for many reasons other than their frequency. Section 5.2, Encoding Meaning, suggested that one good way to encode the meaning of some information is to form a mental image of it. Thus, information that has been pictured mentally will be more available to memory. Indeed, an event that is vivid and easily pictured will trick many people into supposing that type of event is more common than it actually is. Repetition of information will also make it more memorable. So, if the same event is described to you in a magazine, on the evening news, on a podcast that you listen to, and in your Facebook feed; it will be very available to memory. Again, the availability heuristic will cause you to misperceive the frequency of these types of events.

Most interestingly, information that is unusual is more memorable. Suppose we give you the following list of words to remember: box, flower, letter, platypus, oven, boat, newspaper, purse, drum, car. Very likely, the easiest word to remember would be platypus, the unusual one. The same thing occurs with memories of events. An event may be available to memory because it is unusual, yet the availability heuristic leads us to judge that the event is common. Did you catch that? In these cases, the availability heuristic makes us think the exact opposite of the true frequency. We end up thinking something is common because it is unusual (and therefore memorable). Yikes.

The misapplication of the availability heuristic sometimes has unfortunate results. For example, if you went to K-12 school in the US over the past 10 years, it is extremely likely that you have participated in lockdown and active shooter drills. Of course, everyone is trying to prevent the tragedy of another school shooting. And believe us, we are not trying to minimize how terrible the tragedy is. But the truth of the matter is, school shootings are extremely rare. Because the federal government does not keep a database of school shootings, the Washington Post has maintained their own running tally. Between 1999 and January 2020 (the date of the most recent school shooting with a death in the US at of the time this paragraph was written), the Post reported a total of 254 people died in school shootings in the US. Not 254 per year, 254 total. That is an average of 12 per year. Of course, that is 254 people who should not have died (particularly because many were children), but in a country with approximately 60,000,000 students and teachers, this is a very small risk.

But many students and teachers are terrified that they will be victims of school shootings because of the availability heuristic. It is so easy to think of examples (they are very available to memory) that people believe the event is very common. It is not. And there is a downside to this. We happen to believe that there is an enormous gun violence problem in the United States. According the the Centers for Disease Control and Prevention, there were 39,773 firearm deaths in the US in 2017. Fifteen of those deaths were in school shootings, according to the Post. 60% of those deaths were suicides. When people pay attention to the school shooting risk (low), they often fail to notice the much larger risk.

And examples like this are by no means unique. The authors of this book have been teaching psychology since the 1990’s. We have been able to make the exact same arguments about the misapplication of the availability heuristics and keep them current by simply swapping out for the “fear of the day.” In the 1990’s it was children being kidnapped by strangers (it was known as “stranger danger”) despite the facts that kidnappings accounted for only 2% of the violent crimes committed against children, and only 24% of kidnappings are committed by strangers (US Department of Justice, 2007). This fear overlapped with the fear of terrorism that gripped the country after the 2001 terrorist attacks on the World Trade Center and US Pentagon and still plagues the population of the US somewhat in 2020. After a well-publicized, sensational act of violence, people are extremely likely to increase their estimates of the chances that they, too, will be victims of terror. Think about the reality, however. In October of 2001, a terrorist mailed anthrax spores to members of the US government and a number of media companies. A total of five people died as a result of this attack. The nation was nearly paralyzed by the fear of dying from the attack; in reality the probability of an individual person dying was 0.00000002.

The availability heuristic can lead you to make incorrect judgments in a school setting as well. For example, suppose you are trying to decide if you should take a class from a particular math professor. You might try to make a judgment of how good a teacher she is by recalling instances of friends and acquaintances making comments about her teaching skill. You may have some examples that suggest that she is a poor teacher very available to memory, so on the basis of the availability heuristic you judge her a poor teacher and decide to take the class from someone else. What if, however, the instances you recalled were all from the same person, and this person happens to be a very colorful storyteller? The subsequent ease of remembering the instances might not indicate that the professor is a poor teacher after all.

Although the availability heuristic is obviously important, it is not the only judgment heuristic we use. Amos Tversky and Daniel Kahneman examined the role of heuristics in inductive reasoning in a long series of studies. Kahneman received a Nobel Prize in Economics for this research in 2002, and Tversky would have certainly received one as well if he had not died of melanoma at age 59 in 1996 (Nobel Prizes are not awarded posthumously). Kahneman and Tversky demonstrated repeatedly that people do not reason in ways that are consistent with the laws of probability. They identified several heuristic strategies that people use instead to make judgments about likelihood. The importance of this work for economics (and the reason that Kahneman was awarded the Nobel Prize) is that earlier economic theories had assumed that people do make judgments rationally, that is, in agreement with the laws of probability.

Another common heuristic that people use for making judgments is the  representativeness heuristic (Kahneman & Tversky 1973). Suppose we describe a person to you. He is quiet and shy, has an unassuming personality, and likes to work with numbers. Is this person more likely to be an accountant or an attorney? If you said accountant, you were probably using the representativeness heuristic. Our imaginary person is judged likely to be an accountant because he resembles, or is representative of the concept of, an accountant. When research participants are asked to make judgments such as these, the only thing that seems to matter is the representativeness of the description. For example, if told that the person described is in a room that contains 70 attorneys and 30 accountants, participants will still assume that he is an accountant.

inductive reasoning :  a type of reasoning in which we make judgments about likelihood from sets of evidence

inductively strong argument :  an inductive argument in which the beginning statements lead to a conclusion that is probably true

heuristic :  a shortcut strategy that we use to make judgments and solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

availability heuristic :  judging the frequency or likelihood of some event type according to how easily examples of the event can be called to mind (i.e., how available they are to memory)

representativeness heuristic:   judging the likelihood that something is a member of a category on the basis of how much it resembles a typical category member (i.e., how representative it is of the category)

Type 1 thinking : fast, automatic, and emotional thinking.

Type 2 thinking : slow, effortful, and logical thinking.

  • What percentage of workplace homicides are co-worker violence?

Many people get these questions wrong. The answers are 10%; stairs; skin; 6%. How close were your answers? Explain how the availability heuristic might have led you to make the incorrect judgments.

  • Can you think of some other judgments that you have made (or beliefs that you have) that might have been influenced by the availability heuristic?

7.3 Problem Solving

  • Please take a few minutes to list a number of problems that you are facing right now.
  • Now write about a problem that you recently solved.
  • What is your definition of a problem?

Mary has a problem. Her daughter, ordinarily quite eager to please, appears to delight in being the last person to do anything. Whether getting ready for school, going to piano lessons or karate class, or even going out with her friends, she seems unwilling or unable to get ready on time. Other people have different kinds of problems. For example, many students work at jobs, have numerous family commitments, and are facing a course schedule full of difficult exams, assignments, papers, and speeches. How can they find enough time to devote to their studies and still fulfill their other obligations? Speaking of students and their problems: Show that a ball thrown vertically upward with initial velocity v0 takes twice as much time to return as to reach the highest point (from Spiegel, 1981).

These are three very different situations, but we have called them all problems. What makes them all the same, despite the differences? A psychologist might define a  problem   as a situation with an initial state, a goal state, and a set of possible intermediate states. Somewhat more meaningfully, we might consider a problem a situation in which you are in here one state (e.g., daughter is always late), you want to be there in another state (e.g., daughter is not always late), and with no obvious way to get from here to there. Defined this way, each of the three situations we outlined can now be seen as an example of the same general concept, a problem. At this point, you might begin to wonder what is not a problem, given such a general definition. It seems that nearly every non-routine task we engage in could qualify as a problem. As long as you realize that problems are not necessarily bad (it can be quite fun and satisfying to rise to the challenge and solve a problem), this may be a useful way to think about it.

Can we identify a set of problem-solving skills that would apply to these very different kinds of situations? That task, in a nutshell, is a major goal of this section. Let us try to begin to make sense of the wide variety of ways that problems can be solved with an important observation: the process of solving problems can be divided into two key parts. First, people have to notice, comprehend, and represent the problem properly in their minds (called  problem representation ). Second, they have to apply some kind of solution strategy to the problem. Psychologists have studied both of these key parts of the process in detail.

When you first think about the problem-solving process, you might guess that most of our difficulties would occur because we are failing in the second step, the application of strategies. Although this can be a significant difficulty much of the time, the more important source of difficulty is probably problem representation. In short, we often fail to solve a problem because we are looking at it, or thinking about it, the wrong way.

problem :  a situation in which we are in an initial state, have a desired goal state, and there is a number of possible intermediate states (i.e., there is no obvious way to get from the initial to the goal state)

problem representation :  noticing, comprehending and forming a mental conception of a problem

Defining and Mentally Representing Problems in Order to Solve Them

So, the main obstacle to solving a problem is that we do not clearly understand exactly what the problem is. Recall the problem with Mary’s daughter always being late. One way to represent, or to think about, this problem is that she is being defiant. She refuses to get ready in time. This type of representation or definition suggests a particular type of solution. Another way to think about the problem, however, is to consider the possibility that she is simply being sidetracked by interesting diversions. This different conception of what the problem is (i.e., different representation) suggests a very different solution strategy. For example, if Mary defines the problem as defiance, she may be tempted to solve the problem using some kind of coercive tactics, that is, to assert her authority as her mother and force her to listen. On the other hand, if Mary defines the problem as distraction, she may try to solve it by simply removing the distracting objects.

As you might guess, when a problem is represented one way, the solution may seem very difficult, or even impossible. Seen another way, the solution might be very easy. For example, consider the following problem (from Nasar, 1998):

Two bicyclists start 20 miles apart and head toward each other, each going at a steady rate of 10 miles per hour. At the same time, a fly that travels at a steady 15 miles per hour starts from the front wheel of the southbound bicycle and flies to the front wheel of the northbound one, then turns around and flies to the front wheel of the southbound one again, and continues in this manner until he is crushed between the two front wheels. Question: what total distance did the fly cover?

Please take a few minutes to try to solve this problem.

Most people represent this problem as a question about a fly because, well, that is how the question is asked. The solution, using this representation, is to figure out how far the fly travels on the first leg of its journey, then add this total to how far it travels on the second leg of its journey (when it turns around and returns to the first bicycle), then continue to add the smaller distance from each leg of the journey until you converge on the correct answer. You would have to be quite skilled at math to solve this problem, and you would probably need some time and pencil and paper to do it.

If you consider a different representation, however, you can solve this problem in your head. Instead of thinking about it as a question about a fly, think about it as a question about the bicycles. They are 20 miles apart, and each is traveling 10 miles per hour. How long will it take for the bicycles to reach each other? Right, one hour. The fly is traveling 15 miles per hour; therefore, it will travel a total of 15 miles back and forth in the hour before the bicycles meet. Represented one way (as a problem about a fly), the problem is quite difficult. Represented another way (as a problem about two bicycles), it is easy. Changing your representation of a problem is sometimes the best—sometimes the only—way to solve it.

Unfortunately, however, changing a problem’s representation is not the easiest thing in the world to do. Often, problem solvers get stuck looking at a problem one way. This is called  fixation . Most people who represent the preceding problem as a problem about a fly probably do not pause to reconsider, and consequently change, their representation. A parent who thinks her daughter is being defiant is unlikely to consider the possibility that her behavior is far less purposeful.

Problem-solving fixation was examined by a group of German psychologists called Gestalt psychologists during the 1930’s and 1940’s. Karl Dunker, for example, discovered an important type of failure to take a different perspective called  functional fixedness . Imagine being a participant in one of his experiments. You are asked to figure out how to mount two candles on a door and are given an assortment of odds and ends, including a small empty cardboard box and some thumbtacks. Perhaps you have already figured out a solution: tack the box to the door so it forms a platform, then put the candles on top of the box. Most people are able to arrive at this solution. Imagine a slight variation of the procedure, however. What if, instead of being empty, the box had matches in it? Most people given this version of the problem do not arrive at the solution given above. Why? Because it seems to people that when the box contains matches, it already has a function; it is a matchbox. People are unlikely to consider a new function for an object that already has a function. This is functional fixedness.

Mental set is a type of fixation in which the problem solver gets stuck using the same solution strategy that has been successful in the past, even though the solution may no longer be useful. It is commonly seen when students do math problems for homework. Often, several problems in a row require the reapplication of the same solution strategy. Then, without warning, the next problem in the set requires a new strategy. Many students attempt to apply the formerly successful strategy on the new problem and therefore cannot come up with a correct answer.

The thing to remember is that you cannot solve a problem unless you correctly identify what it is to begin with (initial state) and what you want the end result to be (goal state). That may mean looking at the problem from a different angle and representing it in a new way. The correct representation does not guarantee a successful solution, but it certainly puts you on the right track.

A bit more optimistically, the Gestalt psychologists discovered what may be considered the opposite of fixation, namely  insight . Sometimes the solution to a problem just seems to pop into your head. Wolfgang Kohler examined insight by posing many different problems to chimpanzees, principally problems pertaining to their acquisition of out-of-reach food. In one version, a banana was placed outside of a chimpanzee’s cage and a short stick inside the cage. The stick was too short to retrieve the banana, but was long enough to retrieve a longer stick also located outside of the cage. This second stick was long enough to retrieve the banana. After trying, and failing, to reach the banana with the shorter stick, the chimpanzee would try a couple of random-seeming attempts, react with some apparent frustration or anger, then suddenly rush to the longer stick, the correct solution fully realized at this point. This sudden appearance of the solution, observed many times with many different problems, was termed insight by Kohler.

Lest you think it pertains to chimpanzees only, Karl Dunker demonstrated that children also solve problems through insight in the 1930s. More importantly, you have probably experienced insight yourself. Think back to a time when you were trying to solve a difficult problem. After struggling for a while, you gave up. Hours later, the solution just popped into your head, perhaps when you were taking a walk, eating dinner, or lying in bed.

fixation :  when a problem solver gets stuck looking at a problem a particular way and cannot change his or her representation of it (or his or her intended solution strategy)

functional fixedness :  a specific type of fixation in which a problem solver cannot think of a new use for an object that already has a function

mental set :  a specific type of fixation in which a problem solver gets stuck using the same solution strategy that has been successful in the past

insight :  a sudden realization of a solution to a problem

Solving Problems by Trial and Error

Correctly identifying the problem and your goal for a solution is a good start, but recall the psychologist’s definition of a problem: it includes a set of possible intermediate states. Viewed this way, a problem can be solved satisfactorily only if one can find a path through some of these intermediate states to the goal. Imagine a fairly routine problem, finding a new route to school when your ordinary route is blocked (by road construction, for example). At each intersection, you may turn left, turn right, or go straight. A satisfactory solution to the problem (of getting to school) is a sequence of selections at each intersection that allows you to wind up at school.

If you had all the time in the world to get to school, you might try choosing intermediate states randomly. At one corner you turn left, the next you go straight, then you go left again, then right, then right, then straight. Unfortunately, trial and error will not necessarily get you where you want to go, and even if it does, it is not the fastest way to get there. For example, when a friend of ours was in college, he got lost on the way to a concert and attempted to find the venue by choosing streets to turn onto randomly (this was long before the use of GPS). Amazingly enough, the strategy worked, although he did end up missing two out of the three bands who played that night.

Trial and error is not all bad, however. B.F. Skinner, a prominent behaviorist psychologist, suggested that people often behave randomly in order to see what effect the behavior has on the environment and what subsequent effect this environmental change has on them. This seems particularly true for the very young person. Picture a child filling a household’s fish tank with toilet paper, for example. To a child trying to develop a repertoire of creative problem-solving strategies, an odd and random behavior might be just the ticket. Eventually, the exasperated parent hopes, the child will discover that many of these random behaviors do not successfully solve problems; in fact, in many cases they create problems. Thus, one would expect a decrease in this random behavior as a child matures. You should realize, however, that the opposite extreme is equally counterproductive. If the children become too rigid, never trying something unexpected and new, their problem solving skills can become too limited.

Effective problem solving seems to call for a happy medium that strikes a balance between using well-founded old strategies and trying new ground and territory. The individual who recognizes a situation in which an old problem-solving strategy would work best, and who can also recognize a situation in which a new untested strategy is necessary is halfway to success.

Solving Problems with Algorithms and Heuristics

For many problems there is a possible strategy available that will guarantee a correct solution. For example, think about math problems. Math lessons often consist of step-by-step procedures that can be used to solve the problems. If you apply the strategy without error, you are guaranteed to arrive at the correct solution to the problem. This approach is called using an  algorithm , a term that denotes the step-by-step procedure that guarantees a correct solution. Because algorithms are sometimes available and come with a guarantee, you might think that most people use them frequently. Unfortunately, however, they do not. As the experience of many students who have struggled through math classes can attest, algorithms can be extremely difficult to use, even when the problem solver knows which algorithm is supposed to work in solving the problem. In problems outside of math class, we often do not even know if an algorithm is available. It is probably fair to say, then, that algorithms are rarely used when people try to solve problems.

Because algorithms are so difficult to use, people often pass up the opportunity to guarantee a correct solution in favor of a strategy that is much easier to use and yields a reasonable chance of coming up with a correct solution. These strategies are called  problem solving heuristics . Similar to what you saw in section 6.2 with reasoning heuristics, a problem solving heuristic is a shortcut strategy that people use when trying to solve problems. It usually works pretty well, but does not guarantee a correct solution to the problem. For example, one problem solving heuristic might be “always move toward the goal” (so when trying to get to school when your regular route is blocked, you would always turn in the direction you think the school is). A heuristic that people might use when doing math homework is “use the same solution strategy that you just used for the previous problem.”

By the way, we hope these last two paragraphs feel familiar to you. They seem to parallel a distinction that you recently learned. Indeed, algorithms and problem-solving heuristics are another example of the distinction between Type 1 thinking and Type 2 thinking.

Although it is probably not worth describing a large number of specific heuristics, two observations about heuristics are worth mentioning. First, heuristics can be very general or they can be very specific, pertaining to a particular type of problem only. For example, “always move toward the goal” is a general strategy that you can apply to countless problem situations. On the other hand, “when you are lost without a functioning gps, pick the most expensive car you can see and follow it” is specific to the problem of being lost. Second, all heuristics are not equally useful. One heuristic that many students know is “when in doubt, choose c for a question on a multiple-choice exam.” This is a dreadful strategy because many instructors intentionally randomize the order of answer choices. Another test-taking heuristic, somewhat more useful, is “look for the answer to one question somewhere else on the exam.”

You really should pay attention to the application of heuristics to test taking. Imagine that while reviewing your answers for a multiple-choice exam before turning it in, you come across a question for which you originally thought the answer was c. Upon reflection, you now think that the answer might be b. Should you change the answer to b, or should you stick with your first impression? Most people will apply the heuristic strategy to “stick with your first impression.” What they do not realize, of course, is that this is a very poor strategy (Lilienfeld et al, 2009). Most of the errors on exams come on questions that were answered wrong originally and were not changed (so they remain wrong). There are many fewer errors where we change a correct answer to an incorrect answer. And, of course, sometimes we change an incorrect answer to a correct answer. In fact, research has shown that it is more common to change a wrong answer to a right answer than vice versa (Bruno, 2001).

The belief in this poor test-taking strategy (stick with your first impression) is based on the  confirmation bias   (Nickerson, 1998; Wason, 1960). You first saw the confirmation bias in Module 1, but because it is so important, we will repeat the information here. People have a bias, or tendency, to notice information that confirms what they already believe. Somebody at one time told you to stick with your first impression, so when you look at the results of an exam you have taken, you will tend to notice the cases that are consistent with that belief. That is, you will notice the cases in which you originally had an answer correct and changed it to the wrong answer. You tend not to notice the other two important (and more common) cases, changing an answer from wrong to right, and leaving a wrong answer unchanged.

Because heuristics by definition do not guarantee a correct solution to a problem, mistakes are bound to occur when we employ them. A poor choice of a specific heuristic will lead to an even higher likelihood of making an error.

algorithm :  a step-by-step procedure that guarantees a correct solution to a problem

problem solving heuristic :  a shortcut strategy that we use to solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

confirmation bias :  people’s tendency to notice information that confirms what they already believe

An Effective Problem-Solving Sequence

You may be left with a big question: If algorithms are hard to use and heuristics often don’t work, how am I supposed to solve problems? Robert Sternberg (1996), as part of his theory of what makes people successfully intelligent (Module 8) described a problem-solving sequence that has been shown to work rather well:

  • Identify the existence of a problem.  In school, problem identification is often easy; problems that you encounter in math classes, for example, are conveniently labeled as problems for you. Outside of school, however, realizing that you have a problem is a key difficulty that you must get past in order to begin solving it. You must be very sensitive to the symptoms that indicate a problem.
  • Define the problem.  Suppose you realize that you have been having many headaches recently. Very likely, you would identify this as a problem. If you define the problem as “headaches,” the solution would probably be to take aspirin or ibuprofen or some other anti-inflammatory medication. If the headaches keep returning, however, you have not really solved the problem—likely because you have mistaken a symptom for the problem itself. Instead, you must find the root cause of the headaches. Stress might be the real problem. For you to successfully solve many problems it may be necessary for you to overcome your fixations and represent the problems differently. One specific strategy that you might find useful is to try to define the problem from someone else’s perspective. How would your parents, spouse, significant other, doctor, etc. define the problem? Somewhere in these different perspectives may lurk the key definition that will allow you to find an easier and permanent solution.
  • Formulate strategy.  Now it is time to begin planning exactly how the problem will be solved. Is there an algorithm or heuristic available for you to use? Remember, heuristics by their very nature guarantee that occasionally you will not be able to solve the problem. One point to keep in mind is that you should look for long-range solutions, which are more likely to address the root cause of a problem than short-range solutions.
  • Represent and organize information.  Similar to the way that the problem itself can be defined, or represented in multiple ways, information within the problem is open to different interpretations. Suppose you are studying for a big exam. You have chapters from a textbook and from a supplemental reader, along with lecture notes that all need to be studied. How should you (represent and) organize these materials? Should you separate them by type of material (text versus reader versus lecture notes), or should you separate them by topic? To solve problems effectively, you must learn to find the most useful representation and organization of information.
  • Allocate resources.  This is perhaps the simplest principle of the problem solving sequence, but it is extremely difficult for many people. First, you must decide whether time, money, skills, effort, goodwill, or some other resource would help to solve the problem Then, you must make the hard choice of deciding which resources to use, realizing that you cannot devote maximum resources to every problem. Very often, the solution to problem is simply to change how resources are allocated (for example, spending more time studying in order to improve grades).
  • Monitor and evaluate solutions.  Pay attention to the solution strategy while you are applying it. If it is not working, you may be able to select another strategy. Another fact you should realize about problem solving is that it never does end. Solving one problem frequently brings up new ones. Good monitoring and evaluation of your problem solutions can help you to anticipate and get a jump on solving the inevitable new problems that will arise.

Please note that this as  an  effective problem-solving sequence, not  the  effective problem solving sequence. Just as you can become fixated and end up representing the problem incorrectly or trying an inefficient solution, you can become stuck applying the problem-solving sequence in an inflexible way. Clearly there are problem situations that can be solved without using these skills in this order.

Additionally, many real-world problems may require that you go back and redefine a problem several times as the situation changes (Sternberg et al. 2000). For example, consider the problem with Mary’s daughter one last time. At first, Mary did represent the problem as one of defiance. When her early strategy of pleading and threatening punishment was unsuccessful, Mary began to observe her daughter more carefully. She noticed that, indeed, her daughter’s attention would be drawn by an irresistible distraction or book. Fresh with a re-representation of the problem, she began a new solution strategy. She began to remind her daughter every few minutes to stay on task and remind her that if she is ready before it is time to leave, she may return to the book or other distracting object at that time. Fortunately, this strategy was successful, so Mary did not have to go back and redefine the problem again.

Pick one or two of the problems that you listed when you first started studying this section and try to work out the steps of Sternberg’s problem solving sequence for each one.

a mental representation of a category of things in the world

an assumption about the truth of something that is not stated. Inferences come from our prior knowledge and experience, and from logical reasoning

knowledge about one’s own cognitive processes; thinking about your thinking

individuals who are less competent tend to overestimate their abilities more than individuals who are more competent do

Thinking like a scientist in your everyday life for the purpose of drawing correct conclusions. It entails skepticism; an ability to identify biases, distortions, omissions, and assumptions; and excellent deductive and inductive reasoning, and problem solving skills.

a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided

an inclination, tendency, leaning, or prejudice

a type of reasoning in which the conclusion is guaranteed to be true any time the statements leading up to it are true

a set of statements in which the beginning statements lead to a conclusion

an argument for which true beginning statements guarantee that the conclusion is true

a type of reasoning in which we make judgments about likelihood from sets of evidence

an inductive argument in which the beginning statements lead to a conclusion that is probably true

fast, automatic, and emotional thinking

slow, effortful, and logical thinking

a shortcut strategy that we use to make judgments and solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

udging the frequency or likelihood of some event type according to how easily examples of the event can be called to mind (i.e., how available they are to memory)

judging the likelihood that something is a member of a category on the basis of how much it resembles a typical category member (i.e., how representative it is of the category)

a situation in which we are in an initial state, have a desired goal state, and there is a number of possible intermediate states (i.e., there is no obvious way to get from the initial to the goal state)

noticing, comprehending and forming a mental conception of a problem

when a problem solver gets stuck looking at a problem a particular way and cannot change his or her representation of it (or his or her intended solution strategy)

a specific type of fixation in which a problem solver cannot think of a new use for an object that already has a function

a specific type of fixation in which a problem solver gets stuck using the same solution strategy that has been successful in the past

a sudden realization of a solution to a problem

a step-by-step procedure that guarantees a correct solution to a problem

The tendency to notice and pay attention to information that confirms your prior beliefs and to ignore information that disconfirms them.

a shortcut strategy that we use to solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

Introduction to Psychology Copyright © 2020 by Ken Gray; Elizabeth Arnott-Hill; and Or'Shaundra Benson is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

Share This Book

  • Bipolar Disorder
  • Therapy Center
  • When To See a Therapist
  • Types of Therapy
  • Best Online Therapy
  • Best Couples Therapy
  • Best Family Therapy
  • Managing Stress
  • Sleep and Dreaming
  • Understanding Emotions
  • Self-Improvement
  • Healthy Relationships
  • Student Resources
  • Personality Types
  • Verywell Mind Insights
  • 2023 Verywell Mind 25
  • Mental Health in the Classroom
  • Editorial Process
  • Meet Our Review Board
  • Crisis Support

Overview of the Problem-Solving Mental Process

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

human thinking and problem solving

Rachel Goldman, PhD FTOS, is a licensed psychologist, clinical assistant professor, speaker, wellness expert specializing in eating behaviors, stress management, and health behavior change.

human thinking and problem solving

  • Identify the Problem
  • Define the Problem
  • Form a Strategy
  • Organize Information
  • Allocate Resources
  • Monitor Progress
  • Evaluate the Results

Frequently Asked Questions

Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the issue.

The best strategy for solving a problem depends largely on the unique situation. In some cases, people are better off learning everything they can about the issue and then using factual knowledge to come up with a solution. In other instances, creativity and insight are the best options.

It is not necessary to follow problem-solving steps sequentially, It is common to skip steps or even go back through steps multiple times until the desired solution is reached.

In order to correctly solve a problem, it is often important to follow a series of steps. Researchers sometimes refer to this as the problem-solving cycle. While this cycle is portrayed sequentially, people rarely follow a rigid series of steps to find a solution.

The following steps include developing strategies and organizing knowledge.

1. Identifying the Problem

While it may seem like an obvious step, identifying the problem is not always as simple as it sounds. In some cases, people might mistakenly identify the wrong source of a problem, which will make attempts to solve it inefficient or even useless.

Some strategies that you might use to figure out the source of a problem include :

  • Asking questions about the problem
  • Breaking the problem down into smaller pieces
  • Looking at the problem from different perspectives
  • Conducting research to figure out what relationships exist between different variables

2. Defining the Problem

After the problem has been identified, it is important to fully define the problem so that it can be solved. You can define a problem by operationally defining each aspect of the problem and setting goals for what aspects of the problem you will address

At this point, you should focus on figuring out which aspects of the problems are facts and which are opinions. State the problem clearly and identify the scope of the solution.

3. Forming a Strategy

After the problem has been identified, it is time to start brainstorming potential solutions. This step usually involves generating as many ideas as possible without judging their quality. Once several possibilities have been generated, they can be evaluated and narrowed down.

The next step is to develop a strategy to solve the problem. The approach used will vary depending upon the situation and the individual's unique preferences. Common problem-solving strategies include heuristics and algorithms.

  • Heuristics are mental shortcuts that are often based on solutions that have worked in the past. They can work well if the problem is similar to something you have encountered before and are often the best choice if you need a fast solution.
  • Algorithms are step-by-step strategies that are guaranteed to produce a correct result. While this approach is great for accuracy, it can also consume time and resources.

Heuristics are often best used when time is of the essence, while algorithms are a better choice when a decision needs to be as accurate as possible.

4. Organizing Information

Before coming up with a solution, you need to first organize the available information. What do you know about the problem? What do you not know? The more information that is available the better prepared you will be to come up with an accurate solution.

When approaching a problem, it is important to make sure that you have all the data you need. Making a decision without adequate information can lead to biased or inaccurate results.

5. Allocating Resources

Of course, we don't always have unlimited money, time, and other resources to solve a problem. Before you begin to solve a problem, you need to determine how high priority it is.

If it is an important problem, it is probably worth allocating more resources to solving it. If, however, it is a fairly unimportant problem, then you do not want to spend too much of your available resources on coming up with a solution.

At this stage, it is important to consider all of the factors that might affect the problem at hand. This includes looking at the available resources, deadlines that need to be met, and any possible risks involved in each solution. After careful evaluation, a decision can be made about which solution to pursue.

6. Monitoring Progress

After selecting a problem-solving strategy, it is time to put the plan into action and see if it works. This step might involve trying out different solutions to see which one is the most effective.

It is also important to monitor the situation after implementing a solution to ensure that the problem has been solved and that no new problems have arisen as a result of the proposed solution.

Effective problem-solvers tend to monitor their progress as they work towards a solution. If they are not making good progress toward reaching their goal, they will reevaluate their approach or look for new strategies .

7. Evaluating the Results

After a solution has been reached, it is important to evaluate the results to determine if it is the best possible solution to the problem. This evaluation might be immediate, such as checking the results of a math problem to ensure the answer is correct, or it can be delayed, such as evaluating the success of a therapy program after several months of treatment.

Once a problem has been solved, it is important to take some time to reflect on the process that was used and evaluate the results. This will help you to improve your problem-solving skills and become more efficient at solving future problems.

A Word From Verywell​

It is important to remember that there are many different problem-solving processes with different steps, and this is just one example. Problem-solving in real-world situations requires a great deal of resourcefulness, flexibility, resilience, and continuous interaction with the environment.

Get Advice From The Verywell Mind Podcast

Hosted by therapist Amy Morin, LCSW, this episode of The Verywell Mind Podcast shares how you can stop dwelling in a negative mindset.

Follow Now : Apple Podcasts / Spotify / Google Podcasts

You can become a better problem solving by:

  • Practicing brainstorming and coming up with multiple potential solutions to problems
  • Being open-minded and considering all possible options before making a decision
  • Breaking down problems into smaller, more manageable pieces
  • Asking for help when needed
  • Researching different problem-solving techniques and trying out new ones
  • Learning from mistakes and using them as opportunities to grow

It's important to communicate openly and honestly with your partner about what's going on. Try to see things from their perspective as well as your own. Work together to find a resolution that works for both of you. Be willing to compromise and accept that there may not be a perfect solution.

Take breaks if things are getting too heated, and come back to the problem when you feel calm and collected. Don't try to fix every problem on your own—consider asking a therapist or counselor for help and insight.

If you've tried everything and there doesn't seem to be a way to fix the problem, you may have to learn to accept it. This can be difficult, but try to focus on the positive aspects of your life and remember that every situation is temporary. Don't dwell on what's going wrong—instead, think about what's going right. Find support by talking to friends or family. Seek professional help if you're having trouble coping.

Davidson JE, Sternberg RJ, editors.  The Psychology of Problem Solving .  Cambridge University Press; 2003. doi:10.1017/CBO9780511615771

Sarathy V. Real world problem-solving .  Front Hum Neurosci . 2018;12:261. Published 2018 Jun 26. doi:10.3389/fnhum.2018.00261

By Kendra Cherry, MSEd Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

Logo for Drake University Pressbooks

Want to create or adapt books like this? Learn more about how Pressbooks supports open publishing practices.

7 Thinking, Language, and Problem Solving

Three different artistic portrayals of a person in thought are shown. From left to right, a painting of a woman with an open book, a sculpture of a man hunched over, head on chin, and a ink painting of a man sitting cross-legged holding his head.

What is the best way to solve a problem? How does a person who has never seen or touched snow in real life develop an understanding of the concept of snow? How do young children acquire the ability to learn language with no formal instruction? Psychologists who study thinking explore questions like these and are called cognitive psychologists.

In other chapters, we discussed the cognitive processes of perception, learning, and memory. In this chapter, we will focus on high-level cognitive processes. As a part of this discussion, we will consider thinking and briefly explore the development and use of language. We will also discuss problem solving and creativity. After finishing this chapter, you will have a greater appreciation of the higher-level cognitive processes that contribute to our distinctiveness as a species.

Table of Contents

7.1 What is Cognition? 7.2 Language 7.3 Problem Solving

7.1 What is Cognition?

Learning Objectives

By the end of this section, you will be able to:

  • Describe cognition
  • Distinguish concepts and prototypes
  • Explain the difference between natural and artificial concepts
  • Describe how schemata are organized and constructed

Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. This is only one facet of the complex processes involved in cognition . Simply put,  cognition  is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

Upon waking each morning, you begin thinking—contemplating the tasks that you must complete that day. In what order should you run your errands? Should you go to the bank, the cleaners, or the grocery store first? Can you get these things done before you head to class or will they need to wait until school is done? These thoughts are one example of cognition at work. Exceptionally complex, cognition is an essential feature of human consciousness, yet not all aspects of cognition are consciously experienced.

Cognitive psychology  is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes. Cognitive psychologists strive to determine and measure different types of intelligence, why some people are better at problem solving than others, and how emotional intelligence affects success in the workplace, among countless other topics. They also sometimes focus on how we organize thoughts and information gathered from our environments into meaningful categories of thought, which will be discussed later.

Concepts and Prototypes

The human nervous system is capable of handling endless streams of information. The senses serve as the interface between the mind and the external environment, receiving stimuli and translating it into nervous impulses that are transmitted to the brain. The brain then processes this information and uses the relevant pieces to create thoughts, which can then be expressed through language or stored in memory for future use. To make this process more complex, the brain does not gather information from external environments only. When thoughts are formed, the mind synthesizes information from emotions and memories ( Figure 7.2 ). Emotion and memory are powerful influences on both our thoughts and behaviors.

A flow chart is overlaid on a drawing of a head with a ponytail. The flowchart reads: Information, sensations (arrow) emotions, memories (arrow) thoughts (arrow) behavior. Thoughts is also connected to Emotions, memories via a feedback arrow.

Concepts are informed by our semantic memory (you will learn more about semantic memory in a later chapter) and are present in every aspect of our lives; however, one of the easiest places to notice concepts is inside a classroom, where they are discussed explicitly. When you study United States history, for example, you learn about more than just individual events that have happened in America’s past. You absorb a large quantity of information by listening to and participating in discussions, examining maps, and reading first-hand accounts of people’s lives. Your brain analyzes these details and develops an overall understanding of American history. In the process, your brain gathers details that inform and refine your understanding of related concepts like democracy, power, and freedom.

Concepts can be complex and abstract, like justice, or more concrete, like types of birds. Some concepts, like tolerance, are agreed upon by many people, because they have been used in various ways over many years. Other concepts, like the characteristics of your ideal friend or your family’s birthday traditions, are personal and individualized. In this way, concepts touch every aspect of our lives, from our many daily routines to the guiding principles behind the way governments function.

Another technique used by your brain to organize information is the identification of prototypes for the concepts you have developed. A  prototype  is the best example or representation of a concept. For example, what comes to your mind when you think of a dog? Most likely your early experiences with dogs will shape what you imagine. If your first pet was a Golden Retriever, there is a good chance that this would be your prototype for the category of dogs.

Natural and Artificial Concepts

In psychology, concepts can be divided into two categories, natural and artificial. Natural concepts  are created “naturally” through your experiences and can be developed from either direct or indirect experiences. For example, if you live in Essex Junction, Vermont, you have probably had a lot of direct experience with snow. You’ve watched it fall from the sky, you’ve seen lightly falling snow that barely covers the windshield of your car, and you’ve shoveled out 18 inches of fluffy white snow as you’ve thought, “This is perfect for skiing.” You’ve thrown snowballs at your best friend and gone sledding down the steepest hill in town. In short, you know snow. You know what it looks like, smells like, tastes like, and feels like. If, however, you’ve lived your whole life on the island of Saint Vincent in the Caribbean, you may never have actually seen snow, much less tasted, smelled, or touched it. You know snow from the indirect experience of seeing pictures of falling snow—or from watching films that feature snow as part of the setting. Either way, snow is a natural concept because you can construct an understanding of it through direct observations, experiences with snow, or indirect knowledge (such as from films or books) ( Figure 7.3 ).

Two images labeled a and b. A depicts a snowy field on a sunny day. B depicts a sphere, rectangular prism, and triangular prism.

An  artificial concept , on the other hand, is a concept that is defined by a specific set of characteristics. Various properties of geometric shapes, like squares and triangles, serve as useful examples of artificial concepts. A triangle always has three angles and three sides. A square always has four equal sides and four right angles. Mathematical formulas, like the equation for area (length × width) are artificial concepts defined by specific sets of characteristics that are always the same. Artificial concepts can enhance the understanding of a topic by building on one another. For example, before learning the concept of “area of a square” (and the formula to find it), you must understand what a square is. Once the concept of “area of a square” is understood, an understanding of area for other geometric shapes can be built upon the original understanding of area. The use of artificial concepts to define an idea is crucial to communicating with others and engaging in complex thought. According to Goldstone and Kersten (2003), concepts act as building blocks and can be connected in countless combinations to create complex thoughts.

A  schema (plural: schemata)  is a mental construct consisting of a cluster or collection of related concepts (Bartlett, 1932). There are many different types of schemata, and they all have one thing in common: schemata are a method of organizing information that allows the brain to work more efficiently. When a schema is activated, the brain makes immediate assumptions about the person or object being observed.

There are several types of schemata. A  role schema  makes assumptions about how individuals in certain roles will behave (Callero, 1994). For example, imagine you meet someone who introduces himself as a firefighter. When this happens, your brain automatically activates the “firefighter schema” and begins making assumptions that this person is brave, selfless, and community-oriented. Despite not knowing this person, already you have unknowingly made judgments about him. Schemata also help you fill in gaps in the information you receive from the world around you. While schemata allow for more efficient information processing, there can be problems with schemata, regardless of whether they are accurate: Perhaps this particular firefighter is not brave, he just works as a firefighter to pay the bills while studying to become a children’s librarian.

An  event schema , also known as a  cognitive script , is a set of behaviors that can feel like a routine. Think about what you do when you walk into an elevator ( Figure 7.4 ). First, the doors open and you wait to let exiting passengers leave the elevator car. Then, you step into the elevator and turn around to face the doors, looking for the correct button to push. You never face the back of the elevator, do you? And when you’re riding in a crowded elevator and you can’t face the front, it feels uncomfortable, doesn’t it? Interestingly, event schemata can vary widely among different cultures and countries. For example, while it is quite common for people to greet one another with a handshake in the United States, in Tibet, you greet someone by sticking your tongue out at them, and in Belize, you bump fists (Cairns Regional Council, n.d.)

A crowded elevator.

Because event schemata are automatic, they can be difficult to change. Imagine that you are driving home from work or school. This event schema involves getting in the car, shutting the door, and buckling your seatbelt before putting the key in the ignition. You might perform this script two or three times each day. As you drive home, you hear your phone’s ring tone. Typically, the event schema that occurs when you hear your phone ringing involves locating the phone and answering it or responding to your latest text message. So without thinking, you reach for your phone, which could be in your pocket, in your bag, or on the passenger seat of the car. This powerful event schema is informed by your pattern of behavior and the pleasurable stimulation that a phone call or text message gives your brain. Because it is a schema, it is extremely challenging for us to stop reaching for the phone, even though we know that we endanger our own lives and the lives of others while we do it (Neyfakh, 2013) ( Figure 7.5 ).

A hand holds a cellphone in front of a steering wheel and front-shield window of a car. The car is on a road.

Remember the elevator? It feels almost impossible to walk in and  not  face the door. Our powerful event schema dictates our behavior in the elevator, and it is no different with our phones. Current research suggests that it is the habit, or event schema, of checking our phones in many different situations that makes refraining from checking them while driving especially difficult (Bayer & Campbell, 2012). Because texting and driving has become a dangerous epidemic in recent years, psychologists are looking at ways to help people interrupt the “phone schema” while driving. Event schemata like these are the reason why many habits are difficult to break once they have been acquired. As we continue to examine thinking, keep in mind how powerful the forces of concepts and schemata are to our understanding of the world.

7.2 LAnguage

  • Define language and demonstrate familiarity with the components of language
  • Understand the development of language
  • Explain the relationship between language and thinking

Language  is a communication system that involves using words and systematic rules to organize those words to transmit information from one individual to another. While language is a form of communication, not all communication is language. Many species communicate with one another through their postures, movements, odors, or vocalizations. This communication is crucial for species that need to interact and develop social relationships with their conspecifics. However, many people have asserted that it is language that makes humans unique among all of the animal species (Corballis & Suddendorf, 2007; Tomasello & Rakoczy, 2003). This section will focus on what distinguishes language as a special form of communication, how the use of language develops, and how language affects the way we think.

Components of Language

Language, be it spoken, signed, or written, has specific components: a lexicon and lexicon grammar .  Lexicon  refers to the words of a given language. Thus, lexicon is a language’s vocabulary.  Grammar  refers to the set of rules that are used to convey meaning through the use of the lexicon (Fernández & Cairns, 2011). For instance, English grammar dictates that most verbs receive an “-ed” at the end to indicate past tense.

Words are formed by combining the various phonemes that make up the language. A  phoneme  (e.g., the sounds “ah” vs. “eh”) is a basic sound unit of a given language, and different languages have different sets of phonemes. For example, the phoneme English speakers associate with the letter ‘L’ is not used in the Japanese language. Similarly, many Southern African languages use phonemes, sometimes referred to as ‘click consonants’ that are not used in English.

Phonemes are combined to form  morphemes , which are the smallest units of language that convey some type of meaning. Some words are morphemes, but not all morphemes are words.  For example, “-ed” is a morpheme used to convey the past-tense in English, but it is not a word. The word “review” contains two morphemes: re- (meaning to do something again) and view (to see). Finally, some words like “I” and “a” are both a phonemes and morphemes.

We use semantics and syntax to construct language. Semantics and syntax are part of a language’s grammar.  Semantics  refers to the process by which we derive meaning from morphemes and words by connecting those morphemes and words to stored concepts.  Syntax  refers to the way words are organized into sentences (Chomsky, 1965; Fernández & Cairns, 2011). For example, you would never say “the dog walked I today” to let someone know you took your dog for a walk–that sentence does not obey English syntax and is therefore difficult to make sense of.

We apply the rules of grammar to organize the lexicon in novel and creative ways, which allow us to communicate information about both concrete and abstract concepts. We can talk about our immediate and observable surroundings as well as the surface of unseen planets. We can share our innermost thoughts, our plans for the future, and debate the value of a college education. We can provide detailed instructions for cooking a meal, fixing a car, or building a fire. Through our use of words and language, we are able to form, organize, and express ideas, schema, and artificial concepts.

Language Development

Given the remarkable complexity of a language, one might expect that mastering a language would be an especially arduous task; indeed, for those of us trying to learn a second language as adults, this might seem to be true. However, young children master language very quickly with relative ease. B. F.  Skinner  (1957) proposed that language is learned through reinforcement. Noam  Chomsky  (1965) criticized this behaviorist approach, asserting instead that the mechanisms underlying language acquisition are biologically determined. The use of language develops in the absence of formal instruction and appears to follow a very similar pattern in children from vastly different cultures and backgrounds. It would seem, therefore, that we are born with a biological predisposition to acquire a language (Chomsky, 1965; Fernández & Cairns, 2011). Moreover, it appears that there is a critical period for language acquisition, such that this proficiency at acquiring language is maximal early in life; generally, as people age, the ease with which they acquire and master new languages diminishes (Johnson & Newport, 1989; Lenneberg, 1967; Singleton, 1995).

Children begin to learn about language from a very early age ( Table 7.1 ). In fact, it appears that this is occurring even before we are born. Newborns show preference for their mother’s voice and appear to be able to discriminate between the language spoken by their mother and other languages. Babies are also attuned to the languages being used around them and show preferences for videos of faces that are moving in synchrony with the audio of spoken language versus videos that do not synchronize with the audio (Blossom & Morgan, 2006; Pickens, 1994; Spelke & Cortelyou, 1981).

DIG DEEPER: The Case of Genie

In the fall of 1970, a social worker in the Los Angeles area found a 13-year-old girl who was being raised in extremely neglectful and abusive conditions. The girl, who came to be known as Genie, had lived most of her life tied to a potty chair or confined to a crib in a small room that was kept closed with the curtains drawn. For a little over a decade, Genie had virtually no social interaction and no access to the outside world. As a result of these conditions, Genie was unable to stand up, chew solid food, or speak (Fromkin, Krashen, Curtiss, Rigler, & Rigler, 1974; Rymer, 1993). The police took Genie into protective custody.

Genie’s abilities improved dramatically following her removal from her abusive environment, and early on, it appeared she was acquiring language—much later than would be predicted by critical period hypotheses that had been posited at the time (Fromkin et al., 1974). Genie managed to amass an impressive vocabulary in a relatively short amount of time. However, she never developed a mastery of the grammatical aspects of language (Curtiss, 1981). Perhaps being deprived of the opportunity to learn language during a critical period impeded Genie’s ability to fully acquire and use language.

You may recall that each language has its own set of phonemes that are used to generate morphemes, words, and so on. Babies can discriminate among the sounds that make up a language (for example, they can tell the difference between the “s” in vision and the “ss” in fission); early on, they can differentiate between the sounds of all human languages, even those that do not occur in the languages that are used in their environments. However, by the time that they are about 1 year old, they can only discriminate among those phonemes that are used in the language or languages in their environments (Jensen, 2011; Werker & Lalonde, 1988; Werker & Tees, 1984).

After the first few months of life, babies enter what is known as the babbling stage, during which time they tend to produce single syllables that are repeated over and over. As time passes, more variations appear in the syllables that they produce. During this time, it is unlikely that the babies are trying to communicate; they are just as likely to babble when they are alone as when they are with their caregivers (Fernández & Cairns, 2011). Interestingly, babies who are raised in environments in which sign language is used will also begin to show babbling in the gestures of their hands during this stage (Petitto, Holowka, Sergio, Levy, & Ostry, 2004).

Generally, a child’s first word is uttered sometime between the ages of 1 year to 18 months, and for the next few months, the child will remain in the “one word” stage of language development. During this time, children know a number of words, but they only produce one-word utterances. The child’s early vocabulary is limited to familiar objects or events, often nouns. Although children in this stage only make one-word utterances, these words often carry larger meaning (Fernández & Cairns, 2011). So, for example, a child saying “cookie” could be identifying a cookie or asking for a cookie.

As a child’s lexicon grows, she begins to utter simple sentences and to acquire new vocabulary at a very rapid pace. In addition, children begin to demonstrate a clear understanding of the specific rules that apply to their language(s). Even the mistakes that children sometimes make provide evidence of just how much they understand about those rules. This is sometimes seen in the form of  overgeneralization . In this context, overgeneralization refers to an extension of a language rule to an exception to the rule. For example, in English, it is usually the case that an “s” is added to the end of a word to indicate plurality. For example, we speak of one dog versus two dogs. Young children will overgeneralize this rule to cases that are exceptions to the “add an s to the end of the word” rule and say things like “those two gooses” or “three mouses.” Clearly, the rules of the language are understood, even if the exceptions to the rules are still being learned (Moskowitz, 1978).

Language and Thought

When we speak one language, we agree that words are representations of ideas, people, places, and events. The given language that children learn is connected to their culture and surroundings. But can words themselves shape the way we think about things? Psychologists have long investigated the question of whether language shapes thoughts and actions, or whether our thoughts and beliefs shape our language. Two researchers, Edward Sapir and Benjamin Lee Whorf, began this investigation in the 1940s. They wanted to understand how the language habits of a community encourage members of that community to interpret language in a particular manner (Sapir, 1941/1964). Sapir and Whorf proposed that language determines thought. For example, in some languages there are many different words for love. However, in English we use the word love for all types of love. Does this affect how we think about love depending on the language that we speak (Whorf, 1956)? Researchers have since identified this view as too absolute, pointing out a lack of empiricism behind what Sapir and Whorf proposed (Abler, 2013; Boroditsky, 2011; van Troyer, 1994). Today, psychologists continue to study and debate the relationship between language and thought.

WHAT DO YOU THINK? The Meaning of Language

Think about what you know of other languages; perhaps you even speak multiple languages. Imagine for a moment that your closest friend fluently speaks more than one language. Do you think that friend thinks differently, depending on which language is being spoken? You may know a few words that are not translatable from their original language into English. For example, the Portuguese word  saudade  originated during the 15th century, when Portuguese sailors left home to explore the seas and travel to Africa or Asia. Those left behind described the emptiness and fondness they felt as  saudade  ( Figure 7.6 ) .  The word came to express many meanings, including loss, nostalgia, yearning, warm memories, and hope. There is no single word in English that includes all of those emotions in a single description. Do words such as  saudade  indicate that different languages produce different patterns of thought in people? What do you think??

Two paintings are depicted in a and b. A depicts a young boy leaning on a trunk. He looks forlornly past the viewer. B depicts a woman wrapped in a black shawl standing near a window. She reads a letter while holding the shawl to her mouth.

One group of researchers who wanted to investigate how language influences thought compared how English speakers and the Dani people of Papua New Guinea think and speak about color. The Dani have two words for color: one word for  light  and one word for  dark . In contrast, the English language has 11 color words. Researchers hypothesized that the number of color terms could limit the ways that the Dani people conceptualized color. However, the Dani were able to distinguish colors with the same ability as English speakers, despite having fewer words at their disposal (Berlin & Kay, 1969). A recent review of research aimed at determining how language might affect something like color perception suggests that language can influence perceptual phenomena, especially in the left hemisphere of the brain. You may recall from earlier chapters that the left hemisphere is associated with language for most people. However, the right (less linguistic hemisphere) of the brain is less affected by linguistic influences on perception (Regier & Kay, 2009)

7.3 Problem Solving

  • Describe problem solving strategies
  • Define algorithm and heuristic
  • Explain some common roadblocks to effective problem solving and decision making

People face problems every day—usually, multiple problems throughout the day. Sometimes these problems are straightforward: To double a recipe for pizza dough, for example, all that is required is that each ingredient in the recipe be doubled. Sometimes, however, the problems we encounter are more complex. For example, say you have a work deadline, and you must mail a printed copy of a report to your supervisor by the end of the business day. The report is time-sensitive and must be sent overnight. You finished the report last night, but your printer will not work today. What should you do? First, you need to identify the problem and then apply a strategy for solving the problem.

Problem-Solving Strategies

When you are presented with a problem—whether it is a complex mathematical problem or a broken printer, how do you solve it? Before finding a solution to the problem, the problem must first be clearly identified. After that, one of many problem solving strategies can be applied, hopefully resulting in a solution.

A  problem-solving strategy  is a plan of action used to find a solution. Different strategies have different action plans associated with them ( Table 7.2 ). For example, a well-known strategy is  trial and error . The old adage, “If at first you don’t succeed, try, try again” describes trial and error. In terms of your broken printer, you could try checking the ink levels, and if that doesn’t work, you could check to make sure the paper tray isn’t jammed. Or maybe the printer isn’t actually connected to your laptop. When using trial and error, you would continue to try different solutions until you solved your problem. Although trial and error is not typically one of the most time-efficient strategies, it is a commonly used one.

Another type of strategy is an algorithm. An  algorithm  is a problem-solving formula that provides you with step-by-step instructions used to achieve a desired outcome (Kahneman, 2011). You can think of an algorithm as a recipe with highly detailed instructions that produce the same result every time they are performed. Algorithms are used frequently in our everyday lives, especially in computer science. When you run a search on the Internet, search engines like Google use algorithms to decide which entries will appear first in your list of results. Facebook also uses algorithms to decide which posts to display on your newsfeed. Can you identify other situations in which algorithms are used?

A heuristic is another type of problem solving strategy. While an algorithm must be followed exactly to produce a correct result, a  heuristic  is a general problem-solving framework (Tversky & Kahneman, 1974). You can think of these as mental shortcuts that are used to solve problems. A “rule of thumb” is an example of a heuristic. Such a rule saves the person time and energy when making a decision, but despite its time-saving characteristics, it is not always the best method for making a rational decision. Different types of heuristics are used in different types of situations, but the impulse to use a heuristic occurs when one of five conditions is met (Pratkanis, 1989):

  • When one is faced with too much information
  • When the time to make a decision is limited
  • When the decision to be made is unimportant
  • When there is access to very little information to use in making the decision
  • When an appropriate heuristic happens to come to mind in the same moment

Working backwards  is a useful heuristic in which you begin solving the problem by focusing on the end result. Consider this example: You live in Washington, D.C. and have been invited to a wedding at 4 PM on Saturday in Philadelphia. Knowing that Interstate 95 tends to back up any day of the week, you need to plan your route and time your departure accordingly. If you want to be at the wedding service by 3:30 PM, and it takes 2.5 hours to get to Philadelphia without traffic, what time should you leave your house? You use the working backwards heuristic to plan the events of your day on a regular basis, probably without even thinking about it.

Another useful heuristic is the practice of accomplishing a large goal or task by breaking it into a series of smaller steps. Students often use this common method to complete a large research project or long essay for school. For example, students typically brainstorm, develop a thesis or main topic, research the chosen topic, organize their information into an outline, write a rough draft, revise and edit the rough draft, develop a final draft, organize the references list, and proofread their work before turning in the project. The large task becomes less overwhelming when it is broken down into a series of small steps.

EVERYDAY CONNECTION: Solving Puzzles

Problem-solving abilities can improve with practice. Many people challenge themselves every day with puzzles and other mental exercises to sharpen their problem-solving skills. Sudoku puzzles appear daily in most newspapers. Typically, a sudoku puzzle is a 9×9 grid. The simple sudoku below ( Figure 7.7 ) is a 4×4 grid. To solve the puzzle, fill in the empty boxes with a single digit: 1, 2, 3, or 4. Here are the rules: The numbers must total 10 in each bolded box, each row, and each column; however, each digit can only appear once in a bolded box, row, and column. Time yourself as you solve this puzzle and compare your time with a classmate.

A sudoku puzzle is pictured. The puzzle is a 4x4 square with each sub-square also divided into four. Inside the top left square, the numbers are 3, blank, blank, 4 from left-to-right and top-to-bottom. In the top right square, the numbers are blank, two, one, blank. In the bottom left square, the numbers are blank, 3, four, blank; and the bottom right square contains 2, blank, blank, 1.

Here is another popular type of puzzle ( Figure 7.8 ) that challenges your spatial reasoning skills. Connect all nine dots with four connecting straight lines without lifting your pencil from the paper:

Nine dots are arrayed in three rows of three.

Not all problems are successfully solved, however. What challenges stop us from successfully solving a problem? Albert Einstein once said, “Insanity is doing the same thing over and over again and expecting a different result.” Imagine a person in a room that has four doorways. One doorway that has always been open in the past is now locked. The person, accustomed to exiting the room by that particular doorway, keeps trying to get out through the same doorway even though the other three doorways are open. The person is stuck—but she just needs to go to another doorway, instead of trying to get out through the locked doorway. A  mental set  is where you persist in approaching a problem in a way that has worked in the past but is clearly not working now.

The top figure shows a book of matches, a box of tacks, and a candle. The bottom figure shows the box tacked to the wall with the candle standing in the box.

Functional fixedness  is a type of mental set where you cannot perceive an object being used for something other than what it was designed for. Duncker (1945) conducted foundational research on functional fixedness. He created an experiment in which participants were given a candle, a book of matches, and a box of thumbtacks. They were instructed to use those items to attach the candle to the wall so that it did not drip wax onto the table below. Participants had to use functional fixedness to solve the problem ( Figure 7.10 ). During the  Apollo 13  mission to the moon, NASA engineers at Mission Control had to overcome functional fixedness to save the lives of the astronauts aboard the spacecraft. An explosion in a module of the spacecraft damaged multiple systems. The astronauts were in danger of being poisoned by rising levels of carbon dioxide because of problems with the carbon dioxide filters. The engineers found a way for the astronauts to use spare plastic bags, tape, and air hoses to create a makeshift air filter, which saved the lives of the astronauts.

Researchers have investigated whether functional fixedness is affected by culture. In one experiment, individuals from the Shuar group in Ecuador were asked to use an object for a purpose other than that for which the object was originally intended. For example, the participants were told a story about a bear and a rabbit that were separated by a river and asked to select among various objects, including a spoon, a cup, erasers, and so on, to help the animals. The spoon was the only object long enough to span the imaginary river, but if the spoon was presented in a way that reflected its normal usage, it took participants longer to choose the spoon to solve the problem. (German & Barrett, 2005). The researchers wanted to know if exposure to highly specialized tools, as occurs with individuals in industrialized nations, affects their ability to transcend functional fixedness. It was determined that functional fixedness is experienced in both industrialized and nonindustrialized cultures (German & Barrett, 2005).

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. For example, let’s say you and three friends wanted to rent a house and had a combined target budget of $1,600. The realtor shows you only very run-down houses for $1,600 and then shows you a very nice house for $2,000. Might you ask each person to pay more in rent to get the $2,000 home? Why would the realtor show you the run-down houses and the nice house? The realtor may be challenging your anchoring bias. An  anchoring bias  occurs when you focus on one piece of information when making a decision or solving a problem. In this case, you’re so focused on the amount of money you are willing to spend that you may not recognize what kinds of houses are available at that price point.

The  confirmation bias  is the tendency to focus on information that confirms your existing beliefs. For example, if you think that your professor is not very nice, you notice all of the instances of rude behavior exhibited by the professor while ignoring the countless pleasant interactions he is involved in on a daily basis.  Hindsight bias  leads you to believe that the event you just experienced was predictable, even though it really wasn’t. In other words, you knew all along that things would turn out the way they did.  Representative bias  describes a faulty way of thinking, in which you unintentionally stereotype someone or something; for example, you may assume that your professors spend their free time reading books and engaging in intellectual conversation, because the idea of them spending their time playing volleyball or visiting an amusement park does not fit in with your stereotypes of professors.

Finally, the  availability heuristic  is a heuristic in which you make a decision based on an example, information, or recent experience that is that readily available to you, even though it may not be the best example to inform your decision .  Biases tend to “preserve that which is already established—to maintain our preexisting knowledge, beliefs, attitudes, and hypotheses” (Aronson, 1995; Kahneman, 2011). These biases are summarized in  Table 7.3 .

Were you able to determine how many marbles are needed to balance the scales in  Figure 7.9 ? You need nine. Were you able to solve the problems in  Figure 7.7  and  Figure 7.8 ? Here are the answers ( Figure 7.11 ).

image

Chapter Summary

7.1 what is cognition.

In this section, you were introduced to cognitive psychology, which is the study of cognition, or the brain’s ability to think, perceive, plan, analyze, and remember. Concepts and their corresponding prototypes help us quickly organize our thinking by creating categories into which we can sort new information. We also develop schemata, which are clusters of related concepts. Some schemata involve routines of thought and behavior, and these help us function properly in various situations without having to “think twice” about them. Schemata show up in social situations and routines of daily behavior.

7.2 Language

Language is a communication system that has both a lexicon and a system of grammar. Language acquisition occurs naturally and effortlessly during the early stages of life, and this acquisition occurs in a predictable sequence for individuals around the world. Language has a strong influence on thought, and the concept of how language may influence cognition remains an area of study and debate in psychology.

Many different strategies exist for solving problems. Typical strategies include trial and error, applying algorithms, and using heuristics. To solve a large, complicated problem, it often helps to break the problem into smaller steps that can be accomplished individually, leading to an overall solution. Roadblocks to problem solving include a mental set, functional fixedness, and various biases that can cloud decision making skills.

thinking; or, all of the processes associated with perception, knowledge, problem solving, judgement, language, and memory.

A modern school of psychological thought that empirically examines mental processes such as perception, memory, language, and judgement.

a category or grouping of linguistic information, images, ideas or memories, such as life experiences.

knowledge about words, concepts, and language-based knowledge and facts

the best example or representation of a concept, specific to an individual

concepts developed through direct or indirect experiences with the world

a concept defined by a specific set of characteristics.

a mental construct consisting of a cluster of related concepts

a set of ideas relating to how individuals in certain roles will behave.

also known as a cognitive script; a set of behaviors associated with a particular place or event

also known as an event schema; a set of behaviors associated with a particular place or event

a communication system that involves using words and systematic rules to organize those words to transmit information from one individual to another.

the words of a language

the rules of a language used to convey meaning through the use of the lexicon

the basic sounds that make up a language

the smallest unit of language that conveys meaning

the process by which we derive meaning from morphemes and words

the rules guiding the organization of morphemes into words and words into sentences.

Psychology 2e Copyright © 2020 by Openstax is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Share This Book

  • 7.1 What Is Cognition?
  • Introduction
  • 1.1 What Is Psychology?
  • 1.2 History of Psychology
  • 1.3 Contemporary Psychology
  • 1.4 Careers in Psychology
  • Review Questions
  • Critical Thinking Questions
  • Personal Application Questions
  • 2.1 Why Is Research Important?
  • 2.2 Approaches to Research
  • 2.3 Analyzing Findings
  • 3.1 Human Genetics
  • 3.2 Cells of the Nervous System
  • 3.3 Parts of the Nervous System
  • 3.4 The Brain and Spinal Cord
  • 3.5 The Endocrine System
  • 4.1 What Is Consciousness?
  • 4.2 Sleep and Why We Sleep
  • 4.3 Stages of Sleep
  • 4.4 Sleep Problems and Disorders
  • 4.5 Substance Use and Abuse
  • 4.6 Other States of Consciousness
  • 5.1 Sensation versus Perception
  • 5.2 Waves and Wavelengths
  • 5.4 Hearing
  • 5.5 The Other Senses
  • 5.6 Gestalt Principles of Perception
  • 6.1 What Is Learning?
  • 6.2 Classical Conditioning
  • 6.3 Operant Conditioning
  • 6.4 Observational Learning (Modeling)
  • 7.2 Language
  • 7.3 Problem Solving
  • 7.4 What Are Intelligence and Creativity?
  • 7.5 Measures of Intelligence
  • 7.6 The Source of Intelligence
  • 8.1 How Memory Functions
  • 8.2 Parts of the Brain Involved with Memory
  • 8.3 Problems with Memory
  • 8.4 Ways to Enhance Memory
  • 9.1 What Is Lifespan Development?
  • 9.2 Lifespan Theories
  • 9.3 Stages of Development
  • 9.4 Death and Dying
  • 10.1 Motivation
  • 10.2 Hunger and Eating
  • 10.3 Sexual Behavior, Sexuality, and Gender Identity
  • 10.4 Emotion
  • 11.1 What Is Personality?
  • 11.2 Freud and the Psychodynamic Perspective
  • 11.3 Neo-Freudians: Adler, Erikson, Jung, and Horney
  • 11.4 Learning Approaches
  • 11.5 Humanistic Approaches
  • 11.6 Biological Approaches
  • 11.7 Trait Theorists
  • 11.8 Cultural Understandings of Personality
  • 11.9 Personality Assessment
  • 12.1 What Is Social Psychology?
  • 12.2 Self-presentation
  • 12.3 Attitudes and Persuasion
  • 12.4 Conformity, Compliance, and Obedience
  • 12.5 Prejudice and Discrimination
  • 12.6 Aggression
  • 12.7 Prosocial Behavior
  • 13.1 What Is Industrial and Organizational Psychology?
  • 13.2 Industrial Psychology: Selecting and Evaluating Employees
  • 13.3 Organizational Psychology: The Social Dimension of Work
  • 13.4 Human Factors Psychology and Workplace Design
  • 14.1 What Is Stress?
  • 14.2 Stressors
  • 14.3 Stress and Illness
  • 14.4 Regulation of Stress
  • 14.5 The Pursuit of Happiness
  • 15.1 What Are Psychological Disorders?
  • 15.2 Diagnosing and Classifying Psychological Disorders
  • 15.3 Perspectives on Psychological Disorders
  • 15.4 Anxiety Disorders
  • 15.5 Obsessive-Compulsive and Related Disorders
  • 15.6 Posttraumatic Stress Disorder
  • 15.7 Mood and Related Disorders
  • 15.8 Schizophrenia
  • 15.9 Dissociative Disorders
  • 15.10 Disorders in Childhood
  • 15.11 Personality Disorders
  • 16.1 Mental Health Treatment: Past and Present
  • 16.2 Types of Treatment
  • 16.3 Treatment Modalities
  • 16.4 Substance-Related and Addictive Disorders: A Special Case
  • 16.5 The Sociocultural Model and Therapy Utilization

Learning Objectives

By the end of this section, you will be able to:

  • Describe cognition
  • Distinguish concepts and prototypes
  • Explain the difference between natural and artificial concepts
  • Describe how schemata are organized and constructed

Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. This is only one facet of the complex processes involved in cognition. Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

Upon waking each morning, you begin thinking—contemplating the tasks that you must complete that day. In what order should you run your errands? Should you go to the bank, the cleaners, or the grocery store first? Can you get these things done before you head to class or will they need to wait until school is done? These thoughts are one example of cognition at work. Exceptionally complex, cognition is an essential feature of human consciousness, yet not all aspects of cognition are consciously experienced.

Cognitive psychology is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes. Cognitive psychologists strive to determine and measure different types of intelligence, why some people are better at problem solving than others, and how emotional intelligence affects success in the workplace, among countless other topics. They also sometimes focus on how we organize thoughts and information gathered from our environments into meaningful categories of thought, which will be discussed later.

Concepts and Prototypes

The human nervous system is capable of handling endless streams of information. The senses serve as the interface between the mind and the external environment, receiving stimuli and translating it into nervous impulses that are transmitted to the brain. The brain then processes this information and uses the relevant pieces to create thoughts, which can then be expressed through language or stored in memory for future use. To make this process more complex, the brain does not gather information from external environments only. When thoughts are formed, the mind synthesizes information from emotions and memories ( Figure 7.2 ). Emotion and memory are powerful influences on both our thoughts and behaviors.

In order to organize this staggering amount of information, the mind has developed a "file cabinet" of sorts. The different files stored in the file cabinet are called concepts. Concepts are categories or groupings of linguistic information, images, ideas, or memories, such as life experiences. Concepts are, in many ways, big ideas that are generated by observing details, and categorizing and combining these details into cognitive structures. You use concepts to see the relationships among the different elements of your experiences and to keep the information in your mind organized and accessible.

Concepts are informed by our semantic memory (you will learn more about semantic memory in a later chapter) and are present in every aspect of our lives; however, one of the easiest places to notice concepts is inside a classroom, where they are discussed explicitly. When you study United States history, for example, you learn about more than just individual events that have happened in America’s past. You absorb a large quantity of information by listening to and participating in discussions, examining maps, and reading first-hand accounts of people’s lives. Your brain analyzes these details and develops an overall understanding of American history. In the process, your brain gathers details that inform and refine your understanding of related concepts such as war, the judicial system, and voting rights and laws.

Concepts can be complex and abstract, like justice, or more concrete, like types of birds. In psychology, for example, Piaget’s stages of development are abstract concepts. Some concepts, like tolerance, are agreed upon by many people, because they have been used in various ways over many years. Other concepts, like the characteristics of your ideal friend or your family’s birthday traditions, are personal and individualized. In this way, concepts touch every aspect of our lives, from our many daily routines to the guiding principles behind the way governments function.

Another technique used by your brain to organize information is the identification of prototypes for the concepts you have developed. A prototype is the best example or representation of a concept. For example, what comes to your mind when you think of a dog? Most likely your early experiences with dogs will shape what you imagine. If your first pet was a Golden Retriever, there is a good chance that this would be your prototype for the category of dogs.

Natural and Artificial Concepts

In psychology, concepts can be divided into two categories, natural and artificial. Natural concepts are created “naturally” through your experiences and can be developed from either direct or indirect experiences. For example, if you live in Essex Junction, Vermont, you have probably had a lot of direct experience with snow. You’ve watched it fall from the sky, you’ve seen lightly falling snow that barely covers the windshield of your car, and you’ve shoveled out 18 inches of fluffy white snow as you’ve thought, “This is perfect for skiing.” You’ve thrown snowballs at your best friend and gone sledding down the steepest hill in town. In short, you know snow. You know what it looks like, smells like, tastes like, and feels like. If, however, you’ve lived your whole life on the island of Saint Vincent in the Caribbean, you may never actually have seen snow, much less tasted, smelled, or touched it. You know snow from the indirect experience of seeing pictures of falling snow—or from watching films that feature snow as part of the setting. Either way, snow is a natural concept because you can construct an understanding of it through direct observations, experiences with snow, or indirect knowledge (such as from films or books) ( Figure 7.3 ).

An artificial concept , on the other hand, is a concept that is defined by a specific set of characteristics. Various properties of geometric shapes, like squares and triangles, serve as useful examples of artificial concepts. A triangle always has three angles and three sides. A square always has four equal sides and four right angles. Mathematical formulas, like the equation for area (length × width) are artificial concepts defined by specific sets of characteristics that are always the same. Artificial concepts can enhance the understanding of a topic by building on one another. For example, before learning the concept of “area of a square” (and the formula to find it), you must understand what a square is. Once the concept of “area of a square” is understood, an understanding of area for other geometric shapes can be built upon the original understanding of area. The use of artificial concepts to define an idea is crucial to communicating with others and engaging in complex thought. According to Goldstone and Kersten (2003), concepts act as building blocks and can be connected in countless combinations to create complex thoughts.

A schema is a mental construct consisting of a cluster or collection of related concepts (Bartlett, 1932). There are many different types of schemata, and they all have one thing in common: schemata are a method of organizing information that allows the brain to work more efficiently. When a schema is activated, the brain makes immediate assumptions about the person or object being observed.

There are several types of schemata. A role schema makes assumptions about how individuals in certain roles will behave (Callero, 1994). For example, imagine you meet someone who introduces himself as a firefighter. When this happens, your brain automatically activates the “firefighter schema” and begins making assumptions that this person is brave, selfless, and community-oriented. Despite not knowing this person, already you have unknowingly made judgments about them. Schemata also help you fill in gaps in the information you receive from the world around you. While schemata allow for more efficient information processing, there can be problems with schemata, regardless of whether they are accurate: Perhaps this particular firefighter is not brave, they just work as a firefighter to pay the bills while studying to become a children’s librarian.

An event schema , also known as a cognitive script , is a set of behaviors that can feel like a routine. Think about what you do when you walk into an elevator ( Figure 7.4 ). First, the doors open and you wait to let exiting passengers leave the elevator car. Then, you step into the elevator and turn around to face the doors, looking for the correct button to push. You never face the back of the elevator, do you? And when you’re riding in a crowded elevator and you can’t face the front, it feels uncomfortable, doesn’t it? Interestingly, event schemata can vary widely among different cultures and countries. For example, while it is quite common for people to greet one another with a handshake in the United States, in Tibet, you greet someone by sticking your tongue out at them, and in Belize, you bump fists (Cairns Regional Council, n.d.)

Because event schemata are automatic, they can be difficult to change. Imagine that you are driving home from work or school. This event schema involves getting in the car, shutting the door, and buckling your seatbelt before putting the key in the ignition. You might perform this script two or three times each day. As you drive home, you hear your phone’s ring tone. Typically, the event schema that occurs when you hear your phone ringing involves locating the phone and answering it or responding to your latest text message. So without thinking, you reach for your phone, which could be in your pocket, in your bag, or on the passenger seat of the car. This powerful event schema is informed by your pattern of behavior and the pleasurable stimulation that a phone call or text message gives your brain. Because it is a schema, it is extremely challenging for us to stop reaching for the phone, even though we know that we endanger our own lives and the lives of others while we do it (Neyfakh, 2013) ( Figure 7.5 ).

Remember the elevator? It feels almost impossible to walk in and not face the door. Our powerful event schema dictates our behavior in the elevator, and it is no different with our phones. Current research suggests that it is the habit, or event schema, of checking our phones in many different situations that makes refraining from checking them while driving especially difficult (Bayer & Campbell, 2012). Because texting and driving has become a dangerous epidemic in recent years, psychologists are looking at ways to help people interrupt the “phone schema” while driving. Event schemata like these are the reason why many habits are difficult to break once they have been acquired. As we continue to examine thinking, keep in mind how powerful the forces of concepts and schemata are to our understanding of the world.

As an Amazon Associate we earn from qualifying purchases.

This book may not be used in the training of large language models or otherwise be ingested into large language models or generative AI offerings without OpenStax's permission.

Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/psychology-2e/pages/1-introduction
  • Authors: Rose M. Spielman, William J. Jenkins, Marilyn D. Lovett
  • Publisher/website: OpenStax
  • Book title: Psychology 2e
  • Publication date: Apr 22, 2020
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/psychology-2e/pages/1-introduction
  • Section URL: https://openstax.org/books/psychology-2e/pages/7-1-what-is-cognition

© Jan 6, 2024 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License . The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

Complex cognition: the science of human reasoning, problem-solving, and decision-making

  • Published: 23 March 2010
  • Volume 11 , pages 99–102, ( 2010 )

Cite this article

  • Markus Knauff 1 &
  • Ann G. Wolf 1  

19k Accesses

37 Citations

Explore all metrics

Avoid common mistakes on your manuscript.

Climate change, globalization, policy of peace, and financial market crises—often we are faced with very complex problems. In order to tackle these complex problems, the responsible people should first come to mutual terms. An additional challenge is that typically the involved parties have different (often conflicting) interests and relate the problems to different emotions and wishes. These factors certainly do not ease the quest for a solution to these complex problems.

It is needless to say that the big problems of our time are not easy to solve. Less clear, however, is identifying the causes that led to these problems. Interest conflicts between social groups, the economic and social system or greed—one can think of many responsible factors for the large-scale problems we are currently confronted with.

The present “Special Corner: complex cognition” deals with questions in this regard that have often received little consideration. Under the headline “complex cognition”, we summarize mental activities such as thinking, reasoning, problem - solving, and decision - making that typically rely on the combination and interaction of more elementary processes such as perception, learning, memory, emotion, etc. (cf. Sternberg and Ben-Zeev 2001 ). However, even though complex cognition relies on these elementary functions, the scope of complex cognition research goes beyond the isolated analysis of such elementary mental processes. Two aspects are essential for “complex cognition”: The first aspect refers to the interaction of different mental activities such as perception, memory, learning, reasoning, emotion, etc. The second aspect takes the complexity of the situation into account an agent is confronted with. Based on these two aspects, the term “complex cognition” can be defined in the following way:

Complex psychological processes: We talk about “complex cognition”, when thinking, problem-solving, or decision-making falls back on other cognitive processes such as “perception”, “working memory”, “long-term memory”, “executive processes”, or when the cognitive processes are in close connection with other processes such as “emotion” and “motivation”. The complexity also results from an interaction from a multitude of processes that occur simultaneously or at different points in time and can be realized in different cognitive and/or neuronal structures.

Complex conditions: We also talk about “complex cognition” when the conditions are complex in which a person finds himself and in which conclusions need to be drawn, a problem needs to be solved, or decisions need to be made. The complexity of the conditions or constraints can have different causes. The situation structure itself can be difficult to “see”, or the action alternatives are difficult “to put into effect”. The conditions can themselves comprise of many different variables. These variables can exhibit a high level of interdependence and cross-connection, and it can, as time passes by, come to a change of the original conditions (e.g. Dörner and Wearing 1995 ; Osman 2010 ). It can also be the case that the problem is embedded in a larger social context and can be solved only under certain specifications (norms, data, legislations, culture, etc.) or that the problem can only be solved in interaction with other agents, be it other persons or technical systems.

When one summarizes these two aspects, this yields the following view of what should be understood as “complex cognition”.

As “complex cognition” we define all mental processes that are used by individuals for deriving new information out of given information, with the intention to solve problems, make decision, and plan actions. The crucial characteristic of “complex cognition” is that it takes place under complex conditions in which a multitude of cognitive processes interact with one another or with other noncognitive processes.

The “Special Corner: complex cognition” deals with complex cognition from many different perspectives. The typical questions of all contributions are: Does the design of the human mind enable the necessary thinking skills to solve the truly complex problems we are faced with? Where lay the boundaries of our thinking skills? How do people derive at conclusions? What makes a problem a complex problem? How can we improve our skills to effectively solve problems and make sound judgements?

It is for sure too much to expect that the Special Corner answers these questions. If it were that easy, we would not be still searching for an answer. It is, however, our intention with the current collection of articles to bring to focus such questions to a larger extent than has been done so far.

An important starting point is the fact that people’s skills to solve the most complex of all problems and to ponder about the most complex issues is often immense—humankind would not otherwise be there were she is now. Yet, on the other hand, it has become more clear in the past few years that often people drift away from what one would identify as “rational” (Kahneman 2003 ). People hardly ever adhere to that what the norms of logic, the probability calculus, or the mathematical decision theory state. For example, most people (and organizations) typically accept more losses for a potential high gain than would be the case if they were to take into account the rules of the probability theory. Similarly, they draw conclusions from received information in a way that is not according to the rules of logic. When people, for example, accept the rule “If it rains, then the street is wet”, they most often conclude that when the street is wet, it must have rained. That, however, is incorrect from a logical perspective: perhaps a cleaning car just drove by. In psychology, two main views are traditionally put forward to explain how such deviations from the normative guidelines occur. One scientific stream is interested in how deviations from the normative models can be explained (Evans 2005 ; Johnson-Laird 2008 ; Knauff 2007 ; Reason 1990 ). According to this line of research, deviations are caused by the limitations of the human cognitive system. The other psychological stream puts forward as the main criticism that the deviations can actually be regarded as mistakes (Gigerenzer 2008 ). The deviations accordingly have a high value, because they are adjusted to the information structure of the environment (Gigerenzer et al. 1999 ). They have probably developed during evolution, because they could ensure survival as for example the specifications of formal logic (Hertwig and Herzog 2009 ). We, the editors of the special corner, are very pleased that we can offer an impression of this debate with the contributions from Marewski, Gaissmaier, and Gigerenzer and the commentaries to this contribution from Evans and Over. Added to this is a reply from Marewski, Gaissmaier, and Gigerenzer to the commentary from Evans and Over.

Another topic in the area of complex cognition can be best illustrated by means of the climate protection. To be successful in this area, the responsible actors have to consider a multitude of ecological, biological, geological, political, and economical factors, the basic conditions are constantly at change, and the intervention methods are not clear. Because the necessary information is not readily available for the person dealing with the problem, the person is forced to obtain the relevant information from other sources. Furthermore, intervention in the complex variable structure of the climate can trigger processes whose impact was likely not intended. Finally, the system will not “wait” for intervention of the actors but will change itself over time. The special corner is also concerned with thinking and problem-solving in such complex situations. The article by Funke gives an overview of the current state of research on this topic from the viewpoint of the author, in which several research areas are covered that have internationally not received much acknowledgement (but see, for example, Osman 2010 ).

Although most contributions to the special corner come from the area of psychology, the contribution by Ragni and Löffler illustrates that computer science can provide a valuable addition to the understanding of complex cognition. Computer science plays an important role in complex cognition. In general, computer science, which is used to investigate computational processes central to all research approaches, can be placed in a “computational theory of cognition” framework. This is true especially for the development of computational theories of complex cognitive processes. In many of our modern knowledge domains, the application of simulations and modelling has become a major part of the methods inventory. Simulations help forecast the weather and climate change, help govern traffic flow and help comprehend physical processes. Although modelling in these areas is a vastly established method, it has been very little applied in the area of human thinking (but see e.g. Anderson 1990 ; Gray 2007 ). However, exactly in the area of complex cognition, the method of cognitive modelling offers empirical research an additional methodological access to the description and explanation of complex cognitive processes. While the validity of psychological theories can be tested with the use of empirical research, cognitive models, with their internal coherence, make possible to test consistency and completeness (e.g. Schmid 2008 ). They will also lead to new hypotheses that will in turn be possible to test experimentally. The contribution of Ragni and Löffler demonstrates with the help of an interesting example, finding the optimal route, the usefulness of simulation and modelling in psychology.

A further problem in the area of complex cognition is that many problems are solvable only under certain social conditions (norms, values, laws, culture) or only in interaction with other actors (cf. Beller 2008 ). The article on deontic reasoning by Beller is concerned with this topic. Deontic reasoning is thinking about whether actions are forbidden or allowed, obligatory or not obligatory. Beller proposes that social norms, imposing constraints on individual actions, constitute the fundamental concept for deontic thinking and that people reason from such norms flexibly according to deontic core principles. The review paper shows how knowing what in a certain situation is allowed or forbidden can influence how people derive at conclusions.

The article of Waldmann, Meder, von Sydow, and Hagmayer is concerned with the important topic of causal reasoning. More specifically, the authors explore the interaction between category and causal induction in causal model learning. The paper is a good example of how experimental work in psychology can combine different research traditions that typically work quite isolated. The paper goes beyond a divide and conquers approach and shows that causal knowledge plays an important role in learning, categorization, perception, decision-making, problem-solving, and text comprehension. In each of these fields, separate theories have been developed to investigate the role of causal knowledge. The first author of the paper is internationally well known for his work on the role of causality in other cognitive functions, in particular in categorization and learning (e.g. Lagnado et al. 2007 ; Waldmann et al. 1995 ). In a number of experimental studies, Waldmann and his colleagues have shown that people when learning about causal relations do not simply form associations between causes and effects but make use of abstract prior assumptions about the underlying causal structure and functional form (Waldmann 2007 ).

We, the guest editors, are very pleased that we have the opportunity with this Special corner to make accessible the topic “complex cognition” to the interdisciplinary readership of Cognitive Processing . We predict a bright future for this topic. The research topic possesses high research relevance in the area of basic research for a multitude of disciplines, for example psychology, computer science, and neuroscience. In addition, this area forms a good foundation for an interdisciplinary cooperation.

A further important reason for the positive development of the area is that the relevance of the area goes beyond fundamental research. In that way, the results of the area can for example also contribute to better understanding of the possibilities and borders of human thinking, problem-solving, and decisions in politics, corporations, and economy. In the long term, it might even lead to practical directions on how to avoid “mistakes” and help us better understand the global challenges of our time—Climate change, globalization, financial market crises, etc.

We thank all the authors for their insightful and inspiring contributions, a multitude of reviewers for their help, the editor-in-chief Marta Olivetti Belardinelli that she gave us the opportunity to address this topic, and the editorial manager, Thomas Hünefeldt, for his support for accomplishing the Special Corner. We wish the readers of the Special Corner lots of fun with reading the contributions!

Anderson JR (1990) The adaptive character of thought. Erlbaum, Hillsdale

Google Scholar  

Beller S (2008) Deontic norms, deontic reasoning, and deontic conditionals. Think Reason 14:305–341

Article   Google Scholar  

Dörner D, Wearing A (1995) Complex problem solving: toward a (computer-simulated) theory. In: Frensch PA, Funke J (eds) Complex problem solving: the European perspective. Lawrence Erlbaum Associates, Hillsdale, pp 65–99

Evans JSBT (2005) Deductive reasoning. In: Holyoak KJ, Morrison RG (eds) The Cambridge handbook of thinking and reasoning. Cambridge University Press, Cambridge, pp 169–184

Gigerenzer G (2008) Rationality for mortals: how people cope with uncertainty. Oxford University Press, Oxford

Gigerenzer G, Todd PM, The ABC Research Group (1999) Simple heuristics that make us smart. Oxford University Press, New York

Gray WD (2007) Integrated models of cognitive systems. Oxford University Press, Oxford

Hertwig R, Herzog SM (2009) Fast and frugal heuristics: tools of social rationality. Soc Cogn 27:661–698

Johnson-Laird PN (2008) Mental models and deductive reasoning. In: Rips L, Adler J (eds) Reasoning: studies in human inference and its foundations. Cambridge University Press, Cambridge, pp 206–222

Kahneman D (2003) A perspective on judgment and choice: mapping bounded rationality. Am Psychol 58:697–720

Article   PubMed   Google Scholar  

Knauff M (2007) How our brains reason logically. Topio 26:19–36

Lagnado DA, Waldmann MR, Hagmayer Y, Sloman SA (2007) Beyond covariation: cues to causal structure. In: Gopnik A, Schulz L (eds) Causal learning: psychology, philosophy, and computation. Oxford University Press, Oxford, pp 154–172

Osman M (2010) Controlling uncertainty: a review of human behavior in complex dynamic environments. Psychol Bull 136(1):65–86

Reason J (1990) Human error. Cambridge University Press, Cambridge

Schmid U (2008) Cognition and AI. KI 08/1, Themenheft “Kognition’’, pp 5–7

Sternberg RJ, Ben-Zeev T (2001) Complex cognition: the psychology of human thought. Oxford University Press, New York

Waldmann MR (2007) Combining versus analyzing multiple causes: how domain assumptions and task context affect integration rules. Cogn Sci 31:233–256

Waldmann MR, Holyoak KJ, Fratianne A (1995) Causal models and the acquisition of category structure. J Exp Psychol Gen 124:181–206

Download references

Author information

Authors and affiliations.

University of Giessen, Giessen, Germany

Markus Knauff & Ann G. Wolf

You can also search for this author in PubMed   Google Scholar

Corresponding author

Correspondence to Markus Knauff .

Rights and permissions

Reprints and permissions

About this article

Knauff, M., Wolf, A.G. Complex cognition: the science of human reasoning, problem-solving, and decision-making. Cogn Process 11 , 99–102 (2010). https://doi.org/10.1007/s10339-010-0362-z

Download citation

Received : 10 March 2010

Accepted : 10 March 2010

Published : 23 March 2010

Issue Date : May 2010

DOI : https://doi.org/10.1007/s10339-010-0362-z

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Find a journal
  • Publish with us
  • Track your research

Want to create or adapt books like this? Learn more about how Pressbooks supports open publishing practices.

Ch 8: Thinking and Language

Thinking and language.

Three side by side images are shown. On the left is a person lying in the grass with a book, looking off into the distance. In the middle is a sculpture of a person sitting on rock, with chin rested on hand, and the elbow of that hand rested on knee. The third is a drawing of a person sitting cross-legged with his head resting on his hand, elbow on knee.

Why is it so difficult to break habits—like reaching for your ringing phone even when you shouldn’t, such as when you’re driving? Why is it hard to pay attention to a conversation when typing out a text message? How does a person who has never seen or touched snow in real life develop an understanding of the concept of snow? How do young children acquire the ability to learn language with no formal instruction? Psychologists who study thinking explore questions like these.

As a part of this discussion, we will consider thinking, and briefly explore the development and use of language. We will also discuss problem solving and creativity. After finishing this chapter, you will have a greater appreciation of the higher-level cognitive processes that contribute to our distinctiveness as a species.

Learning Objectives

  • Understand why selective attention is important and how it can be studied.
  • Learn about different models of when and how selection can occur.
  • Understand how divided attention or multitasking is studied, and implications of multitasking in situations such as distracted driving.

Thinking and Problem-Solving

A man sitting down in "The Thinker" pose.

Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. This is only one facet of the complex processes involved in cognition. Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

  • Distinguish between concepts and prototypes
  • Explain the difference between natural and artificial concepts
  • Describe problem solving strategies, including algorithms and heuristics
  • Explain some common roadblocks to effective problem solving

What is Cognition?

Categories and concepts, concepts and prototypes.

The human nervous system is capable of handling endless streams of information. The senses serve as the interface between the mind and the external environment, receiving stimuli and translating it into nerve impulses that are transmitted to the brain. The brain then processes this information and uses the relevant pieces to create thoughts, which can then be expressed through language or stored in memory for future use. To make this process more complex, the brain does not gather information from external environments only. When thoughts are formed, the brain also pulls information from emotions and memories (Figure 9). Emotion and memory are powerful influences on both our thoughts and behaviors.

The outline of a human head is shown. There is a box containing “Information, sensations” in front of the head. An arrow from this box points to another box containing “Emotions, memories” located where the person’s brain would be. An arrow from this second box points to a third box containing “Thoughts” behind the head.

In order to organize this staggering amount of information, the brain has developed a file cabinet of sorts in the mind. The different files stored in the file cabinet are called concepts. Concepts  are categories or groupings of linguistic information, images, ideas, or memories, such as life experiences. Concepts are, in many ways, big ideas that are generated by observing details, and categorizing and combining these details into cognitive structures. You use concepts to see the relationships among the different elements of your experiences and to keep the information in your mind organized and accessible.

Concepts are informed by our semantic memory (you will learn more about this concept when you study memory) and are present in every aspect of our lives; however, one of the easiest places to notice concepts is inside a classroom, where they are discussed explicitly. When you study United States history, for example, you learn about more than just individual events that have happened in America’s past. You absorb a large quantity of information by listening to and participating in discussions, examining maps, and reading first-hand accounts of people’s lives. Your brain analyzes these details and develops an overall understanding of American history. In the process, your brain gathers details that inform and refine your understanding of related concepts like democracy, power, and freedom.

Concepts can be complex and abstract, like justice, or more concrete, like types of birds. In psychology, for example, Piaget’s stages of development are abstract concepts. Some concepts, like tolerance, are agreed upon by many people because they have been used in various ways over many years. Other concepts, like the characteristics of your ideal friend or your family’s birthday traditions, are personal and individualized. In this way, concepts touch every aspect of our lives, from our many daily routines to the guiding principles behind the way governments function.

Concepts are at the core of intelligent behavior. We expect people to be able to know what to do in new situations and when confronting new objects. If you go into a new classroom and see chairs, a blackboard, a projector, and a screen, you know what these things are and how they will be used. You’ll sit on one of the chairs and expect the instructor to write on the blackboard or project something onto the screen. You do this even if you have never seen any of these particular objects before , because you have concepts of classrooms, chairs, projectors, and so forth, that tell you what they are and what you’re supposed to do with them. Furthermore, if someone tells you a new fact about the projector—for example, that it has a halogen bulb—you are likely to extend this fact to other projectors you encounter. In short, concepts allow you to extend what you have learned about a limited number of objects to a potentially infinite set of entities.

A photograph of Mohandas Gandhi is shown. There are several people walking with him.

Another technique used by your brain to organize information is the identification of prototypes for the concepts you have developed. A prototype  is the best example or representation of a concept. For example, for the category of civil disobedience, your prototype could be Rosa Parks. Her peaceful resistance to segregation on a city bus in Montgomery, Alabama, is a recognizable example of civil disobedience. Or your prototype could be Mohandas Gandhi, sometimes called Mahatma Gandhi (“Mahatma” is an honorific title) (Figure 10).

Mohandas Gandhi served as a nonviolent force for independence for India while simultaneously demanding that Buddhist, Hindu, Muslim, and Christian leaders—both Indian and British—collaborate peacefully. Although he was not always successful in preventing violence around him, his life provides a steadfast example of the civil disobedience prototype (Constitutional Rights Foundation, 2013). Just as concepts can be abstract or concrete, we can make a distinction between concepts that are functions of our direct experience with the world and those that are more artificial in nature.

Link to Learning

Natural and artificial concepts.

In psychology, concepts can be divided into two categories, natural and artificial. Natural concepts  are created “naturally” through your experiences and can be developed from either direct or indirect experiences. For example, if you live in Essex Junction, Vermont, you have probably had a lot of direct experience with snow. You’ve watched it fall from the sky, you’ve seen lightly falling snow that barely covers the windshield of your car, and you’ve shoveled out 18 inches of fluffy white snow as you’ve thought, “This is perfect for skiing.” You’ve thrown snowballs at your best friend and gone sledding down the steepest hill in town. In short, you know snow. You know what it looks like, smells like, tastes like, and feels like. If, however, you’ve lived your whole life on the island of Saint Vincent in the Caribbean, you may never have actually seen snow, much less tasted, smelled, or touched it. You know snow from the indirect experience of seeing pictures of falling snow—or from watching films that feature snow as part of the setting. Either way, snow is a natural concept because you can construct an understanding of it through direct observations or experiences of snow (Figure 11).

Photograph A shows a snow covered landscape with the sun shining over it. Photograph B shows a sphere shaped object perched atop the corner of a cube shaped object. There is also a triangular object shown.

An artificial concept  on the other hand, is a concept that is defined by a specific set of characteristics. Various properties of geometric shapes, like squares and triangles, serve as useful examples of artificial concepts. A triangle always has three angles and three sides. A square always has four equal sides and four right angles. Mathematical formulas, like the equation for area (length × width) are artificial concepts defined by specific sets of characteristics that are always the same. Artificial concepts can enhance the understanding of a topic by building on one another. For example, before learning the concept of “area of a square” (and the formula to find it), you must understand what a square is. Once the concept of “area of a square” is understood, an understanding of area for other geometric shapes can be built upon the original understanding of area. The use of artificial concepts to define an idea is crucial to communicating with others and engaging in complex thought. According to Goldstone and Kersten (2003), concepts act as building blocks and can be connected in countless combinations to create complex thoughts.

A schema is a mental construct consisting of a cluster or collection of related concepts (Bartlett, 1932). There are many different types of schemata, and they all have one thing in common: schemata are a method of organizing information that allows the brain to work more efficiently. When a schema is activated, the brain makes immediate assumptions about the person or object being observed.

There are several types of schemata. A role schema makes assumptions about how individuals in certain roles will behave (Callero, 1994). For example, imagine you meet someone who introduces himself as a firefighter. When this happens, your brain automatically activates the “firefighter schema” and begins making assumptions that this person is brave, selfless, and community-oriented. Despite not knowing this person, already you have unknowingly made judgments about him. Schemata also help you fill in gaps in the information you receive from the world around you. While schemata allow for more efficient information processing, there can be problems with schemata, regardless of whether they are accurate: Perhaps this particular firefighter is not brave, he just works as a firefighter to pay the bills while studying to become a children’s librarian.

An event schema , also known as a cognitive script , is a set of behaviors that can feel like a routine. Think about what you do when you walk into an elevator (Figure 12). First, the doors open and you wait to let exiting passengers leave the elevator car. Then, you step into the elevator and turn around to face the doors, looking for the correct button to push. You never face the back of the elevator, do you? And when you’re riding in a crowded elevator and you can’t face the front, it feels uncomfortable, doesn’t it? Interestingly, event schemata can vary widely among different cultures and countries. For example, while it is quite common for people to greet one another with a handshake in the United States, in Tibet, you greet someone by sticking your tongue out at them, and in Belize, you bump fists (Cairns Regional Council, n.d.)

A crowded elevator is shown. There are many people standing close to one another.

Because event schemata are automatic, they can be difficult to change. Imagine that you are driving home from work or school. This event schema involves getting in the car, shutting the door, and buckling your seatbelt before putting the key in the ignition. You might perform this script two or three times each day. As you drive home, you hear your phone’s ring tone. Typically, the event schema that occurs when you hear your phone ringing involves locating the phone and answering it or responding to your latest text message. So without thinking, you reach for your phone, which could be in your pocket, in your bag, or on the passenger seat of the car. This powerful event schema is informed by your pattern of behavior and the pleasurable stimulation that a phone call or text message gives your brain. Because it is a schema, it is extremely challenging for us to stop reaching for the phone, even though we know that we endanger our own lives and the lives of others while we do it (Neyfakh, 2013) (Figure 13).

A person’s right hand is holding a cellular phone. The person is in the driver’s seat of an automobile while on the road.

Remember the elevator? It feels almost impossible to walk in and not face the door. Our powerful event schema dictates our behavior in the elevator, and it is no different with our phones. Current research suggests that it is the habit, or event schema, of checking our phones in many different situations that makes refraining from checking them while driving especially difficult (Bayer & Campbell, 2012). Because texting and driving has become a dangerous epidemic in recent years, psychologists are looking at ways to help people interrupt the “phone schema” while driving. Event schemata like these are the reason why many habits are difficult to break once they have been acquired. As we continue to examine thinking, keep in mind how powerful the forces of concepts and schemata are to our understanding of the world.

Watch this CrashCourse video to see more examples of concepts and prototypes. You’ll also get a preview on other key topics in cognition, including problem-solving strategies like algorithms and heuristics.

You can view the transcript for “Cognition – How Your Mind Can Amaze and Betray You: Crash Course Psychology #15” here (opens in new window) .

Think It Over

People face problems every day—usually, multiple problems throughout the day. Sometimes these problems are straightforward: To double a recipe for pizza dough, for example, all that is required is that each ingredient in the recipe be doubled. Sometimes, however, the problems we encounter are more complex. For example, say you have a work deadline, and you must mail a printed copy of a report to your supervisor by the end of the business day. The report is time-sensitive and must be sent overnight. You finished the report last night, but your printer will not work today. What should you do? First, you need to identify the problem and then apply a strategy for solving the problem.

Problem-Solving Strategies

When you are presented with a problem—whether it is a complex mathematical problem or a broken printer, how do you solve it? Before finding a solution to the problem, the problem must first be clearly identified. After that, one of many problem solving strategies can be applied, hopefully resulting in a solution.

A problem-solving strategy is a plan of action used to find a solution. Different strategies have different action plans associated with them. For example, a well-known strategy is trial and error . The old adage, “If at first you don’t succeed, try, try again” describes trial and error. In terms of your broken printer, you could try checking the ink levels, and if that doesn’t work, you could check to make sure the paper tray isn’t jammed. Or maybe the printer isn’t actually connected to your laptop. When using trial and error, you would continue to try different solutions until you solved your problem. Although trial and error is not typically one of the most time-efficient strategies, it is a commonly used one.

Another type of strategy is an algorithm. An algorithm  is a problem-solving formula that provides you with step-by-step instructions used to achieve a desired outcome (Kahneman, 2011). You can think of an algorithm as a recipe with highly detailed instructions that produce the same result every time they are performed. Algorithms are used frequently in our everyday lives, especially in computer science. When you run a search on the Internet, search engines like Google use algorithms to decide which entries will appear first in your list of results. Facebook also uses algorithms to decide which posts to display on your newsfeed. Can you identify other situations in which algorithms are used?

A heuristic  is another type of problem solving strategy. While an algorithm must be followed exactly to produce a correct result, a heuristic is a general problem-solving framework (Tversky & Kahneman, 1974). You can think of these as mental shortcuts that are used to solve problems. A “rule of thumb” is an example of a heuristic. Such a rule saves the person time and energy when making a decision, but despite its time-saving characteristics, it is not always the best method for making a rational decision. Different types of heuristics are used in different types of situations, but the impulse to use a heuristic occurs when one of five conditions is met (Pratkanis, 1989):

  • When one is faced with too much information
  • When the time to make a decision is limited
  • When the decision to be made is unimportant
  • When there is access to very little information to use in making the decision
  • When an appropriate heuristic happens to come to mind in the same moment

Working backwards  is a useful heuristic in which you begin solving the problem by focusing on the end result. Consider this example: You live in Washington, D.C. and have been invited to a wedding at 4 PM on Saturday in Philadelphia. Knowing that Interstate 95 tends to back up any day of the week, you need to plan your route and time your departure accordingly. If you want to be at the wedding service by 3:30 PM, and it takes 2.5 hours to get to Philadelphia without traffic, what time should you leave your house? You use the working backwards heuristic to plan the events of your day on a regular basis, probably without even thinking about it.

What problem-solving method could you use to solve Einstein’s famous riddle?

You can view the transcript for “Can you solve “Einstein’s Riddle”? – Dan Van der Vieren” here (opens in new window) .

Another useful heuristic is the practice of accomplishing a large goal or task by breaking it into a series of smaller steps. Students often use this common method to complete a large research project or long essay for school. For example, students typically brainstorm, develop a thesis or main topic, research the chosen topic, organize their information into an outline, write a rough draft, revise and edit the rough draft, develop a final draft, organize the references list, and proofread their work before turning in the project. The large task becomes less overwhelming when it is broken down into a series of small steps.

Everyday Connections: Solving Puzzles

Problem-solving abilities can improve with practice. Many people challenge themselves every day with puzzles and other mental exercises to sharpen their problem-solving skills. Sudoku puzzles appear daily in most newspapers. Typically, a sudoku puzzle is a 9×9 grid. The simple sudoku below (Figure 14) is a 4×4 grid. To solve the puzzle, fill in the empty boxes with a single digit: 1, 2, 3, or 4. Here are the rules: The numbers must total 10 in each bolded box, each row, and each column; however, each digit can only appear once in a bolded box, row, and column. Time yourself as you solve this puzzle and compare your time with a classmate.

A four column by four row Sudoku puzzle is shown. The top left cell contains the number 3. The top right cell contains the number 2. The bottom right cell contains the number 1. The bottom left cell contains the number 4. The cell at the intersection of the second row and the second column contains the number 4. The cell to the right of that contains the number 1. The cell below the cell containing the number 1 contains the number 2. The cell to the left of the cell containing the number 2 contains the number 3.

Here is another popular type of puzzle that challenges your spatial reasoning skills. Connect all nine dots with four connecting straight lines without lifting your pencil from the paper:

A square shaped outline contains three rows and three columns of dots with equal space between them.

Take a look at the “Puzzling Scales” logic puzzle below (Figure 16). Sam Loyd, a well-known puzzle master, created and refined countless puzzles throughout his lifetime (Cyclopedia of Puzzles, n.d.).

A puzzle involving a scale is shown. At the top of the figure it reads: “Sam Loyds Puzzling Scales.” The first row of the puzzle shows a balanced scale with 3 blocks and a top on the left and 12 marbles on the right. Below this row it reads: “Since the scales now balance.” The next row of the puzzle shows a balanced scale with just the top on the left, and 1 block and 8 marbles on the right. Below this row it reads: “And balance when arranged this way.” The third row shows an unbalanced scale with the top on the left side, which is much lower than the right side. The right side is empty. Below this row it reads: “Then how many marbles will it require to balance with that top?”

Were you able to determine how many marbles are needed to balance the scales in the Puzzling Scales? You need nine. Were you able to solve the other problems above? Here are the answers:

The first puzzle is a Sudoku grid of 16 squares (4 rows of 4 squares) is shown. Half of the numbers were supplied to start the puzzle and are colored blue, and half have been filled in as the puzzle’s solution and are colored red. The numbers in each row of the grid, left to right, are as follows. Row 1: blue 3, red 1, red 4, blue 2. Row 2: red 2, blue 4, blue 1, red 3. Row 3: red 1, blue 3, blue 2, red 4. Row 4: blue 4, red 2, red 3, blue 1.The second puzzle consists of 9 dots arranged in 3 rows of 3 inside of a square. The solution, four straight lines made without lifting the pencil, is shown in a red line with arrows indicating the direction of movement. In order to solve the puzzle, the lines must extend beyond the borders of the box. The four connecting lines are drawn as follows. Line 1 begins at the top left dot, proceeds through the middle and right dots of the top row, and extends to the right beyond the border of the square. Line 2 extends from the end of line 1, through the right dot of the horizontally centered row, through the middle dot of the bottom row, and beyond the square’s border ending in the space beneath the left dot of the bottom row. Line 3 extends from the end of line 2 upwards through the left dots of the bottom, middle, and top rows. Line 4 extends from the end of line 3 through the middle dot in the middle row and ends at the right dot of the bottom row.

Pitfalls to Problem Solving

Not all problems are successfully solved, however. What challenges stop us from successfully solving a problem? Albert Einstein once said, “Insanity is doing the same thing over and over again and expecting a different result.” Imagine a person in a room that has four doorways. One doorway that has always been open in the past is now locked. The person, accustomed to exiting the room by that particular doorway, keeps trying to get out through the same doorway even though the other three doorways are open. The person is stuck—but she just needs to go to another doorway, instead of trying to get out through the locked doorway. A mental set  is where you persist in approaching a problem in a way that has worked in the past but is clearly not working now. Functional fixedness   is a type of mental set where you cannot perceive an object being used for something other than what it was designed for. During the Apollo 13 mission to the moon, NASA engineers at Mission Control had to overcome functional fixedness to save the lives of the astronauts aboard the spacecraft. An explosion in a module of the spacecraft damaged multiple systems. The astronauts were in danger of being poisoned by rising levels of carbon dioxide because of problems with the carbon dioxide filters. The engineers found a way for the astronauts to use spare plastic bags, tape, and air hoses to create a makeshift air filter, which saved the lives of the astronauts.

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. For example, let’s say you and three friends wanted to rent a house and had a combined target budget of $1,600. The realtor shows you only very run-down houses for $1,600 and then shows you a very nice house for $2,000. Might you ask each person to pay more in rent to get the $2,000 home? Why would the realtor show you the run-down houses and the nice house? The realtor may be challenging your anchoring bias. An anchoring bias  occurs when you focus on one piece of information when making a decision or solving a problem. In this case, you’re so focused on the amount of money you are willing to spend that you may not recognize what kinds of houses are available at that price point.

The confirmation bias is the tendency to focus on information that confirms your existing beliefs. For example, if you think that your professor is not very nice, you notice all of the instances of rude behavior exhibited by the professor while ignoring the countless pleasant interactions he is involved in on a daily basis. This bias proves that first impressions do matter and that we tend to look for information to confirm our initial judgments of others.

Watch this video from the Big Think to learn more about the confirmation bias.

You can view the transcript for “Confirmation Bias: Your Brain is So Judgmental” here (opens in new window) .

Hindsight bias leads you to believe that the event you just experienced was predictable, even though it really wasn’t. In other words, you knew all along that things would turn out the way they did. Representative bias  describes a faulty way of thinking, in which you unintentionally stereotype someone or something; for example, you may assume that your professors spend their free time reading books and engaging in intellectual conversation, because the idea of them spending their time playing volleyball or visiting an amusement park does not fit in with your stereotypes of professors.

Finally, the availability heuristic is a heuristic in which you make a decision based on an example, information, or recent experience that is that readily available to you, even though it may not be the best example to inform your decision . To use a common example, would you guess there are more murders or more suicides in America each year? When asked, most people would guess there are more murders. In truth, there are twice as many suicides as there are murders each year. However, murders seem more common because we hear a lot more about murders on an average day. Unless someone we know or someone famous takes their own life, it does not make the news. Murders, on the other hand, we see in the news every day. This leads to the erroneous assumption that the easier it is to think of instances of something, the more often that thing occurs.

Watch the following video for an example of the availability heuristic.

You can view the transcript for “Availability Heuristic: Are Planes More Dangerous Than Cars?” here (opens in new window) .

Biases tend to “preserve that which is already established—to maintain our preexisting knowledge, beliefs, attitudes, and hypotheses” (Aronson, 1995; Kahneman, 2011). These biases are summarized in Table 2 below.

Learn more about heuristics and common biases through the article, “ 8 Common Thinking Mistakes Our Brains Make Every Day and How to Prevent Them ” by Belle Beth Cooper.

You can also watch this clever music video explaining these and other cognitive biases.

Which type of bias do you recognize in your own decision making processes? How has this bias affected how you’ve made decisions in the past and how can you use your awareness of it to improve your decisions making skills in the future?

The word language written on the chalkboard with a silhouette of children in front of the chalkboard.

  • Understand how the use of language develops
  • Explain the relationship between language and thinking

Language Development

Language is a communication system that involves using words and systematic rules to organize those words to transmit information from one individual to another. While language is a form of communication, not all communication is language. Many species communicate with one another through their postures, movements, odors, or vocalizations. This communication is crucial for species that need to interact and develop social relationships with their conspecifics. However, many people have asserted that it is language that makes humans unique among all of the animal species (Corballis & Suddendorf, 2007; Tomasello & Rakoczy, 2003). This section will focus on what distinguishes language as a special form of communication, how the use of language develops, and how language affects the way we think.

Components of Language

Language , be it spoken, signed, or written, has specific components: a lexicon and grammar. Lexicon refers to the words of a given language. Thus, lexicon is a language’s vocabulary. Grammar  refers to the set of rules that are used to convey meaning through the use of the lexicon (Fernández & Cairns, 2011). For instance, English grammar dictates that most verbs receive an “-ed” at the end to indicate past tense.

Words are formed by combining the various phonemes that make up the language. A phoneme  (e.g., the sounds “ah” vs. “eh”) is a basic sound unit of a given language, and different languages have different sets of phonemes. Phonemes are combined to form morphemes , which are the smallest units of language that convey some type of meaning (e.g., “I” is both a phoneme and a morpheme).  Further, a morpheme is not the same as a word. The main difference is that a morpheme sometimes does not stand alone, but a word, by definition, always stands alone.

We use semantics and syntax to construct language. Semantics and syntax are part of a language’s grammar. Semantics refers to the process by which we derive meaning from morphemes and words. Syntax  refers to the way words are organized into sentences (Chomsky, 1965; Fernández & Cairns, 2011).

We apply the rules of grammar to organize the lexicon in novel and creative ways, which allow us to communicate information about both concrete and abstract concepts. We can talk about our immediate and observable surroundings as well as the surface of unseen planets. We can share our innermost thoughts, our plans for the future, and debate the value of a college education. We can provide detailed instructions for cooking a meal, fixing a car, or building a fire. The flexibility that language provides to relay vastly different types of information is a property that makes language so distinct as a mode of communication among humans.

Given the remarkable complexity of a language, one might expect that mastering a language would be an especially arduous task; indeed, for those of us trying to learn a second language as adults, this might seem to be true. However, young children master language very quickly with relative ease. B. F. Skinner (1957) proposed that language is learned through reinforcement. Noam Chomsky (1965) criticized this behaviorist approach, asserting instead that the mechanisms underlying language acquisition are biologically determined. The use of language develops in the absence of formal instruction and appears to follow a very similar pattern in children from vastly different cultures and backgrounds. It would seem, therefore, that we are born with a biological predisposition to acquire a language (Chomsky, 1965; Fernández & Cairns, 2011). Moreover, it appears that there is a critical period for language acquisition, such that this proficiency at acquiring language is maximal early in life; generally, as people age, the ease with which they acquire and master new languages diminishes (Johnson & Newport, 1989; Lenneberg, 1967; Singleton, 1995).

Children begin to learn about language from a very early age (Table 1). In fact, it appears that this is occurring even before we are born. Newborns show preference for their mother’s voice and appear to be able to discriminate between the language spoken by their mother and other languages. Babies are also attuned to the languages being used around them and show preferences for videos of faces that are moving in synchrony with the audio of spoken language versus videos that do not synchronize with the audio (Blossom & Morgan, 2006; Pickens, 1994; Spelke & Cortelyou, 1981).

Dig Deeper: The Case of Genie

In the fall of 1970, a social worker in the Los Angeles area found a 13-year-old girl who was being raised in extremely neglectful and abusive conditions. The girl, who came to be known as Genie, had lived most of her life tied to a potty chair or confined to a crib in a small room that was kept closed with the curtains drawn. For a little over a decade, Genie had virtually no social interaction and no access to the outside world. As a result of these conditions, Genie was unable to stand up, chew solid food, or speak (Fromkin, Krashen, Curtiss, Rigler, & Rigler, 1974; Rymer, 1993). The police took Genie into protective custody.

Genie’s abilities improved dramatically following her removal from her abusive environment, and early on, it appeared she was acquiring language—much later than would be predicted by critical period hypotheses that had been posited at the time (Fromkin et al., 1974). Genie managed to amass an impressive vocabulary in a relatively short amount of time. However, she never developed a mastery of the grammatical aspects of language (Curtiss, 1981). Perhaps being deprived of the opportunity to learn language during a critical period impeded Genie’s ability to fully acquire and use language.

You may recall that each language has its own set of phonemes that are used to generate morphemes, words, and so on. Babies can discriminate among the sounds that make up a language (for example, they can tell the difference between the “s” in vision and the “ss” in fission); early on, they can differentiate between the sounds of all human languages, even those that do not occur in the languages that are used in their environments. However, by the time that they are about 1 year old, they can only discriminate among those phonemes that are used in the language or languages in their environments (Jensen, 2011; Werker & Lalonde, 1988; Werker & Tees, 1984).

After the first few months of life, babies enter what is known as the babbling stage, during which time they tend to produce single syllables that are repeated over and over. As time passes, more variations appear in the syllables that they produce. During this time, it is unlikely that the babies are trying to communicate; they are just as likely to babble when they are alone as when they are with their caregivers (Fernández & Cairns, 2011). Interestingly, babies who are raised in environments in which sign language is used will also begin to show babbling in the gestures of their hands during this stage (Petitto, Holowka, Sergio, Levy, & Ostry, 2004).

Generally, a child’s first word is uttered sometime between the ages of 1 year to 18 months, and for the next few months, the child will remain in the “one word” stage of language development. During this time, children know a number of words, but they only produce one-word utterances. The child’s early vocabulary is limited to familiar objects or events, often nouns. Although children in this stage only make one-word utterances, these words often carry larger meaning (Fernández & Cairns, 2011). So, for example, a child saying “cookie” could be identifying a cookie or asking for a cookie.

As a child’s lexicon grows, she begins to utter simple sentences and to acquire new vocabulary at a very rapid pace. In addition, children begin to demonstrate a clear understanding of the specific rules that apply to their language(s). Even the mistakes that children sometimes make provide evidence of just how much they understand about those rules. This is sometimes seen in the form of overgeneralization . In this context, overgeneralization refers to an extension of a language rule to an exception to the rule. For example, in English, it is usually the case that an “s” is added to the end of a word to indicate plurality. For example, we speak of one dog versus two dogs. Young children will overgeneralize this rule to cases that are exceptions to the “add an s to the end of the word” rule and say things like “those two gooses” or “three mouses.” Clearly, the rules of the language are understood, even if the exceptions to the rules are still being learned (Moskowitz, 1978).

Language and Thinking

Think about it:  the meaning of language.

Think about what you know of other languages; perhaps you even speak multiple languages. Imagine for a moment that your closest friend fluently speaks more than one language. Do you think that friend thinks differently, depending on which language is being spoken? You may know a few words that are not translatable from their original language into English. For example, the Portuguese word saudade originated during the 15th century, when Portuguese sailors left home to explore the seas and travel to Africa or Asia. Those left behind described the emptiness and fondness they felt as saudade (Figure 20) . The word came to express many meanings, including loss, nostalgia, yearning, warm memories, and hope. There is no single word in English that includes all of those emotions in a single description. Do words such as saudade indicate that different languages produce different patterns of thought in people? What do you think??

Photograph A shows a painting of a person leaning against a ledge, slumped sideways over a box. Photograph B shows a painting of a person reading by a window.

Language may indeed influence the way that we think, an idea known as linguistic determinism. One recent demonstration of this phenomenon involved differences in the way that English and Mandarin Chinese speakers talk and think about time. English speakers tend to talk about time using terms that describe changes along a horizontal dimension, for example, saying something like “I’m running behind schedule” or “Don’t get ahead of yourself.” While Mandarin Chinese speakers also describe time in horizontal terms, it is not uncommon to also use terms associated with a vertical arrangement. For example, the past might be described as being “up” and the future as being “down.” It turns out that these differences in language translate into differences in performance on cognitive tests designed to measure how quickly an individual can recognize temporal relationships. Specifically, when given a series of tasks with vertical priming, Mandarin Chinese speakers were faster at recognizing temporal relationships between months. Indeed, Boroditsky (2001) sees these results as suggesting that “habits in language encourage habits in thought” (p. 12).

Language does not completely determine our thoughts—our thoughts are far too flexible for that—but habitual uses of language can influence our habit of thought and action. For instance, some linguistic practice seems to be associated even with cultural values and social institution. Pronoun drop is the case in point. Pronouns such as “I” and “you” are used to represent the speaker and listener of a speech in English. In an English sentence, these pronouns cannot be dropped if they are used as the subject of a sentence. So, for instance, “I went to the movie last night” is fine, but “Went to the movie last night” is not in standard English. However, in other languages such as Japanese, pronouns can be, and in fact often are, dropped from sentences. It turned out that people living in those countries where pronoun drop languages are spoken tend to have more collectivistic values (e.g., employees having greater loyalty toward their employers) than those who use non–pronoun drop languages such as English (Kashima & Kashima, 1998). It was argued that the explicit reference to “you” and “I” may remind speakers the distinction between the self and other, and the differentiation between individuals. Such a linguistic practice may act as a constant reminder of the cultural value, which, in turn, may encourage people to perform the linguistic practice.

One group of researchers who wanted to investigate how language influences thought compared how English speakers and the Dani people of Papua New Guinea think and speak about color. The Dani have two words for color: one word for light and one word for dark . In contrast, the English language has 11 color words. Researchers hypothesized that the number of color terms could limit the ways that the Dani people conceptualized color. However, the Dani were able to distinguish colors with the same ability as English speakers, despite having fewer words at their disposal (Berlin & Kay, 1969). A recent review of research aimed at determining how language might affect something like color perception suggests that language can influence perceptual phenomena, especially in the left hemisphere of the brain. You may recall from earlier chapters that the left hemisphere is associated with language for most people. However, the right (less linguistic hemisphere) of the brain is less affected by linguistic influences on perception (Regier & Kay, 2009)

Learn more about language, language acquisition, and especially the connection between language and thought in the following CrashCourse video:

You can view the transcript for “Language: Crash Course Psychology #16” here (opens in new window) .

In this chapter, you learned to

  • describe attention
  • describe cognition and problem-solving strategies
  • describe language acquisition and the role language plays in communication and thought

You learned about non-memory cognitive processes in this chapter. Because each of you reading this is using language in some shape or form, we will end with a quick summary and a video on this topic. Language is a communication system that has both a lexicon and a system of grammar. Language acquisition occurs naturally and effortlessly during the early stages of life, and this acquisition occurs in a predictable sequence for individuals around the world. Language has a strong influence on thought, and the concept of how language may influence cognition remains an area of study and debate in psychology.

In this TED talk, Lera Boroditsky summarizes unique ways that language and culture intersect with some basic cognitive processes. How was your language shaped your thinking?

Abler, W. (2013). Sapir, Harris, and Chomsky in the twentieth century. Cognitive Critique, 7, 29–48.

Aronson, E. (Ed.). (1995). Social cognition. In The social animal (p. 151). New York: W.H. Freeman and Company.

Bartlett, F. C. (1932). Remembering: A study in experimental and social psychology. Cambridge, England: Cambridge University Press.

Bayer, J. B., & Campbell, S. W. (2012). Texting while driving on automatic: Considering the frequency-independent side of habit. Computers in Human Behavior, 28, 2083–2090.

Beilock, S. L., & Carr, T. H. (2001). On the fragility of skilled performance: What governs choking under pressure?  Journal of Experimental Psychology: General, 130 , 701–725.

Berlin, B., & Kay, P. (1969). Basic color terms: Their universality and evolution. Berkley: University of California Press.

Blossom, M., & Morgan, J. L. (2006). Does the face say what the mouth says? A study of infants’ sensitivity to visual prosody. In the 30th annual Boston University Conference on Language Development, Somerville, MA.

Boroditsky, L. (2001). Does language shape thought? Mandarin and English speakers’ conceptions of time. Cognitive Psychology, 43, 1–22.

Boroditsky, L. (2011, February). How language shapes thought. Scientific American, 63–65.Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press

Broadbent, D. A. (1958).  Perception and communication . London, England: Pergamon Press.

Cairns Regional Council. (n.d.). Cultural greetings. Retrieved from http://www.cairns.qld.gov.au/__data/assets/pdf_file/0007/8953/CulturalGreetingExercise.pdf

Callero, P. L. (1994). From role-playing to role-using: Understanding role as resource. Social Psychology Quarterly, 57, 228–243.

Cherry, E. C. (1953). Experiments on the recognition of speech with one and two ears.  Journal of the Acoustical Society of America, 25 , 975–979.

Chomsky, N.(1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press

Corballis, M. C., & Suddendorf, T. (2007). Memory, time, and language. In C. Pasternak (Ed.), What makes us human (pp. 17–36). Oxford, UK: Oneworld Publications.

Curtiss, S. (1981). Dissociations between language and cognition: Cases and implications. Journal of Autism and Developmental Disorders, 11(1), 15–30.

Cyclopedia of Puzzles. (n.d.) Retrieved from http://www.mathpuzzle.com/loyd/

Deutsch, J. A., & Deutsch, D. (1963). Attention: some theoretical considerations.  Psychological Review, 70 , 80–90.

Fernández, E. M., & Cairns, H. S. (2011). Fundamentals of psycholinguistics. West Sussex, UK: Wiley-Blackwell.

Fromkin, V., Krashen, S., Curtiss, S., Rigler, D., & Rigler, M. (1974). The development of language in Genie: A case of language acquisition beyond the critical period. Brain and Language, 1, 81–107.

German, T. P., & Barrett, H. C. (2005). Functional fixedness in a technologically sparse culture. Psychological Science, 16, 1–5.

Goldstone, R. L., & Kersten, A. (2003). Concepts and categorization. In A. F. Healy, R. W. Proctor, & I.B. Weiner (Eds.), Handbook of psychology (Volume IV, pp. 599–622). Hoboken, New Jersey: John Wiley & Sons, Inc.

Hirst, W. C., Neisser, U., & Spelke, E. S. (1978). Divided attention.  Human Nature, 1 , 54–61.

James, W. (1983).  The principles of psychology . Cambridge, MA: Harvard University Press. (Original work published 1890)

Jensen, J. (2011). Phoneme acquisition: Infants and second language learners. The Language Teacher, 35(6), 24–28.

Johnson, J. S., & Newport, E. L. (1989). Critical period effects in second language learning: The influence of maturational state on the acquisition of English as a second language. Cognitive Psychology, 21, 60–99.

Johnston, W. A., & Heinz, S. P. (1978). Flexibility and capacity demands of attention.  Journal of Experimental Psychology: General, 107 , 420–435.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus, and Giroux.

Lenneberg, E. (1967). Biological foundations of language. New York: Wiley.

Monsell, S. (2003). Task switching.  Trends in Cognitive Science, 7 (3), 134–140.

Moray, N. (1959). Attention in dichotic listening: Affective cues and the influence of instructions.  Quarterly Journal of Experimental Psychology, 11 , 56–60.

Moskowitz, B. A. (1978). The acquisition of language. Scientific American, 239, 92–108. Petitto, L. A., Holowka, S., Sergio, L. E., Levy, B., & Ostry, D. J. (2004). Baby hands that move to the rhythm of language: Hearing babies acquiring sign languages babble silently on the hands. Cognition, 93, 43–73.

Neyfakh, L. (2013, October 7). “Why you can’t stop checking your phone.” Retrieved from http://www.bostonglobe.com/ideas/2013/10/06/why-you-can-stop-checking-your-phone/rrBJzyBGDAr1YlEH5JQDcM/story.html

Petitto, L. A., Holowka, S., Sergio, L. E., Levy, B., & Ostry, D. J. (2004). Baby hands that move to the rhythm of language: Hearing babies acquiring sign languages babble silently on the hands. Cognition, 93, 43–73.

Pickens, J. (1994). Full-term and preterm infants’ perception of face-voice synchrony. Infant Behavior and Development, 17, 447–455.

Pratkanis, A. (1989). The cognitive representation of attitudes. In A. R. Pratkanis, S. J. Breckler, & A. G. Greenwald (Eds.), Attitude structure and function (pp. 71–98). Hillsdale, NJ: Erlbaum.

Regier, T., & Kay, P. (2009). Language, thought, and color: Whorf was half right. Trends in Cognitive Sciences, 13(10), 439–446.

Rymer, R. (1993). Genie: A Scientific Tragedy. New York: Harper Collins.

Sapir, E. (1964). Culture, language, and personality. Berkley: University of California Press. (Original work published 1941)

Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events.  Perception, 28 , 1059–1074.

Skinner, B. F. (1957). Verbal behavior. Acton, MA: Copley Publishing Group.

Spelke, E. S., & Cortelyou, A. (1981). Perceptual aspects of social knowing: Looking and listening in infancy. In M.E. Lamb & L.R. Sherrod (Eds.), Infant social cognition: Empirical and theoretical considerations (pp. 61–83). Hillsdale, NJ: Erlbaum.

Spelke, E. S., Hirst, W. C., & Neisser, U. (1976). Skills of divided attention.  Cognition, 4 , 215–250.

Strayer, D. L., & Drews, F. A. (2007). Cell-phone induced inattention blindness.  Current Directions in Psychological Science, 16 , 128–131.

Strayer, D. L., & Johnston, W. A. (2001). Driven to distraction: Dual-task studies of simulated driving and conversing on a cellular telephone.  Psychological Science, 12 , 462–466.

Strayer, D. L., Watson, J. M., & Drews, F. A. (2011) Cognitive distraction while multitasking in the automobile. In Brian Ross (Ed.),  The Psychology of Learning and Motivation  (Vol. 54, pp. 29–58). Burlington, VT: Academic Press.

Tomasello, M., & Rakoczy, H. (2003). What makes human cognition unique? From individual to shared to collective intentionality. Mind & Language, 18(2), 121–147.

Treisman, A. (1960). Contextual cues in selective listening.  Quarterly Journal of Experimental Psychology, 12 , 242–248.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

van Troyer, G. (1994). Linguistic determinism and mutability: The Sapir-Whorf “hypothesis” and intercultural communication. JALT Journal, 2, 163–178.

Watson, J. M., & Strayer, D. L. (2010). Supertaskers: Profiles in extraordinary multitasking ability.  Psychonomic Bulletin & Review, 17 , 479–485.

Werker, J. F., & Lalonde, C. E. (1988). Cross-language speech perception: Initial capabilities and developmental change. Developmental Psychology, 24, 672–683.

Werker, J. F., & Tees, R. C. (1984). Cross-language speech perception: Evidence for perceptual reorganization during the first year of life. Infant Behavior and Development, 7, 49–63.

Whorf, B. L. (1956). Language, thought and relativity. Cambridge, MA: MIT Press.

CC original content.

Attention, Thinking and Language.  Authored by:  Karenna Malavanti Provided by: PressBooks. License: CC BY-SA: Attribution-ShareAlike

CC licensed content, Shared previously

  • Why It Matters: Thinking and Intelligence.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/introduction-10/
  • Attention. Authored by: Frances Friedrich. Located at NOBA Psychology. License: CC-BY-NC-SA. Retrieved from: Retrieved from  http://noba.to/uv9x8df5
  • Introduction to Thinking and Intelligence. Authored by : OpenStax College. License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction   Located at : https://openstax.org/books/psychology-2e/pages/7-introduction .
  • What Is Cognition?. Authored by : OpenStax College. License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction   Located at : https://openstax.org/books/psychology-2e/pages/7-1-what-is-cognition .
  • A Thinking Man Image. Authored by : Wesley Nitsckie. License : CC BY-SA: Attribution-ShareAlike   Located at : https://www.flickr.com/photos/nitsckie/5507777269 .
  • What Is Cognition?.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/what-is-cognition/
  • Categories and Concepts. Authored by : Gregory Murphy. Provided by : New York University. Project : The Noba Project. License : CC BY-NC-SA: Attribution-NonCommercial-ShareAlike   Located at : http://nobaproject.com/textbooks/wendy-king-introduction-to-psychology-the-full-noba-collection/modules/categories-and-concepts .
  • Solving Problems.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at : https://pressbooks.online.ucf.edu/lumenpsychology/chapter/problem-solving/
  • Problem-Solving. Authored by : OpenStax College. License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction . Located at : https://openstax.org/books/psychology-2e/pages/7-3-problem-solving .
  • Pitfalls to Problem Solving.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/reading-pitfalls-to-problem/
  • Introduction to Language.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/outcome-language/
  • Language. Authored by : OpenStax College.  License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction   Located at : https://openstax.org/books/psychology-2e/pages/7-2-language .
  • Language. Authored by : geralt. Provided by : Pixabay. License : CC0: No Rights Reserved   Located at : https://pixabay.com/en/school-board-languages-blackboard-1063556/ .
  • Language and Language Use.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/language-and-language-use/
  • Language and Language Use. Authored by : Yoshihisa Kashima. Project : The Noba Project. License : CC BY-NC-SA: Attribution-NonCommercial-ShareAlike   Located at : http://nobaproject.com/textbooks/introduction-to-psychology-the-full-noba-collection/modules/language-and-language-use .
  • Language Development.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/language/
  • Morpheme. Provided by : Wikipedia. License : CC BY-SA: Attribution-ShareAlike   Located at : https://en.wikipedia.org/wiki/Morpheme .
  • Language and Thinking.  Authored by : Lumen Learning License :  CC BY: Attribution   Located at :  https://pressbooks.online.ucf.edu/lumenpsychology/chapter/reading-language-and-thought/
  • Summary. Authored by : OpenStax College. License : CC BY: Attribution . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction . Located at : https://openstax.org/books/psychology-2e/pages/7-summary .

All rights reserved content

  • Cognition: How Your Mind Can Amaze and Betray You – Crash Course Psychology #15. Provided by : CrashCourse. License : All Rights Reserved . License Terms : Standard YouTube License   Located at : https://www.youtube.com/watch?v=R-sVnmmw6WY&feature=youtu.be&list=PL8dPuuaLjXtOPRKzVLY0jJY-uHOH9KVU6 .
  • Can you solve Einsteinu2019s Riddle? . Authored by : Dan Van der Vieren. Provided by : Ted-Ed. License : Other . License Terms : Standard YouTube License .  Located at : https://www.youtube.com/watch?v=1rDVz_Fb6HQ&index=3&list=PLUmyCeox8XCwB8FrEfDQtQZmCc2qYMS5a .
  • Language: Crash Course Psychology #16. Authored by : CrashCourse. License : Other . License Terms : Standard YouTube License .  Located at : https://www.youtube.com/watch?v=s9shPouRWCs&feature=youtu.be&list=PL8dPuuaLjXtOPRKzVLY0jJY-uHOH9KVU6 .
  • How language shapes the way we think Authored by: Lera Boroditsky.  Provided by :  TED.  License : Other . License Terms : Standard YouTube License .  Located at :  https://youtu.be/RKK7wGAYP6k

thinking, including perception, learning, problem solving, judgment, and memory

field of psychology dedicated to studying every aspect of how people think

a set of objects that can be treated as equivalent in some way

category or grouping of linguistic information, objects, ideas, or life experiences

best representation of a concept

mental groupings that are created “naturally” through your experiences

concept that is defined by a very specific set of characteristics

(plural = schemata) mental construct consisting of a cluster or collection of related concepts

set of expectations that define the behaviors of a person occupying a particular role

set of behaviors that are performed the same way each time; also referred to as a cognitive script

set of behaviors that are performed the same way each time; also referred to as an event schema

method for solving problems

problem-solving strategy in which multiple solutions are attempted until the correct one is found

problem-solving strategy characterized by a specific set of instructions

mental shortcut that saves time when solving a problem

heuristic in which you begin to solve a problem by focusing on the end result

continually using an old solution to a problem without results

inability to see an object as useful for any other use other than the one for which it was intended

faulty heuristic in which you fixate on a single aspect of a problem to find a solution

belief that the event just experienced was predictable, even though it really wasn’t

subset of the population that accurately represents the general population

faulty heuristic in which you make a decision based on information readily available to you

communication system that involves using words to transmit information from one individual to another

Words and expressions.

set of rules that are used to convey meaning through the use of a lexicon

basic sound unit of a given language

smallest unit of language that conveys some type of meaning

process by which we derive meaning from morphemes and words

manner by which words are organized into sentences

extension of a rule that exists in a given language to an exception to the rule

Psychological Science: Understanding Human Behavior Copyright © by Karenna Malavanti is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

Share This Book

7.1 What is Cognition?

Learning objectives.

By the end of this section, you will be able to:

  • Describe cognition
  • Distinguish concepts and prototypes
  • Explain the difference between natural and artificial concepts

   Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. The infinite amount of sub-routines we organize every day to make up larger behaviors such as driving, operating machinery, participating in sports or even holding conversations (all relatively new behaviors in terms of the evolution of a species) go unnoticed but together allow us to navigate our environment safely and efficiently. There are facets to the multitude of complex processes involved in human cognition and what we understand about animal thought processes. Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem-solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

COGNITIVE PSYCHOLOGY – A BRIEF HISTORY

Although discussions and descriptions of thought processes date back millennia to societies such as the ancient Greeks, Egyptians and Maya, the formal scientific study of cognition is relatively new, growing out of philosophical debates including Rene Descartes’ 16th century argument suggesting humans are born with innate knowledge and the that the mind and body reflect two different entities.  This theory was known as substance dualism. From Descartes’ theories in the 16th century, major debates formed on whether human thought is created solely through the stimulation of our sense organs (empiricism) or whether we are born with innate knowledge which allows us to form language and maintain conscious experience (nativism). Supporters of empiricist views included philosophers such as George Berkeley, an Irish bishop who denied the existence of material substance, suggesting instead that the objects we interact with are only ideas in the minds of the perceivers, and John Locke, an English philosopher who founded the study of theory of mind which lead to modern conceptions of identity and the self.  Supporters of nativism included Immanuel Kant, a German philosopher who argued that the human mind creates the structure of human experience and that the world (as it is) is independent of humanity’s perception of it. These arguments in philosophy would later lead to important advancements in the 19th century by Paul Broca and Carl Wernicke. Paul Broca, a French physician, anatomist, and anthropologist treated a patient now known as “Tan”, who with the exception of some curse words, could only create the utterance “tan” when he tried to speak. After the patient died, Dr. Broca inspected his brain and discovered that a specific area of the lateral frontal cortex (now known as “ Broca’s area “) was damaged. He concluded that Broca’s areas was an important processing center for language production. Shortly after Broca’s publication documenting language deficits related to damage in the lateral frontal cortex, the German physician, psychiatrist, and anatomist, Carl Wernicke, noticed that not all language deficits were related to damage to Broca’s area. Wernicke found that damage to the left posterior and superior temporal gyrus resulted in deficits in language comprehension as opposed to language production . This area of the brain is what we now refer to as Wernicke’s area and these two findings together provided important evidence for theories related to functional localization within the brain, a theory separate from previous ideas related to the study of phrenology.

Around the turn of the 20th century, experimental research conducted in the experimental labs of Wilhelm Wundt and Ernst Weber in German, and Charles Bell in Britain, lead to the experimental study of behavior.  Edward Thorndike’s Law of Effect (1898) described how behavior can be shaped by conditions and patterns of reinforcement. Theories in behaviorism were popular until the 1920s, when Jean Piaget began studying thoughts, language, and intelligence as well as how these capabilities change over the course of human development and aging. While WWII was taking the lives of millions of humans across the globe, psychology searched for new and innovative ways of studying human performance in order to address questions such as how to best train soldiers to use new technology, and how attention might be affected by stress. This research eventually lead to Claude Shannon’s developments in information theory in 1948, which described the quantification, storage, and communication of information. Developments in computer science soon led to parallels between human thought processes and computer information processing.  Newell and Simon’s development of artificial intelligence (AI) described both advanced capabilities in computing and descriptive models of cognitive processes. In responses to behaviorists’ criticisms of analyzing and modeling thought processes, Noam Chompsky argued against B.F. Skinner’s views that language is learned through reinforcement, suggesting that Skinner ignored the human creativity found in linguistics. Within the same decade George Miller published research describing humans’ ability to maintain information while performing secondary tasks (Miller, 1956) and founded the Harvard Center for Cognitive Studies. Soon after, the first Cognitive Psychology textbook was published in 1967 by Ulrich Neisser (1967), a former student of George Miller.  Neisser was influenced by Gestalt psychologists, Wolfgang Kohler and Hans Wallach, as well as MIT computer scientist Oliver Selfridge. Neisser’s definition of the new term “cognition” illustrates the then progressive concept of cognitive processes as:

“all processes by which the sensory input is transformed, reduced, elaborated, stored, recovered, and used. It is concerned with these processes even when they operate in the absence of relevant stimulation, as in images and hallucinations. . . Given such a sweeping definition, it is apparent that cognition is involved in everything a human being might possibly do; that every psychological phenomenon is a cognitive phenomenon. But although cognitive psychology is concerned with all human activity rather than some fraction of it, the concern is from a particular point of view. Other viewpoints are equally legitimate and necessary. Dynamic psychology, which begins with motives rather than with sensory input, is a case in point. Instead of asking how a man’s actions and experiences result from what he saw, remembered, or believed, the dynamic psychologist asks how they follow from the subject’s goals, needs, or instincts.” (page 4 of Neisser’s 1967 publication of  Cognitive Psychology)

   Upon waking each morning, you begin thinking—contemplating the tasks that you must complete that day. In what order should you run your errands? Should you go to the bank, the cleaners, or the grocery store first? Can you get these things done before you head to class or will they need to wait until school is done? These thoughts are one example of cognition at work. Exceptionally complex, cognition is an essential feature of human consciousness, yet not all aspects of cognition are consciously experienced. For example, many decisions we make about choosing to do something or refraining from doing something involve cognitive processes related to weighing options and making comparisons to other events in memory. However, cognition has been argued to not be involved in all the actions we make, such as reflexes that recoil your hand after touching an extremely hot surface, which operate on automatic feedback loops between the effector and spinal cord. Cognition is described in the Oxford dictionary as the mental actions or processes involved in acquiring, maintaining and understanding knowledge through thought, experience and the senses (definition of Cognition from the English Oxford Dictionary, 2018), and is described by Licht, Hull and Ballantyne (2014) as the mental activity associated with obtaining, converting and using knowledge .  It is important to recognize that although the term Cognition is an umbrella term that encompasses many different mental processes, similarities exist between how groups define cognition by defining it as a variety of mental processes that allow us to maintain, understand and use information to create knowledge and reflect upon it. Within the pieces that make up cognition, a main component is what is commonly referred to as thinking, which Matlin (2009) has defined as coming  to a decision, reaching a solution, forming a belief, or developing an attitude. Again, we see that even a subcomponent of cognition, such as thinking, still represents somewhat of an umbrella term which can be broken up into groups of processes and procedures that make up our thinking. Definitions are not universally accepted, and some groups within psychology consider cognition and thinking as the same group of processes. However, we will use the definitions provided above for the sake of simplicity.

Cognitive psychology is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes. Cognitive psychologists strive to determine and measure different types of intelligence, why some people are better at problem solving than others, and how emotional intelligence affects success in the workplace, among countless other topics. They also sometimes focus on how we organize thoughts and information gathered from our environments into meaningful categories of thought, which will be discussed later. Basically, cognitive scientists work to define the smallest components of what make up broader topics in cognition in order to continue improving working definitions of how we conceptualize human cognition. Many techniques have been discovered that allow psychologists to selectively evaluate and compare different components of cognition. Modern advancements in technology have allowed psychologist to use these methods to collect various forms of cognitive data such as basic measurements of reaction times and response accuracies to more advance techniques of physiological responses, such as eye tracking, electromyography (EMG), electroencephalography (EEG), functional magnetic resonance imaging (fMRI), magnet encephalography (MEG), and positron emission tomography. Cognitive scientists work to create experimental designs using these methods, generate new findings, publish their work and add to the world-wide discussion of how various cognitive processes work and what makes our life experience similar or different from other species.

CONCEPTS AND PROTOTYPES

   The human nervous system is capable of handling endless streams of information, as emphasized in the sensation and perception chapter. The senses serve as the interface between the mind and the external environment, receiving stimuli and translating it into nervous impulses that are transmitted to the brain. The brain then processes this information and uses the relevant pieces, which are held in working memory, later expressed through language, or stored in memory for future use. To make this process more complex, the brain does not gather information from external environments only. When thoughts are formed, the brain pulls information from emotions and memories (figure below). Emotion and memory are powerful influences on both our thoughts and behaviors.

Sensations and information are received by our brains, filtered through emotions and memories, and processed to become thoughts.

   In order to organize this staggering amount of information, the brain has developed a file cabinet of sorts in the mind. The different files stored in the file cabinet are called concepts . Concepts are categories or groupings of linguistic information, images, ideas, or memories, such as life experiences. Concepts are, in many ways, big ideas that are generated by observing details, and categorizing and combining these details into cognitive structures. You use concepts to see the relationships among the different elements of your experiences and to keep the information in your mind organized and accessible.

Concepts are informed by our semantic memory (you will learn more about semantic memory in a later chapter) and are present in every aspect of our lives; however, one of the easiest places to notice concepts is inside a classroom, where they are discussed explicitly. When you study United States history, for example, you learn about more than just individual events that have happened in America’s past. You absorb a large quantity of information by listening to and participating in discussions, examining maps, and reading first-hand accounts of people’s lives. Your brain analyzes these details and develops an overall understanding of American history. In the process, your brain gathers details that inform and refine your understanding of related concepts like democracy, power, and freedom.

Concepts can be complex and abstract, like justice, or more concrete, like types of birds. In psychology, for example, Piaget’s stages of development are abstract concepts. Some concepts, like tolerance, are agreed upon by many people, because they have been used in various ways over many years. Other concepts, like the characteristics of your ideal friend or your family’s birthday traditions, are personal and individualized. In this way, concepts touch every aspect of our lives, from our many daily routines to the guiding principles behind the way governments function.

HIERARCHIES OF CONCEPTS

Concepts can be understood by considering how they can be organized into  hierarchies . At the top are superordinate concepts. This is the broadest category which encompasses all the objects belonging to a concept. The subordinate concept of “furniture” covers everything from couches to nightstands. If we were to narrow our focus to include only couches, we are considering the midlevel or basic level of the hierarchy. This is still a fairly broad category, but not quite as broad as the superordinate concept of furniture. The midlevel category is what we use most often in everyday life to identify objects. Sub-ordinate concepts are even narrower, referring to specific types. To continue with our example, this would include loveseat, a La-Z-Boy, or sectional.

Another technique used by your brain to organize information is the identification of prototypes for the concepts you have developed. A prototype is the best example or representation of a concept. For example, for the category of civil disobedience, your prototype could be Rosa Parks. Her peaceful resistance to segregation on a city bus in Montgomery, Alabama, is a recognizable example of civil disobedience. Or your prototype could be Mohandas Gandhi, sometimes called Mahatma Gandhi (“Mahatma” is an honorific title).

In 1930, Mohandas Gandhi led a group in peaceful protest against a British tax on salt in India.

Mohandas Gandhi served as a nonviolent force for independence for India while simultaneously demanding that Buddhist, Hindu, Muslim, and Christian leaders—both Indian and British—collaborate peacefully. Although he was not always successful in preventing violence around him, his life provides a steadfast example of the civil disobedience prototype (Constitutional Rights Foundation, 2013). Just as concepts can be abstract or concrete, we can make a distinction between concepts that are functions of our direct experience with the world and those that are more artificial in nature.

NATURAL AND ARTIFICIAL CONCEPTS

    In psychology, concepts can be divided into two categories, natural and artificial. Natural concepts are created “naturally” through your experiences and can be developed from either direct or indirect experiences. For example, if you live in Essex Junction, Vermont, you have probably had a lot of direct experience with snow. You’ve watched it fall from the sky, you’ve seen lightly falling snow that barely covers the windshield of your car, and you’ve shoveled out 18 inches of fluffy white snow as you’ve thought, “This is perfect for skiing.” You’ve thrown snowballs at your best friend and gone sledding down the steepest hill in town. In short, you know snow. You know what it looks like, smells like, tastes like, and feels like. If, however, you’ve lived your whole life on the island of Saint Vincent in the Caribbean, you may never have actually seen snow, much less tasted, smelled, or touched it. You know snow from the indirect experience of seeing pictures of falling snow—or from watching films that feature snow as part of the setting. Either way, snow is a natural concept because you can construct an understanding of it through direct observations or experiences of snow.

(a) Our concept of snow is an example of a natural concept—one that we understand through direct observation and experience. (b) In contrast, artificial concepts are ones that we know by a specific set of characteristics that they always exhibit, such as what defines different basic shapes. (credit a: modification of work by Maarten Takens; credit b: modification of work by “Shayan (USA)”/Flickr)

   An artificial concept, on the other hand, is a concept that is defined by a specific set of characteristics. Various properties of geometric shapes, like squares and triangles, serve as useful examples of artificial concepts. A triangle always has three angles and three sides. A square always has four equal sides and four right angles. Mathematical formulas, like the equation for area (length × width) are artificial concepts defined by specific sets of characteristics that are always the same. Artificial concepts can enhance the understanding of a topic by building on one another. For example, before learning the concept of “area of a square” (and the formula to find it), you must understand what a square is. Once the concept of “area of a square” is understood, an understanding of area for other geometric shapes can be built upon the original understanding of area. The use of artificial concepts to define an idea is crucial to communicating with others and engaging in complex thought. According to Goldstone and Kersten (2003), concepts act as building blocks and can be connected in countless combinations to create complex thoughts.

A schema is a mental construct consisting of a cluster or collection of related concepts (Bartlett, 1932). There are many different types of schemata, and they all have one thing in common: schemata are a method of organizing information that allows the brain to work more efficiently. When a schema is activated, the brain makes immediate assumptions about the person or object being observed.

There are several types of schemata. A role schema makes assumptions about how individuals in certain roles will behave (Callero, 1994). For example, imagine you meet someone who introduces himself as a firefighter. When this happens, your brain automatically activates the “firefighter schema” and begins making assumptions that this person is brave, selfless, and community-oriented. Despite not knowing this person, already you have unknowingly made judgments about him. Schemata also help you fill in gaps in the information you receive from the world around you. While schemata allow for more efficient information processing, there can be problems with schemata, regardless of whether they are accurate: Perhaps this particular firefighter is not brave, he just works as a firefighter to pay the bills while studying to become a children’s librarian.

An event schema, also known as a cognitive script, is a set of behaviors that can feel like a routine. Think about what you do when you walk into an elevator. First, the doors open and you wait to let exiting passengers leave the elevator car. Then, you step into the elevator and turn around to face the doors, looking for the correct button to push. You never face the back of the elevator, do you? And when you’re riding in a crowded elevator and you can’t face the front, it feels uncomfortable, doesn’t it? Interestingly, event schemata can vary widely among different cultures and countries. For example, while it is quite common for people to greet one another with a handshake in the United States, in Tibet, you greet someone by sticking your tongue out at them, and in Belize, you bump fists (Cairns Regional Council, n.d.)

What event schema do you perform when riding in an elevator? (credit: “Gideon”/Flickr)

   Because event schemata are automatic, they can be difficult to change. Imagine that you are driving home from work or school. This event schema involves getting in the car, shutting the door, and buckling your seatbelt before putting the key in the ignition. You might perform this script two or three times each day. As you drive home, you hear your phone’s ring tone. Typically, the event schema that occurs when you hear your phone ringing involves locating the phone and answering it or responding to your latest text message. So without thinking, you reach for your phone, which could be in your pocket, in your bag, or on the passenger seat of the car. This powerful event schema is informed by your pattern of behavior and the pleasurable stimulation that a phone call or text message gives your brain. Because it is a schema, it is extremely challenging for us to stop reaching for the phone, even though we know that we endanger our own lives and the lives of others while we do it (Neyfakh, 2013).

Texting while driving is dangerous, but it is a difficult event schema for some people to resist.

   Remember the elevator? It feels almost impossible to walk in and not face the door. Our powerful event schema dictates our behavior in the elevator, and it is no different with our phones. Current research suggests that it is the habit, or event schema, of checking our phones in many different situations that makes refraining from checking them while driving especially difficult (Bayer & Campbell, 2012). Because texting and driving has become a dangerous epidemic in recent years, psychologists are looking at ways to help people interrupt the “phone schema” while driving. Event schemata like these are the reason why many habits are difficult to break once they have been acquired. As we continue to examine thinking, keep in mind how powerful the forces of concepts and schemata are to our understanding of the world.

   In this section, you were introduced to cognitive psychology, which is the study of cognition, or the brain’s ability to think, perceive, plan, analyze, and remember. Concepts and their corresponding prototypes help us quickly organize our thinking by creating categories into which we can sort new information. We also develop schemata, which are clusters of related concepts. Some schemata involve routines of thought and behavior, and these help us function properly in various situations without having to “think twice” about them. Schemata show up in social situations and routines of daily behavior.

References:

Openstax Psychology text by Kathryn Dumper, William Jenkins, Arlene Lacombe, Marilyn Lovett and Marion Perlmutter licensed under CC BY v4.0. https://openstax.org/details/books/psychology

Review Questions:

1. Cognitive psychology is the branch of psychology that focuses on the study of ________.

a. human development

b. human thinking

c. human behavior

d. human society

2. Which of the following is an example of a prototype for the concept of leadership on an athletic team?

a. the equipment manager

b. the scorekeeper

c. the team captain

d. the quietest member of the team

3. Which of the following is an example of an artificial concept?

b. a triangle’s area

c. gemstones

d. teachers

4. An event schema is also known as a cognitive ________.

a. stereotype

d. prototype

Critical Thinking Questions:

1. Describe an event schema that you would notice at a sporting event.

2. Explain why event schemata have so much power over human behavior.

Personal Application Question:

1. Describe a natural concept that you know fully but that would be difficult for someone else to understand and explain why it would be difficult.

artificial concept

cognitive psychology

cognitive script

event schema

natural concept

role schema

Answers to Exercises

1. Answers will vary. When attending a basketball game, it is typical to support your team by wearing the team colors and sitting behind their bench.

2. Event schemata are rooted in the social fabric of our communities. We expect people to behave in certain ways in certain types of situations, and we hold ourselves to the same social standards. It is uncomfortable to go against an event schema—it feels almost like we are breaking the rules.

artificial concept:  concept that is defined by a very specific set of characteristics

cognition:  thinking, including perception, learning, problem solving, judgment, and memory

cognitive psychology:  field of psychology dedicated to studying every aspect of how people think

concept:  category or grouping of linguistic information, objects, ideas, or life experiences

cognitive script:  set of behaviors that are performed the same way each time; also referred to as an event schema

event schema:  set of behaviors that are performed the same way each time; also referred to as a cognitive script

natural concept:  mental groupings that are created “naturally” through your experiences

prototype:  best representation of a concept

role schema:  set of expectations that define the behaviors of a person occupying a particular role

schema:  (plural = schemata) mental construct consisting of a cluster or collection of related concepts

Creative Commons License

Share This Book

  • Increase Font Size

Logo for Maricopa Open Digital Press

6 Thinking and Intelligence

Three side by side images are shown. On the left is a person lying in the grass with a book, looking off into the distance. In the middle is a sculpture of a person sitting on rock, with chin rested on hand, and the elbow of that hand rested on knee. The third is a drawing of a person sitting cross-legged with his head resting on his hand, elbow on knee.

What is the best way to solve a problem? How does a person who has never seen or touched snow in real life develop an understanding of the concept of snow? How do young children acquire the ability to learn language with no formal instruction? Psychologists who study thinking explore questions like these and are called cognitive psychologists.

Cognitive psychologists also study intelligence. What is intelligence, and how does it vary from person to person? Are “street smarts” a kind of intelligence, and if so, how do they relate to other types of intelligence? What does an IQ test really measure? These questions and more will be explored in this chapter as you study thinking and intelligence.

In other chapters, we discussed the cognitive processes of perception, learning, and memory. In this chapter, we will focus on high-level cognitive processes. As a part of this discussion, we will consider thinking and briefly explore the development and use of language. We will also discuss problem solving and creativity before ending with a discussion of how intelligence is measured and how our biology and environments interact to affect intelligence. After finishing this chapter, you will have a greater appreciation of the higher-level cognitive processes that contribute to our distinctiveness as a species.

Learning Objectives

By the end of this section, you will be able to:

  • Describe cognition
  • Distinguish concepts and prototypes
  • Explain the difference between natural and artificial concepts
  • Describe how schemata are organized and constructed

Imagine all of your thoughts as if they were physical entities, swirling rapidly inside your mind. How is it possible that the brain is able to move from one thought to the next in an organized, orderly fashion? The brain is endlessly perceiving, processing, planning, organizing, and remembering—it is always active. Yet, you don’t notice most of your brain’s activity as you move throughout your daily routine. This is only one facet of the complex processes involved in cognition. Simply put,  cognition  is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory. Scientists who study cognition are searching for ways to understand how we integrate, organize, and utilize our conscious cognitive experiences without being aware of all of the unconscious work that our brains are doing (for example, Kahneman, 2011).

Upon waking each morning, you begin thinking—contemplating the tasks that you must complete that day. In what order should you run your errands? Should you go to the bank, the cleaners, or the grocery store first? Can you get these things done before you head to class or will they need to wait until school is done? These thoughts are one example of cognition at work. Exceptionally complex, cognition is an essential feature of human consciousness, yet not all aspects of cognition are consciously experienced.

Cognitive psychology  is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes. Cognitive psychologists strive to determine and measure different types of intelligence, why some people are better at problem solving than others, and how emotional intelligence affects success in the workplace, among countless other topics. They also sometimes focus on how we organize thoughts and information gathered from our environments into meaningful categories of thought, which will be discussed later.

Concepts and Prototypes

The human nervous system is capable of handling endless streams of information. The senses serve as the interface between the mind and the external environment, receiving stimuli and translating it into nerve impulses that are transmitted to the brain. The brain then processes this information and uses the relevant pieces to create thoughts, which can then be expressed through language or stored in memory for future use. To make this process more complex, the brain does not gather information from external environments only. When thoughts are formed, the mind synthesizes information from emotions and memories ( Figure 7.2 ). Emotion and memory are powerful influences on both our thoughts and behaviors.

The outline of a human head is shown. There is a box containing “Information, sensations” in front of the head. An arrow from this box points to another box containing “Emotions, memories” located where the front of the person's brain would be. An arrow from this second box points to a third box containing “Thoughts” located where the back of the person's brain would be. There are two arrows coming from “Thoughts.” One arrow points back to the second box, “Emotions, memories,” and the other arrow points to a fourth box, “Behavior.”

In order to organize this staggering amount of information, the mind has developed a “file cabinet” of sorts in the mind. The different files stored in the file cabinet are called concepts.  Concepts  are categories or groupings of linguistic information, images, ideas, or memories, such as life experiences. Concepts are, in many ways, big ideas that are generated by observing details, and categorizing and combining these details into cognitive structures. You use concepts to see the relationships among the different elements of your experiences and to keep the information in your mind organized and accessible.

Concepts are informed by our semantic memory (you will learn more about semantic memory in a later chapter) and are present in every aspect of our lives; however, one of the easiest places to notice concepts is inside a classroom, where they are discussed explicitly. When you study United States history, for example, you learn about more than just individual events that have happened in America’s past. You absorb a large quantity of information by listening to and participating in discussions, examining maps, and reading first-hand accounts of people’s lives. Your brain analyzes these details and develops an overall understanding of American history. In the process, your brain gathers details that inform and refine your understanding of related concepts like democracy, power, and freedom.

Concepts can be complex and abstract, like justice, or more concrete, like types of birds. In psychology, for example, Piaget’s stages of development are abstract concepts. Some concepts, like tolerance, are agreed upon by many people because they have been used in various ways over many years. Other concepts, like the characteristics of your ideal friend or your family’s birthday traditions, are personal and individualized. In this way, concepts touch every aspect of our lives, from our many daily routines to the guiding principles behind the way governments function.

Another technique used by your brain to organize information is the identification of prototypes for the concepts you have developed. A  prototype  is the best example or representation of a concept. For example, what comes to your mind when you think of a dog? Most likely your early experiences with dogs will shape what you imagine. If your first pet was a Golden Retriever, there is a good chance that this would be your prototype for the category of dogs.

Natural and Artificial Concepts

In psychology, concepts can be divided into two categories, natural and artificial.  Natural concepts  are created “naturally” through your experiences and can be developed from either direct or indirect experiences. For example, if you live in Essex Junction, Vermont, you have probably had a lot of direct experience with snow. You’ve watched it fall from the sky, you’ve seen lightly falling snow that barely covers the windshield of your car, and you’ve shoveled out 18 inches of fluffy white snow as you’ve thought, “This is perfect for skiing.” You’ve thrown snowballs at your best friend and gone sledding down the steepest hill in town. In short, you know snow. You know what it looks like, smells like, tastes like, and feels like. If, however, you’ve lived your whole life on the island of Saint Vincent in the Caribbean, you may never have actually seen snow, much less tasted, smelled, or touched it. You know snow from the indirect experience of seeing pictures of falling snow—or from watching films that feature snow as part of the setting. Either way, snow is a natural concept because you can construct an understanding of it through direct observations, experiences with snow, or indirect knowledge (such as from films or books) ( Figure 7.3 ).

Photograph A shows a snow covered landscape with the sun shining over it. Photograph B shows a sphere shaped object perched atop the corner of a cube shaped object. There is also a triangular object shown.

An  artificial concept , on the other hand, is a concept that is defined by a specific set of characteristics. Various properties of geometric shapes, like squares and triangles, serve as useful examples of artificial concepts. A triangle always has three angles and three sides. A square always has four equal sides and four right angles. Mathematical formulas, like the equation for area (length × width), are artificial concepts defined by specific sets of characteristics that are always the same. Artificial concepts can enhance the understanding of a topic by building on one another. For example, before learning the concept of “area of a square” (and the formula to find it), you must understand what a square is. Once the concept of “area of a square” is understood, an understanding of area for other geometric shapes can be built upon the original understanding of area. The use of artificial concepts to define an idea is crucial to communicating with others and engaging in complex thought. According to Goldstone and Kersten (2003), concepts act as building blocks and can be connected in countless combinations to create complex thoughts.

A  schema  is a mental construct consisting of a cluster or collection of related concepts (Bartlett, 1932). There are many different types of schemata, and they all have one thing in common: schemata are a method of organizing information that allows the brain to work more efficiently. When a schema is activated, the brain makes immediate assumptions about the person or object being observed.

There are several types of schemata. A  role schema  makes assumptions about how individuals in certain roles will behave (Callero, 1994). For example, imagine you meet someone who introduces himself as a firefighter. When this happens, your brain automatically activates the “firefighter schema” and begins making assumptions that this person is brave, selfless, and community-oriented. Despite not knowing this person, already you have unknowingly made judgments about him. Schemata also help you fill in gaps in the information you receive from the world around you. While schemata allow for more efficient information processing, there can be problems with schemata, regardless of whether they are accurate: Perhaps this particular firefighter is not brave, he just works as a firefighter to pay the bills while studying to become a children’s librarian.

An  event schema , also known as a  cognitive script , is a set of behaviors that can feel like a routine. Think about what you do when you walk into an elevator ( Figure 7.4 ). First, the doors open and you wait to let exiting passengers leave the elevator car. Then, you step into the elevator and turn around to face the doors, looking for the correct button to push. You never face the back of the elevator, do you? And when you’re riding in a crowded elevator and you can’t face the front, it feels uncomfortable, doesn’t it? Interestingly, event schemata can vary widely among different cultures and countries. For example, while it is quite common for people to greet one another with a handshake in the United States, in Tibet, you greet someone by sticking your tongue out at them, and in Belize, you bump fists (Cairns Regional Council, n.d.)

A crowded elevator is shown. There are many people standing close to one another.

Because event schemata are automatic, they can be difficult to change. Imagine that you are driving home from work or school. This event schema involves getting in the car, shutting the door, and buckling your seatbelt before putting the key in the ignition. You might perform this script two or three times each day. As you drive home, you hear your phone’s ring tone. Typically, the event schema that occurs when you hear your phone ringing involves locating the phone and answering it or responding to your latest text message. So without thinking, you reach for your phone, which could be in your pocket, in your bag, or on the passenger seat of the car. This powerful event schema is informed by your pattern of behavior and the pleasurable stimulation that a phone call or text message gives your brain. Because it is a schema, it is extremely challenging for us to stop reaching for the phone, even though we know that we endanger our own lives and the lives of others while we do it (Neyfakh, 2013) ( Figure 7.5 ).

A person’s right hand is holding a cellular phone. The person is in the driver’s seat of an automobile while on the road.

Remember the elevator? It feels almost impossible to walk in and  not  face the door. Our powerful event schema dictates our behavior in the elevator, and it is no different with our phones. Current research suggests that it is the habit, or event schema, of checking our phones in many different situations that make refraining from checking them while driving especially difficult (Bayer & Campbell, 2012). Because texting and driving has become a dangerous epidemic in recent years, psychologists are looking at ways to help people interrupt the “phone schema” while driving. Event schemata like these are the reason why many habits are difficult to break once they have been acquired. As we continue to examine thinking, keep in mind how powerful the forces of concepts and schemata are to our understanding of the world.

  • Define language and demonstrate familiarity with the components of language
  • Understand the development of language
  • Explain the relationship between language and thinking

Language  is a communication system that involves using words and systematic rules to organize those words to transmit information from one individual to another. While language is a form of communication, not all communication is language. Many species communicate with one another through their postures, movements, odors, or vocalizations. This communication is crucial for species that need to interact and develop social relationships with their conspecifics. However, many people have asserted that it is language that makes humans unique among all of the animal species (Corballis & Suddendorf, 2007; Tomasello & Rakoczy, 2003). This section will focus on what distinguishes language as a special form of communication, how the use of language develops, and how language affects the way we think.

Components of Language

Language, be it spoken, signed, or written, has specific components: a lexicon and grammar.  Lexicon  refers to the words of a given language. Thus, lexicon is a language’s vocabulary.  Grammar  refers to the set of rules that are used to convey meaning through the use of the lexicon (Fernández & Cairns, 2011). For instance, English grammar dictates that most verbs receive an “-ed” at the end to indicate past tense.

Words are formed by combining the various phonemes that make up the language. A  phoneme  (e.g., the sounds “ah” vs. “eh”) is a basic sound unit of a given language, and different languages have different sets of phonemes. Phonemes are combined to form  morphemes , which are the smallest units of language that convey some type of meaning (e.g., “I” is both a phoneme and a morpheme). We use semantics and syntax to construct language. Semantics and syntax are part of a language’s grammar.  Semantics  refers to the process by which we derive meaning from morphemes and words.  Syntax  refers to the way words are organized into sentences (Chomsky, 1965; Fernández & Cairns, 2011).

We apply the rules of grammar to organize the lexicon in novel and creative ways, which allow us to communicate information about both concrete and abstract concepts. We can talk about our immediate and observable surroundings as well as the surface of unseen planets. We can share our innermost thoughts, our plans for the future, and debate the value of a college education. We can provide detailed instructions for cooking a meal, fixing a car, or building a fire. Through our use of words and language, we are able to form, organize, and express ideas, schema, and artificial concepts.

Language Development

Given the remarkable complexity of a language, one might expect that mastering a language would be an especially arduous task; indeed, for those of us trying to learn a second language as adults, this might seem to be true. However, young children master language very quickly with relative ease. B. F.  Skinner  (1957) proposed that language is learned through reinforcement. Noam  Chomsky  (1965) criticized this behaviorist approach, asserting instead that the mechanisms underlying language acquisition are biologically determined. The use of language develops in the absence of formal instruction and appears to follow a very similar pattern in children from vastly different cultures and backgrounds. It would seem, therefore, that we are born with a biological predisposition to acquire a language (Chomsky, 1965; Fernández & Cairns, 2011). Moreover, it appears that there is a critical period for language acquisition, such that this proficiency at acquiring language is maximal early in life; generally, as people age, the ease with which they acquire and master new languages diminishes (Johnson & Newport, 1989; Lenneberg, 1967; Singleton, 1995).

Children begin to learn about language from a very early age ( Table 7.1 ). In fact, it appears that this is occurring even before we are born. Newborns show a preference for their mother’s voice and appear to be able to discriminate between the language spoken by their mother and other languages. Babies are also attuned to the languages being used around them and show preferences for videos of faces that are moving in synchrony with the audio of spoken language versus videos that do not synchronize with the audio (Blossom & Morgan, 2006; Pickens, 1994; Spelke & Cortelyou, 1981).

DIG DEEPER: The Case of Genie

In the fall of 1970, a social worker in the Los Angeles area found a 13-year-old girl who was being raised in extremely neglectful and abusive conditions. The girl, who came to be known as Genie, had lived most of her life tied to a potty chair or confined to a crib in a small room that was kept closed with the curtains drawn. For a little over a decade, Genie had virtually no social interaction and no access to the outside world. As a result of these conditions, Genie was unable to stand up, chew solid food, or speak (Fromkin, Krashen, Curtiss, Rigler, & Rigler, 1974; Rymer, 1993). The police took Genie into protective custody.

Genie’s abilities improved dramatically following her removal from her abusive environment, and early on, it appeared she was acquiring language—much later than would be predicted by critical period hypotheses that had been posited at the time (Fromkin et al., 1974). Genie managed to amass an impressive vocabulary in a relatively short amount of time. However, she never developed a mastery of the grammatical aspects of language (Curtiss, 1981). Perhaps being deprived of the opportunity to learn language during a critical period impeded Genie’s ability to fully acquire and use language.

You may recall that each language has its own set of phonemes that are used to generate morphemes, words, and so on. Babies can discriminate among the sounds that make up a language (for example, they can tell the difference between the “s” in vision and the “ss” in fission); early on, they can differentiate between the sounds of all human languages, even those that do not occur in the languages that are used in their environments. However, by the time that they are about 1 year old, they can only discriminate among those phonemes that are used in the language or languages in their environments (Jensen, 2011; Werker & Lalonde, 1988; Werker & Tees, 1984).

After the first few months of life, babies enter what is known as the babbling stage, during which time they tend to produce single syllables that are repeated over and over. As time passes, more variations appear in the syllables that they produce. During this time, it is unlikely that the babies are trying to communicate; they are just as likely to babble when they are alone as when they are with their caregivers (Fernández & Cairns, 2011). Interestingly, babies who are raised in environments in which sign language is used will also begin to show babbling in the gestures of their hands during this stage (Petitto, Holowka, Sergio, Levy, & Ostry, 2004).

Generally, a child’s first word is uttered sometime between the ages of 1 year to 18 months, and for the next few months, the child will remain in the “one word” stage of language development. During this time, children know a number of words, but they only produce one-word utterances. The child’s early vocabulary is limited to familiar objects or events, often nouns. Although children in this stage only make one-word utterances, these words often carry larger meaning (Fernández & Cairns, 2011). So, for example, a child saying “cookie” could be identifying a cookie or asking for a cookie.

As a child’s lexicon grows, she begins to utter simple sentences and to acquire new vocabulary at a very rapid pace. In addition, children begin to demonstrate a clear understanding of the specific rules that apply to their language(s). Even the mistakes that children sometimes make provide evidence of just how much they understand about those rules. This is sometimes seen in the form of  overgeneralization . In this context, overgeneralization refers to an extension of a language rule to an exception to the rule. For example, in English, it is usually the case that an “s” is added to the end of a word to indicate plurality. For example, we speak of one dog versus two dogs. Young children will overgeneralize this rule to cases that are exceptions to the “add an s to the end of the word” rule and say things like “those two gooses” or “three mouses.” Clearly, the rules of the language are understood, even if the exceptions to the rules are still being learned (Moskowitz, 1978).

Language and Thought

When we speak one language, we agree that words are representations of ideas, people, places, and events. The given language that children learn is connected to their culture and surroundings. But can words themselves shape the way we think about things? Psychologists have long investigated the question of whether language shapes thoughts and actions, or whether our thoughts and beliefs shape our language. Two researchers, Edward Sapir and Benjamin Lee Whorf began this investigation in the 1940s. They wanted to understand how the language habits of a community encourage members of that community to interpret language in a particular manner (Sapir, 1941/1964). Sapir and Whorf proposed that language determines thought. For example, in some languages, there are many different words for love. However, in English, we use the word love for all types of love. Does this affect how we think about love depending on the language that we speak (Whorf, 1956)? Researchers have since identified this view as too absolute, pointing out a lack of empiricism behind what Sapir and Whorf proposed (Abler, 2013; Boroditsky, 2011; van Troyer, 1994). Today, psychologists continue to study and debate the relationship between language and thought.

  • Describe problem solving strategies
  • Define algorithm and heuristic
  • Explain some common roadblocks to effective problem solving and decision making

People face problems every day—usually, multiple problems throughout the day. Sometimes these problems are straightforward: To double a recipe for pizza dough, for example, all that is required is that each ingredient in the recipe is doubled. Sometimes, however, the problems we encounter are more complex. For example, say you have a work deadline, and you must mail a printed copy of a report to your supervisor by the end of the business day. The report is time-sensitive and must be sent overnight. You finished the report last night, but your printer will not work today. What should you do? First, you need to identify the problem and then apply a strategy for solving the problem.

Problem-Solving Strategies

When you are presented with a problem—whether it is a complex mathematical problem or a broken printer, how do you solve it? Before finding a solution to the problem, the problem must first be clearly identified. After that, one of many problem solving strategies can be applied, hopefully resulting in a solution.

A  problem-solving strategy  is a plan of action used to find a solution. Different strategies have different action plans associated with them ( Table 7.2 ). For example, a well-known strategy is  trial and error . The old adage, “If at first, you don’t succeed, try, try again” describes trial and error. In terms of your broken printer, you could try checking the ink levels, and if that doesn’t work, you could check to make sure the paper tray isn’t jammed. Or maybe the printer isn’t actually connected to your laptop. When using trial and error, you would continue to try different solutions until you solved your problem. Although trial and error is not typically one of the most time-efficient strategies, it is a commonly used one.

Another type of strategy is an algorithm. An  algorithm  is a problem-solving formula that provides you with step-by-step instructions used to achieve a desired outcome (Kahneman, 2011). You can think of an algorithm as a recipe with highly detailed instructions that produce the same result every time they are performed. Algorithms are used frequently in our everyday lives, especially in computer science. When you run a search on the Internet, search engines like Google use algorithms to decide which entries will appear first in your list of results. Facebook also uses algorithms to decide which posts to display on your newsfeed. Can you identify other situations in which algorithms are used?

A heuristic is another type of problem solving strategy. While an algorithm must be followed exactly to produce a correct result, a  heuristic is a general problem-solving framework (Tversky & Kahneman, 1974). You can think of these as mental shortcuts that are used to solve problems. A “rule of thumb” is an example of a heuristic. Such a rule saves the person time and energy when making a decision, but despite its time-saving characteristics, it is not always the best method for making a rational decision. Different types of heuristics are used in different types of situations, but the impulse to use a heuristic occurs when one of the five conditions is met (Pratkanis, 1989):

  • When one is faced with too much information
  • When the time to make a decision is limited
  • When the decision to be made is unimportant
  • When there is access to very little information to use in making the decision
  • When an appropriate heuristic happens to come to mind in the same moment

Working backward is a useful heuristic in which you begin solving the problem by focusing on the end result. Consider this example: You live in Washington, D.C., and have been invited to a wedding at 4 PM on Saturday in Philadelphia. Knowing that Interstate 95 tends to back up any day of the week, you need to plan your route and time your departure accordingly. If you want to be at the wedding service by 3:30 PM, and it takes 2.5 hours to get to Philadelphia without traffic, what time should you leave your house? You use the working backward heuristic to plan the events of your day on a regular basis, probably without even thinking about it.

Another useful heuristic is the practice of accomplishing a large goal or task by breaking it into a series of smaller steps. Students often use this common method to complete a large research project or a long essay for school. For example, students typically brainstorm, develop a thesis or main topic, research the chosen topic, organize their information into an outline, write a rough draft, revise and edit the rough draft, develop a final draft, organize the references list, and proofread their work before turning in the project. The large task becomes less overwhelming when it is broken down into a series of small steps.

EVERYDAY CONNECTION: Solving Puzzles

Problem-solving abilities can improve with practice. Many people challenge themselves every day with puzzles and other mental exercises to sharpen their problem-solving skills. Sudoku puzzles appear daily in most newspapers. Typically, a sudoku puzzle is a 9×9 grid. The simple sudoku below ( Figure 7.7 ) is a 4×4 grid. To solve the puzzle, fill in the empty boxes with a single digit: 1, 2, 3, or 4. Here are the rules: The numbers must total 10 in each bolded box, each row, and each column; however, each digit can only appear once in a bolded box, row, and column. Time yourself as you solve this puzzle and compare your time with a classmate.

A four column by four row Sudoku puzzle is shown. The top left cell contains the number 3. The top right cell contains the number 2. The bottom right cell contains the number 1. The bottom left cell contains the number 4. The cell at the intersection of the second row and the second column contains the number 4. The cell to the right of that contains the number 1. The cell below the cell containing the number 1 contains the number 2. The cell to the left of the cell containing the number 2 contains the number 3.

Here is another popular type of puzzle ( Figure 7.8 ) that challenges your spatial reasoning skills. Connect all nine dots with four connecting straight lines without lifting your pencil from the paper:

A square shaped outline contains three rows and three columns of dots with equal space between them.

Take a look at the “Puzzling Scales” logic puzzle below ( Figure 7.9 ). Sam Loyd, a well-known puzzle master, created and refined countless puzzles throughout his lifetime (Cyclopedia of Puzzles, n.d.).

A puzzle involving a scale is shown. At the top of the figure it reads: “Sam Loyds Puzzling Scales.” The first row of the puzzle shows a balanced scale with 3 blocks and a top on the left and 12 marbles on the right. Below this row it reads: “Since the scales now balance.” The next row of the puzzle shows a balanced scale with just the top on the left, and 1 block and 8 marbles on the right. Below this row it reads: “And balance when arranged this way.” The third row shows an unbalanced scale with the top on the left side, which is much lower than the right side. The right side is empty. Below this row it reads: “Then how many marbles will it require to balance with that top?”

Not all problems are successfully solved, however. What challenges stop us from successfully solving a problem? Albert Einstein once said, “Insanity is doing the same thing over and over again and expecting a different result.” Imagine a person in a room that has four doorways. One doorway that has always been open in the past is now locked. The person, accustomed to exiting the room by that particular doorway, keeps trying to get out through the same doorway even though the other three doorways are open. The person is stuck—but she just needs to go to another doorway, instead of trying to get out through the locked doorway. A  mental set  is where you persist in approaching a problem in a way that has worked in the past but is clearly not working now.

Functional fixedness  is a type of mental set where you cannot perceive an object being used for something other than what it was designed for. Duncker (1945) conducted foundational research on functional fixedness. He created an experiment in which participants were given a candle, a book of matches, and a box of thumbtacks. They were instructed to use those items to attach the candle to the wall so that it did not drip wax onto the table below. Participants had to use functional fixedness to solve the problem ( Figure 7.10 ). During the  Apollo 13  mission to the moon, NASA engineers at Mission Control had to overcome functional fixedness to save the lives of the astronauts aboard the spacecraft. An explosion in a module of the spacecraft damaged multiple systems. The astronauts were in danger of being poisoned by rising levels of carbon dioxide because of problems with the carbon dioxide filters. The engineers found a way for the astronauts to use spare plastic bags, tape, and air hoses to create a makeshift air filter, which saved the lives of the astronauts.

Figure a shows a book of matches, a box of thumbtacks, and a candle. Figure b shows the candle standing in the box that held the thumbtacks. A thumbtack attaches the box holding the candle to the wall.

Researchers have investigated whether functional fixedness is affected by culture. In one experiment, individuals from the Shuar group in Ecuador were asked to use an object for a purpose other than that for which the object was originally intended. For example, the participants were told a story about a bear and a rabbit that were separated by a river and asked to select among various objects, including a spoon, a cup, erasers, and so on, to help the animals. The spoon was the only object long enough to span the imaginary river, but if the spoon was presented in a way that reflected its normal usage, it took participants longer to choose the spoon to solve the problem. (German & Barrett, 2005). The researchers wanted to know if exposure to highly specialized tools, as occurs with individuals in industrialized nations, affects their ability to transcend functional fixedness. It was determined that functional fixedness is experienced in both industrialized and nonindustrialized cultures (German & Barrett, 2005).

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. For example, let’s say you and three friends wanted to rent a house and had a combined target budget of $1,600. The realtor shows you only very run-down houses for $1,600 and then shows you a very nice house for $2,000. Might you ask each person to pay more in rent to get the $2,000 home? Why would the realtor show you the run-down houses and the nice house? The realtor may be challenging your anchoring bias. An  anchoring bias  occurs when you focus on one piece of information when making a decision or solving a problem. In this case, you’re so focused on the amount of money you are willing to spend that you may not recognize what kinds of houses are available at that price point.

The  confirmation bias  is the tendency to focus on information that confirms your existing beliefs. For example, if you think that your professor is not very nice, you notice all of the instances of rude behavior exhibited by the professor while ignoring the countless pleasant interactions he is involved in on a daily basis.  Hindsight bias  leads you to believe that the event you just experienced was predictable, even though it really wasn’t. In other words, you knew all along that things would turn out the way they did.  Representative bias describes a faulty way of thinking, in which you unintentionally stereotype someone or something; for example, you may assume that your professors spend their free time reading books and engaging in intellectual conversation because the idea of them spending their time playing volleyball or visiting an amusement park does not fit in with your stereotypes of professors.

Finally, the  availability heuristic  is a heuristic in which you make a decision based on an example, information, or recent experience that is that readily available to you, even though it may not be the best example to inform your decision .  Biases tend to “preserve that which is already established—to maintain our preexisting knowledge, beliefs, attitudes, and hypotheses” (Aronson, 1995; Kahneman, 2011). These biases are summarized in  Table 7.3 .

Were you able to determine how many marbles are needed to balance the scales in  Figure 7.9 ? You need nine. Were you able to solve the problems in  Figure 7.7  and  Figure 7.8 ? Here are the answers ( Figure 7.11 ).

The first puzzle is a Sudoku grid of 16 squares (4 rows of 4 squares) is shown. Half of the numbers were supplied to start the puzzle and are colored blue, and half have been filled in as the puzzle’s solution and are colored red. The numbers in each row of the grid, left to right, are as follows. Row 1: blue 3, red 1, red 4, blue 2. Row 2: red 2, blue 4, blue 1, red 3. Row 3: red 1, blue 3, blue 2, red 4. Row 4: blue 4, red 2, red 3, blue 1.The second puzzle consists of 9 dots arranged in 3 rows of 3 inside of a square. The solution, four straight lines made without lifting the pencil, is shown in a red line with arrows indicating the direction of movement. In order to solve the puzzle, the lines must extend beyond the borders of the box. The four connecting lines are drawn as follows. Line 1 begins at the top left dot, proceeds through the middle and right dots of the top row, and extends to the right beyond the border of the square. Line 2 extends from the end of line 1, through the right dot of the horizontally centered row, through the middle dot of the bottom row, and beyond the square’s border ending in the space beneath the left dot of the bottom row. Line 3 extends from the end of line 2 upwards through the left dots of the bottom, middle, and top rows. Line 4 extends from the end of line 3 through the middle dot in the middle row and ends at the right dot of the bottom row.

  • Define intelligence
  • Explain the triarchic theory of intelligence
  • Identify the difference between intelligence theories
  • Explain emotional intelligence
  • Define creativity

Classifying Intelligence

What exactly is intelligence? The way that researchers have defined the concept of intelligence has been modified many times since the birth of psychology. British psychologist Charles Spearman believed intelligence consisted of one general factor, called  g , which could be measured and compared among individuals. Spearman focused on the commonalities among various intellectual abilities and de-emphasized what made each unique. Long before modern psychology developed, however, ancient philosophers, such as Aristotle, held a similar view (Cianciolo & Sternberg, 2004).

Other psychologists believe that instead of a single factor, intelligence is a collection of distinct abilities. In the 1940s, Raymond Cattell proposed a theory of intelligence that divided general intelligence into two components: crystallized intelligence and fluid intelligence (Cattell, 1963). Crystallized intelligence  is characterized as acquired knowledge and the ability to retrieve it. When you learn, remember, and recall information, you are using crystallized intelligence. You use crystallized intelligence all the time in your coursework by demonstrating that you have mastered the information covered in the course.  Fluid intelligence  encompasses the ability to see complex relationships and solve problems. Navigating your way home after being detoured onto an unfamiliar route because of road construction would draw upon your fluid intelligence. Fluid intelligence helps you tackle complex, abstract challenges in your daily life, whereas crystallized intelligence helps you overcome concrete, straightforward problems (Cattell, 1963).

Other theorists and psychologists believe that intelligence should be defined in more practical terms. For example, what types of behaviors help you get ahead in life? Which skills promote success? Think about this for a moment. Being able to recite all 45 presidents of the United States in order is an excellent party trick, but will knowing this make you a better person?

Robert Sternberg developed another theory of intelligence, which he titled the  triarchic theory of intelligence  because it sees intelligence as comprised of three parts (Sternberg, 1988): practical, creative, and analytical intelligence ( Figure 7.12 ).

Three boxes are arranged in a triangle. The top box contains “Analytical intelligence; academic problem solving and computation.” There is a line with arrows on both ends connecting this box to another box containing “Practical intelligence; street smarts and common sense.” Another line with arrows on both ends connects this box to another box containing “Creative intelligence; imaginative and innovative problem solving.” Another line with arrows on both ends connects this box to the first box described, completing the triangle.

Practical intelligence , as proposed by Sternberg, is sometimes compared to “street smarts.” Being practical means you find solutions that work in your everyday life by applying knowledge based on your experiences. This type of intelligence appears to be separate from the traditional understanding of IQ; individuals who score high in practical intelligence may or may not have comparable scores in creative and analytical intelligence (Sternberg, 1988).

Analytical intelligence is closely aligned with academic problem solving and computations. Sternberg says that analytical intelligence is demonstrated by an ability to analyze, evaluate, judge, compare, and contrast. When reading a classic novel for a literature class, for example, it is usually necessary to compare the motives of the main characters of the book or analyze the historical context of the story. In a science course such as anatomy, you must study the processes by which the body uses various minerals in different human systems. In developing an understanding of this topic, you are using analytical intelligence. When solving a challenging math problem, you would apply analytical intelligence to analyze different aspects of the problem and then solve it section by section.

Creative intelligence  is marked by inventing or imagining a solution to a problem or situation. Creativity in this realm can include finding a novel solution to an unexpected problem or producing a beautiful work of art or a well-developed short story. Imagine for a moment that you are camping in the woods with some friends and realize that you’ve forgotten your camp coffee pot. The person in your group who figures out a way to successfully brew coffee for everyone would be credited as having higher creative intelligence.

Multiple Intelligences Theory  was developed by Howard Gardner, a Harvard psychologist and former student of Erik Erikson. Gardner’s theory, which has been refined for more than 30 years, is a more recent development among theories of intelligence. In Gardner’s theory, each person possesses at least eight intelligences. Among these eight intelligences, a person typically excels in some and falters in others (Gardner, 1983).  Table 7.4  describes each type of intelligence.

Gardner’s theory is relatively new and needs additional research to better establish empirical support. At the same time, his ideas challenge the traditional idea of intelligence to include a wider variety of abilities, although it has been suggested that Gardner simply relabeled what other theorists called “cognitive styles” as “intelligences” (Morgan, 1996). Furthermore, developing traditional measures of Gardner’s intelligences is extremely difficult (Furnham, 2009; Gardner & Moran, 2006; Klein, 1997).

Gardner’s inter- and intrapersonal intelligences are often combined into a single type: emotional intelligence.  Emotional intelligence  encompasses the ability to understand the emotions of yourself and others, show empathy, understand social relationships and cues, and regulate your own emotions and respond in culturally appropriate ways (Parker, Saklofske, & Stough, 2009). People with high emotional intelligence typically have well-developed social skills. Some researchers, including Daniel Goleman, the author of  Emotional Intelligence: Why It Can Matter More than IQ , argue that emotional intelligence is a better predictor of success than traditional intelligence (Goleman, 1995). However, emotional intelligence has been widely debated, with researchers pointing out inconsistencies in how it is defined and described, as well as questioning results of studies on a subject that is difficult to measure and study empirically (Locke, 2005; Mayer, Salovey, & Caruso, 2004)

The most comprehensive theory of intelligence to date is the Cattell-Horn-Carroll (CHC) theory of cognitive abilities (Schneider & McGrew, 2018). In this theory, abilities are related and arranged in a hierarchy with general abilities at the top, broad abilities in the middle, and narrow (specific) abilities at the bottom. The narrow abilities are the only ones that can be directly measured; however, they are integrated within the other abilities. At the general level is general intelligence. Next, the broad level consists of general abilities such as fluid reasoning, short-term memory, and processing speed. Finally, as the hierarchy continues, the narrow level includes specific forms of cognitive abilities. For example, short-term memory would further break down into memory span and working memory capacity.

Intelligence can also have different meanings and values in different cultures. If you live on a small island, where most people get their food by fishing from boats, it would be important to know how to fish and how to repair a boat. If you were an exceptional angler, your peers would probably consider you intelligent. If you were also skilled at repairing boats, your intelligence might be known across the whole island. Think about your own family’s culture. What values are important for Latinx families? Italian families? In Irish families, hospitality and telling an entertaining story are marks of the culture. If you are a skilled storyteller, other members of Irish culture are likely to consider you intelligent.

Some cultures place a high value on working together as a collective. In these cultures, the importance of the group supersedes the importance of individual achievement. When you visit such a culture, how well you relate to the values of that culture exemplifies your  cultural intelligence , sometimes referred to as cultural competence.

Creativity  is the ability to generate, create, or discover new ideas, solutions, and possibilities. Very creative people often have intense knowledge about something, work on it for years, look at novel solutions, seek out the advice and help of other experts, and take risks. Although creativity is often associated with the arts, it is actually a vital form of intelligence that drives people in many disciplines to discover something new. Creativity can be found in every area of life, from the way you decorate your residence to a new way of understanding how a cell works.

Creativity is often assessed as a function of one’s ability to engage in  divergent thinking . Divergent thinking can be described as thinking “outside the box;” it allows an individual to arrive at unique, multiple solutions to a given problem. In contrast,  convergent thinking describes the ability to provide a correct or well-established answer or solution to a problem (Cropley, 2006; Gilford, 1967)

  • Explain how intelligence tests are developed
  • Describe the history of the use of IQ tests
  • Describe the purposes and benefits of intelligence testing

While you’re likely familiar with the term “IQ” and associate it with the idea of intelligence, what does IQ really mean? IQ stands for  intelligence quotient  and describes a score earned on a test designed to measure intelligence. You’ve already learned that there are many ways psychologists describe intelligence (or more aptly, intelligences). Similarly, IQ tests—the tools designed to measure intelligence—have been the subject of debate throughout their development and use.

When might an IQ test be used? What do we learn from the results, and how might people use this information? While there are certainly many benefits to intelligence testing, it is important to also note the limitations and controversies surrounding these tests. For example, IQ tests have sometimes been used as arguments in support of insidious purposes, such as the eugenics movement (Severson, 2011). The infamous Supreme Court Case,  Buck v. Bell , legalized the forced sterilization of some people deemed “feeble-minded” through this type of testing, resulting in about 65,000 sterilizations ( Buck v. Bell , 274 U.S. 200; Ko, 2016). Today, only professionals trained in psychology can administer IQ tests, and the purchase of most tests requires an advanced degree in psychology. Other professionals in the field, such as social workers and psychiatrists, cannot administer IQ tests. In this section, we will explore what intelligence tests measure, how they are scored, and how they were developed.

Measuring Intelligence

It seems that the human understanding of intelligence is somewhat limited when we focus on traditional or academic-type intelligence. How then, can intelligence be measured? And when we measure intelligence, how do we ensure that we capture what we’re really trying to measure (in other words, that IQ tests function as valid measures of intelligence)? In the following paragraphs, we will explore the how intelligence tests were developed and the history of their use.

The IQ test has been synonymous with intelligence for over a century. In the late 1800s, Sir Francis Galton developed the first broad test of intelligence (Flanagan & Kaufman, 2004). Although he was not a psychologist, his contributions to the concepts of intelligence testing are still felt today (Gordon, 1995). Reliable intelligence testing (you may recall from earlier chapters that reliability refers to a test’s ability to produce consistent results) began in earnest during the early 1900s with a researcher named Alfred Binet ( Figure 7.13 ). Binet was asked by the French government to develop an intelligence test to use on children to determine which ones might have difficulty in school; it included many verbally based tasks. American researchers soon realized the value of such testing. Louis Terman, a Stanford professor, modified Binet’s work by standardizing the administration of the test and tested thousands of different-aged children to establish an average score for each age. As a result, the test was normed and standardized, which means that the test was administered consistently to a large enough representative sample of the population that the range of scores resulted in a bell curve (bell curves will be discussed later).  Standardization  means that the manner of administration, scoring, and interpretation of results is consistent.  Norming  involves giving a test to a large population so data can be collected comparing groups, such as age groups. The resulting data provide norms, or referential scores, by which to interpret future scores. Norms are not expectations of what a given group  should  know but a demonstration of what that group  does  know. Norming and standardizing the test ensures that new scores are reliable. This new version of the test was called the Stanford-Binet Intelligence Scale (Terman, 1916). Remarkably, an updated version of this test is still widely used today.

Photograph A shows a portrait of Alfred Binet. Photograph B shows six sketches of human faces. Above these faces is the label “Guide for Binet-Simon Scale. 223” The faces are arranged in three rows of two, and these rows are labeled “1, 2, and 3.” At the bottom it reads: “The psychological clinic is indebted for the loan of these cuts and those on p. 225 to the courtesy of Dr. Oliver P. Cornman, Associate Superintendent of Schools of Philadelphia, and Chairman of Committee on Backward Children Investigation. See Report of Committee, Dec. 31, 1910, appendix.”

In 1939, David Wechsler, a psychologist who spent part of his career working with World War I veterans, developed a new IQ test in the United States. Wechsler combined several subtests from other intelligence tests used between 1880 and World War I. These subtests tapped into a variety of verbal and nonverbal skills because Wechsler believed that intelligence encompassed “the global capacity of a person to act purposefully, to think rationally, and to deal effectively with his environment” (Wechsler, 1958, p. 7). He named the test the Wechsler-Bellevue Intelligence Scale (Wechsler, 1981). This combination of subtests became one of the most extensively used intelligence tests in the history of psychology. Although its name was later changed to the Wechsler Adult Intelligence Scale (WAIS) and has been revised several times, the aims of the test remain virtually unchanged since its inception (Boake, 2002). Today, there are three intelligence tests credited to Wechsler, the Wechsler Adult Intelligence Scale-fourth edition (WAIS-IV), the Wechsler Intelligence Scale for Children (WISC-V), and the Wechsler Preschool and Primary Scale of Intelligence—IV (WPPSI-IV) (Wechsler, 2012). These tests are used widely in schools and communities throughout the United States, and they are periodically normed and standardized as a means of recalibration. As a part of the recalibration process, the WISC-V was given to thousands of children across the country, and children taking the test today are compared with their same-age peers ( Figure 7.13 ).

The WISC-V is composed of 14 subtests, which comprise five indices, which then render an IQ score. The five indices are Verbal Comprehension, Visual Spatial, Fluid Reasoning, Working Memory, and Processing Speed. When the test is complete, individuals receive a score for each of the five indices and a Full Scale IQ score. The method of scoring reflects the understanding that intelligence is comprised of multiple abilities in several cognitive realms and focuses on the mental processes that the child used to arrive at his or her answers to each test item.

Interestingly, the periodic recalibrations have led to an interesting observation known as the Flynn effect. Named after James Flynn, who was among the first to describe this trend, the  Flynn effect  refers to the observation that each generation has a significantly higher IQ than the last. Flynn himself argues, however, that increased IQ scores do not necessarily mean that younger generations are more intelligent per se (Flynn, Shaughnessy, & Fulgham, 2012).

Ultimately, we are still left with the question of how valid intelligence tests are. Certainly, the most modern versions of these tests tap into more than verbal competencies, yet the specific skills that should be assessed in IQ testing, the degree to which any test can truly measure an individual’s intelligence, and the use of the results of IQ tests are still issues of debate (Gresham & Witt, 1997; Flynn, Shaughnessy, & Fulgham, 2012; Richardson, 2002; Schlinger, 2003).

The Bell Curve

The results of intelligence tests follow the bell curve, a graph in the general shape of a bell. When the bell curve is used in psychological testing, the graph demonstrates a normal distribution of a trait, in this case, intelligence, in the human population. Many human traits naturally follow the bell curve. For example, if you lined up all your female schoolmates according to height, it is likely that a large cluster of them would be the average height for an American woman: 5’4”–5’6”. This cluster would fall in the center of the bell curve, representing the average height for American women ( Figure 7.14 ). There would be fewer women who stand closer to 4’11”. The same would be true for women of above-average height: those who stand closer to 5’11”. The trick to finding a bell curve in nature is to use a large sample size. Without a large sample size, it is less likely that the bell curve will represent the wider population. A  representative sample  is a subset of the population that accurately represents the general population. If, for example, you measured the height of the women in your classroom only, you might not actually have a representative sample. Perhaps the women’s basketball team wanted to take this course together, and they are all in your class. Because basketball players tend to be taller than average, the women in your class may not be a good representative sample of the population of American women. But if your sample included all the women at your school, it is likely that their heights would form a natural bell curve.

A graph of a bell curve is labeled “Height of U.S. Women.” The x axis is labeled “Height” and the y axis is labeled “Frequency.” Between the heights of five feet tall and five feet and five inches tall, the frequency rises to a curved peak, then begins dropping off at the same rate until it hits five feet ten inches tall.

The same principles apply to intelligence test scores. Individuals earn a score called an intelligence quotient (IQ). Over the years, different types of IQ tests have evolved, but the way scores are interpreted remains the same. The average IQ score on an IQ test is 100. Standard deviations  describe how data are dispersed in a population and give context to large data sets. The bell curve uses the standard deviation to show how all scores are dispersed from the average score ( Figure 7.15 ). In modern IQ testing, one standard deviation is 15 points. So a score of 85 would be described as “one standard deviation below the mean.” How would you describe a score of 115 and a score of 70? Any IQ score that falls within one standard deviation above and below the mean (between 85 and 115) is considered average, and 68% of the population has IQ scores in this range. An IQ score of 130 or above is considered a superior level.

A graph of a bell curve is labeled “Intelligence Quotient Score.” The x axis is labeled “IQ,” and the y axis is labeled “Population.” Beginning at an IQ of 60, the population rises to a curved peak at an IQ of 100 and then drops off at the same rate ending near zero at an IQ of 140.

Only 2.2% of the population has an IQ score below 70 (American Psychological Association [APA], 2013). A score of 70 or below indicates significant cognitive delays. When these are combined with major deficits in adaptive functioning, a person is diagnosed with having an intellectual disability (American Association on Intellectual and Developmental Disabilities, 2013). Formerly known as mental retardation, the accepted term now is intellectual disability, and it has four subtypes: mild, moderate, severe, and profound ( Table 7.5 ).  The Diagnostic and Statistical Manual of Psychological Disorders  lists criteria for each subgroup (APA, 2013).

On the other end of the intelligence spectrum are those individuals whose IQs fall into the highest ranges. Consistent with the bell curve, about 2% of the population falls into this category. People are considered gifted if they have an IQ score of 130 or higher, or superior intelligence in a particular area. Long ago, popular belief suggested that people of high intelligence were maladjusted. This idea was disproven through a groundbreaking study of gifted children. In 1921, Lewis Terman began a longitudinal study of over 1500 children with IQs over 135 (Terman, 1925). His findings showed that these children became well-educated, successful adults who were, in fact, well-adjusted (Terman & Oden, 1947). Additionally, Terman’s study showed that the subjects were above average in physical build and attractiveness, dispelling an earlier popular notion that highly intelligent people were “weaklings.” Some people with very high IQs elect to join Mensa, an organization dedicated to identifying, researching, and fostering intelligence. Members must have an IQ score in the top 2% of the population, and they may be required to pass other exams in their application to join the group.

DIG DEEPER: What’s in a Name? 

In the past, individuals with IQ scores below 70 and significant adaptive and social functioning delays were diagnosed with mental retardation. When this diagnosis was first named, the title held no social stigma. In time, however, the degrading word “retard” sprang from this diagnostic term. “Retard” was frequently used as a taunt, especially among young people, until the words “mentally retarded” and “retard” became an insult. As such, the DSM-5 now labels this diagnosis as “intellectual disability.” Many states once had a Department of Mental Retardation to serve those diagnosed with such cognitive delays, but most have changed their name to the Department of Developmental Disabilities or something similar in language.

Erin Johnson’s younger brother Matthew has Down syndrome. She wrote this piece about what her brother taught her about the meaning of intelligence:

His whole life, learning has been hard. Entirely possible – just different. He has always excelled with technology – typing his thoughts was more effective than writing them or speaking them. Nothing says “leave me alone” quite like a text that reads, “Do Not Call Me Right Now.” He is fully capable of reading books up to about a third-grade level, but he didn’t love it and used to always ask others to read to him. That all changed when his nephew came along, because he willingly reads to him, and it is the most heart-swelling, smile-inducing experience I have ever had the pleasure of witnessing.

When it comes down to it, Matt can learn. He does learn. It just takes longer, and he has to work harder for it, which if we’re being honest, is not a lot of fun. He is extremely gifted in learning things he takes an interest in, and those things often seem a bit “strange” to others. But no matter. It just proves my point – he  can  learn. That does not mean he will learn at the same pace, or even to the same level. It also, unfortunately, does not mean he will be allotted the same opportunities to learn as many others.

Here’s the scoop. We are all wired with innate abilities to retain and apply our learning and natural curiosities and passions that fuel our desire to learn. But our abilities and curiosities may not be the same.

The world doesn’t work this way though, especially not for my brother and his counterparts. Have him read aloud a book about skunks, and you may not get a whole lot from him. But have him tell you about skunks straight out of his memory, and hold onto your hats. He can hack the school’s iPad system, but he can’t tell you how he did it. He can write out every direction for a drive to our grandparents’ home in Florida, but he can’t drive.

Society is quick to deem him disabled and use demeaning language like the r-word to describe him, but in reality, we haven’t necessarily given him opportunities to showcase the learning he can do. In my case, I can escape the need to memorize how to change the oil in my car without anyone assuming I can’t do it, or calling me names when they find out I can’t. But Matthew can’t get through a day at his job without someone assuming he needs help. He is bright. Brighter than most anyone would assume. Maybe we need to redefine what is smart.

My brother doesn’t fit in the narrow schema of intelligence that is accepted in our society. But intelligence is far more than being able to solve 525 x 62 or properly introduce yourself to another. Why can’t we assume the intelligence of someone who can recite all of a character’s lines in a movie or remember my birthday a year after I told him/her a single time? Why is it we allow a person’s diagnosis or appearance to make us not just wonder if, but entirely doubt that they are capable? Maybe we need to cut away the sides of the box we have created for people so everyone can fit.

My brother can learn. It may not be what you know. It may be knowledge you would deem unimportant. It may not follow a traditional learning trajectory. But the fact remains – he can learn. Everyone can learn. And even though it is harder for him and harder for others still, he is not a “retard.” Nobody is.

When you use the r-word, you are insinuating that an individual, whether someone with a disability or not, is unintelligent, foolish, and purposeless. This in turn tells a person with a disability that they too are unintelligent, foolish, and purposeless. Because the word was historically used to describe individuals with disabilities and twisted from its original meaning to fit a cruel new context, it is forevermore associated with people like my brother. No matter how a person looks or learns or behaves, the r-word is never a fitting term. It’s time we waved it goodbye.

Why Measure Intelligence?

The value of IQ testing is most evident in educational or clinical settings. Children who seem to be experiencing learning difficulties or severe behavioral problems can be tested to ascertain whether the child’s difficulties can be partly attributed to an IQ score that is significantly different from the mean for her age group. Without IQ testing—or another measure of intelligence—children and adults needing extra support might not be identified effectively. In addition, IQ testing is used in courts to determine whether a defendant has special or extenuating circumstances that preclude him from participating in some way in a trial. People also use IQ testing results to seek disability benefits from the Social Security Administration.

  • Describe how genetics and environment affect intelligence
  • Explain the relationship between IQ scores and socioeconomic status
  • Describe the difference between a learning disability and a developmental disorder

High Intelligence: Nature or Nurture?

Where does high intelligence come from? Some researchers believe that intelligence is a trait inherited from a person’s parents. Scientists who research this topic typically use twin studies to determine the  heritability  of intelligence. The Minnesota Study of Twins Reared Apart is one of the most well-known twin studies. In this investigation, researchers found that identical twins raised together and identical twins raised apart exhibit a higher correlation between their IQ scores than siblings or fraternal twins raised together (Bouchard, Lykken, McGue, Segal, & Tellegen, 1990). The findings from this study reveal a genetic component to intelligence ( Figure 7.15 ). At the same time, other psychologists believe that intelligence is shaped by a child’s developmental environment. If parents were to provide their children with intellectual stimuli from before they are born, it is likely that they would absorb the benefits of that stimulation, and it would be reflected in intelligence levels.

A chart shows correlations of IQs for people of varying relationships. The bottom is labeled “Percent IQ Correlation” and the left side is labeled “Relationship.” The percent IQ Correlation for relationships where no genes are shared, including adoptive parent-child pairs, similarly aged unrelated children raised together, and adoptive siblings are around 21 percent, 30 percent, and 32 percent, respectively. The percent IQ Correlation for relationships where 25 percent of genes are shared, as in half-siblings, is around 33 percent. The percent IQ Correlation for relationships where 50 percent of genes are shared, including parent-children pairs, and fraternal twins raised together, are roughly 44 percent and 62 percent, respectively. A relationship where 100 percent of genes are shared, as in identical twins raised apart, results in a nearly 80 percent IQ correlation.

The reality is that aspects of each idea are probably correct. In fact, one study suggests that although genetics seem to be in control of the level of intelligence, the environmental influences provide both stability and change to trigger manifestation of cognitive abilities (Bartels, Rietveld, Van Baal, & Boomsma, 2002). Certainly, there are behaviors that support the development of intelligence, but the genetic component of high intelligence should not be ignored. As with all heritable traits, however, it is not always possible to isolate how and when high intelligence is passed on to the next generation.

Range of Reaction  is the theory that each person responds to the environment in a unique way based on his or her genetic makeup. According to this idea, your genetic potential is a fixed quantity, but whether you reach your full intellectual potential is dependent upon the environmental stimulation you experience, especially in childhood. Think about this scenario: A couple adopts a child who has average genetic intellectual potential. They raise her in an extremely stimulating environment. What will happen to the couple’s new daughter? It is likely that the stimulating environment will improve her intellectual outcomes over the course of her life. But what happens if this experiment is reversed? If a child with an extremely strong genetic background is placed in an environment that does not stimulate him: What happens? Interestingly, according to a longitudinal study of highly gifted individuals, it was found that “the two extremes of optimal and pathological experience are both represented disproportionately in the backgrounds of creative individuals”; however, those who experienced supportive family environments were more likely to report being happy (Csikszentmihalyi & Csikszentmihalyi, 1993, p. 187).

Another challenge to determining the origins of high intelligence is the confounding nature of our human social structures. It is troubling to note that some ethnic groups perform better on IQ tests than others—and it is likely that the results do not have much to do with the quality of each ethnic group’s intellect. The same is true for socioeconomic status. Children who live in poverty experience more pervasive, daily stress than children who do not worry about the basic needs of safety, shelter, and food. These worries can negatively affect how the brain functions and develops, causing a dip in IQ scores. Mark Kishiyama and his colleagues determined that children living in poverty demonstrated reduced prefrontal brain functioning comparable to children with damage to the lateral prefrontal cortex (Kishyama, Boyce, Jimenez, Perry, & Knight, 2009).

The debate around the foundations and influences on intelligence exploded in 1969 when an educational psychologist named Arthur Jensen published the article “How Much Can We Boost I.Q. and Achievement” in the Harvard Educational Review . Jensen had administered IQ tests to diverse groups of students, and his results led him to the conclusion that IQ is determined by genetics. He also posited that intelligence was made up of two types of abilities: Level I and Level II. In his theory, Level I is responsible for rote memorization, whereas Level II is responsible for conceptual and analytical abilities. According to his findings, Level I remained consistent among the human race. Level II, however, exhibited differences among ethnic groups (Modgil & Routledge, 1987). Jensen’s most controversial conclusion was that Level II intelligence is prevalent among Asians, then Caucasians, then African Americans. Robert Williams was among those who called out racial bias in Jensen’s results (Williams, 1970).

Obviously, Jensen’s interpretation of his own data caused an intense response in a nation that continued to grapple with the effects of racism (Fox, 2012). However, Jensen’s ideas were not solitary or unique; rather, they represented one of many examples of psychologists asserting racial differences in IQ and cognitive ability. In fact, Rushton and Jensen (2005) reviewed three decades worth of research on the relationship between race and cognitive ability. Jensen’s belief in the inherited nature of intelligence and the validity of the IQ test to be the truest measure of intelligence are at the core of his conclusions. If, however, you believe that intelligence is more than Levels I and II, or that IQ tests do not control for socioeconomic and cultural differences among people, then perhaps you can dismiss Jensen’s conclusions as a single window that looks out on the complicated and varied landscape of human intelligence.

In a related story, parents of African American students filed a case against the State of California in 1979, because they believed that the testing method used to identify students with learning disabilities was culturally unfair as the tests were normed and standardized using white children ( Larry P. v. Riles ). The testing method used by the state disproportionately identified African American children as mentally retarded. This resulted in many students being incorrectly classified as “mentally retarded.”

What are Learning Disabilities?

Learning disabilities are cognitive disorders that affect different areas of cognition, particularly language or reading. It should be pointed out that learning disabilities are not the same thing as intellectual disabilities. Learning disabilities are considered specific neurological impairments rather than global intellectual or developmental disabilities. A person with a language disability has difficulty understanding or using spoken language, whereas someone with a reading disability, such as dyslexia, has difficulty processing what he or she is reading.

Often, learning disabilities are not recognized until a child reaches school age. One confounding aspect of learning disabilities is that they most often affect children with average to above-average intelligence. In other words, the disability is specific to a particular area and not a measure of overall intellectual ability. At the same time, learning disabilities tend to exhibit comorbidity with other disorders, like attention-deficit hyperactivity disorder (ADHD). Anywhere between 30–70% of individuals with diagnosed cases of ADHD also have some sort of learning disability (Riccio, Gonzales, & Hynd, 1994). Let’s take a look at three examples of common learning disabilities: dysgraphia, dyslexia, and dyscalculia.

Children with  dysgraphia  have a learning disability that results in a struggle to write legibly. The physical task of writing with a pen and paper is extremely challenging for the person. These children often have extreme difficulty putting their thoughts down on paper (Smits-Engelsman & Van Galen, 1997). This difficulty is inconsistent with a person’s IQ. That is, based on the child’s IQ and/or abilities in other areas, a child with dysgraphia should be able to write, but can’t. Children with dysgraphia may also have problems with spatial abilities.

Students with dysgraphia need academic accommodations to help them succeed in school. These accommodations can provide students with alternative assessment opportunities to demonstrate what they know (Barton, 2003). For example, a student with dysgraphia might be permitted to take an oral exam rather than a traditional paper-and-pencil test. Treatment is usually provided by an occupational therapist, although there is some question as to how effective such treatment is (Zwicker, 2005).

Dyslexia is the most common learning disability in children. An individual with  dyslexia  exhibits an inability to correctly process letters. The neurological mechanism for sound processing does not work properly in someone with dyslexia. As a result, dyslexic children may not understand sound-letter correspondence. A child with dyslexia may mix up letters within words and sentences—letter reversals, such as those shown in  Figure 7.17 , are a hallmark of this learning disability—or skip whole words while reading. A dyslexic child may have difficulty spelling words correctly while writing. Because of the disordered way that the brain processes letters and sounds, learning to read is a frustrating experience. Some dyslexic individuals cope by memorizing the shapes of most words, but they never actually learn to read (Berninger, 2008).

Two columns and five rows all containing the word “teapot” are shown. “Teapot” is written ten times with the letters jumbled, sometimes appearing backwards and upside down.

Dyscalculia

Dyscalculia  is difficulty in learning or comprehending arithmetic. This learning disability is often first evident when children exhibit difficulty discerning how many objects are in a small group without counting them. Other symptoms may include struggling to memorize math facts, organize numbers, or fully differentiate between numerals, math symbols, and written numbers (such as “3” and “three”).

Additional Supplemental Resources

  • Use Google’s QuickDraw web app on your phone to quickly draw 5 things for Google’s artificially intelligent neural net. When you are done, the app will show you what it thought each of the drawings was. How does this relate to the psychological idea of concepts, prototypes, and schemas? Check out here.  Works best in Chrome if used in a web browser
  • This article lists information about a variety of different topics relating to speech development, including how speech develops and what research is currently being done regarding speech development.
  • The Human intelligence site includes biographical profiles of people who have influenced the development of intelligence theory and testing, in-depth articles exploring current controversies related to human intelligence, and resources for teachers.

Preview the document

  • In 2000, psychologists Sheena Iyengar and Mark Lepper from Columbia and Stanford University published a study about the paradox of choice.  This is the original journal article.
  • Mensa , the high IQ society, provides a forum for intellectual exchange among its members. There are members in more than 100 countries around the world.  Anyone with an IQ in the top 2% of the population can join.
  • This test developed in the 1950s is used to refer to some kinds of behavioral tests for the presence of mind, or thought, or intelligence in putatively minded entities such as machines.
  • Your central “Hub” of information and products created for the network of Parent Centers serving families of children with disabilities.
  • How have average IQ levels changed over time? Hear James Flynn discuss the “Flynn Effect” in this Ted Talk. Closed captioning available.
  • We all want customized experiences and products — but when faced with 700 options, consumers freeze up. With fascinating new research, Sheena Iyengar demonstrates how businesses (and others) can improve the experience of choosing. This is the same researcher that is featured in your midterm exam.
  • What does an IQ Score distribution look like?  Where do most people fall on an IQ Score distribution?  Find out more in this video. Closed captioning available.
  • How do we solve problems?  How can data help us to do this?  Follow Amy Webb’s story of how she used algorithms to help her find her way to true love. Closed captioning available.
  • In this Ted-Ed video, explore some of the ways in which animals communicate, and determine whether or not this communication qualifies as language.  A variety of discussion and assessment questions are included with the video (free registration is required to access the questions). Closed captioning available.
  • Watch this Ted-Ed video to learn more about the benefits of speaking multiple languages, including how bilingualism helps the brain to process information, strengthens the brain, and keeps the speaker more engaged in their world.  A variety of discussion and assessment questions are included with the video (free registration is required to access the questions). Closed captioning available.
  • This video is on how your mind can amaze and betray you includes information on topics such as concepts, prototypes, problem-solving and mistakes in thinking. Closed captioning available.
  • This video on language includes information on topics such as the development of language, language theories, and brain areas involved in language, as well as language disorders. Closed captioning available.
  • This video on the controversy of intelligence includes information on topics such as theories of intelligence, emotional intelligence, and measuring intelligence. Closed captioning available.
  • This video on the brains vs. bias includes information on topics such as intelligence testing, testing bias, and stereotype threat. Closed captioning available.

Access for free at  https://openstax.org/books/psychology-2e/pages/1-introduction

Introduction to Psychology Copyright © 2020 by Julie Lazzara is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Share This Book

BRIEF RESEARCH REPORT article

Relation between metacognitive strategies, motivation to think, and critical thinking skills.

Carlos J. Ossa

  • 1 Educations Science Department, University of the Bío Bío, Concepción, Chile
  • 2 Psychology Faculty, University of Salamanca, Salamanca, Spain

Critical thinking is a complex reasoning skill, and even though it is hard to reach a consensus on its definition, there is agreement on it being an eminently cognitive skill. It is strongly related with reflective and metacognitive skills, as well as attitudinal or motivational aspects, although no model has yet been able to integrate these three elements. We present herein the preliminary results of a study seeking to establish these relations, in a sample of Chilean university students. 435 students from three universities participated, of which 88 were men, 333 were women, and 14 did not indicate their gender. Their ages ranges between 18 and 51 years old ( M  = 21, SD = 3.09). Three instruments were applied, one to measure metacognitive strategies, one to measure motivation to critical thinking, and a third to measure critical thinking skills. The relation was analyzed via structural equations. The results show a positive, strong, and significant relation between metacognition and motivation to think. However, only a weak significant relation was observed between motivation to think and critical thinking, and no direct relation was found between metacognition and critical thinking. We hypothesize a significant but moderate relation between the variables, where metacognition influences motivation to think, which in turn influences critical thinking skills. Factors are discussed which could negatively affect the studied relations, as well as the importance of generating integrated models between the three variables, as they would show a theoretical and empirical link.

Introduction

Critical thinking is a relevant topic for the 21st century, highlighted by Unesco as one of the skills to develop among students to properly face the challenges of this century ( Scott, 2015 ). Despite its importance for human development, its implementation in educational curricula has been difficult to carry out, both at the level of school systems and in higher education systems ( Ossa et al., 2018 ; Silva Pacheco, 2019 ).

This difficulty of incorporating critical thinking into the educational process may be related with the complexity of the task. On one side, there is discussion as to whether the process can be taught as a skill, or whether it is more of a facet of thinking which can only be stimulated in a concrete way ( Saiz, 2017 ). Building on this factor, the complexity of the matter is also expressed in the attempts at defining the process, since there are various definitions of critical thinking. These definitions present different natures, ranging from only cognitive reasoning processes; cognitive and metacognitive processes; cognitive, metacognitive and attitudinal processes; and finally, cognitive, metacognitive, attitudinal, and social agency processes ( Montero, 2010 ; Rivas and Saiz, 2012 ; Ossa and Díaz, 2017 ; Saiz, 2017 ).

As society and socio-cultural challenges have become more complex, it is necessary to adopt more complex perspectives on human processes. Critical thinking perspectives which help integrate diverse processes could be more pertinent for the effective development of this skill among people ( Paul and Elder, 2003 ).

Critical thinking has been linked to different skills, both cognitive and non-cognitive, for example, problem solving, scientific reasoning, motivation, metacognition, and now ultimately creativity ( Saiz and Rivas, 2008 ; Tamayo-Alzate et al., 2019 ; Halpern and Dunn, 2021 ; Muñoz and Ruiz, 2022 ; Santana et al., 2022 ). Of these skills, problem solving has been incorporated as a constituent element of critical thinking in some models; Likewise, motivation and metacognition are closely related factors and it has been proposed that they are satellite skills for critical thinking processes ( Valenzuela and Nieto, 2008 ; Rivas and Saiz, 2011 ; García, 2022 ), although no empirical information has been shown to clearly demonstrate this. The objective of this paper is precisely to show the relationship between motivation to think and metacognition with critical thinking, in order to contribute to what is proposed.

Critical thinking, motivation, and metacognition

Even when critical thinking is a broadly used concept in the academic and educational world, with a wide range of studies in the last decade, it continues to be a difficult phenomenon to conceptualize and to create little consensus ( Ossa et al., 2016 ; Saiz, 2017 ; Díaz et al., 2019 ).

It is conceptualized as a cognitive mechanism which filters information about the ideological intentions accompanying said information, via continual questioning of knowledge production practices, and the recognition of its different perspectives ( Yang and Chung, 2009 ; Montero, 2010 ).

It is a type of thinking oriented toward data and action, in a context of solving problems and interacting with other people ( Daniel and Auriac, 2012 ; López, 2012 ). Critical thinking is self-directed, self-disciplined, self-regulated and self-corrected. It involves undergoing rigorous standards of excellence and a conscious dominion of its use. It also implies effective communication and the development of problem solving skills ( Saiz and Rivas, 2008 , 2012 , 2016 ).

Critical thinking is characterized by generating higher-level cognitive processing in people, centered on the skills of reflecting, comprehension, evaluation and creation. It therefore requires high intellectual development. However, it is also a skill which can be developed, since there are no important differences between people with average and high intellectual levels with regards to developing critical thinking ( Sierra et al., 2010 ).

Since critical thinking is a high-level cognitive process, and the ability to generate an elaborated thought, a close relation has been proposed with elements which are not considered merely cognitive, including metacognition ( Rivas et al., 2022 ). Metacognition is a reflective process which helps deepen thought, regulate, and generate consciousness about thought ( Tamayo-Alzate et al., 2019 ; Drigas and Mitsea, 2020 ). It has been worked on as both a reflective process of self-knowledge, and as a skill which helps develop other cognitive processes including memory, learning, or even intelligence, since different levels of application can be established in its use ( Drigas and Mitsea, 2021 ).

There is evidence that metacognitive strategies can influence critical thinking and its components. For one, it improves the use of metacognitive strategies due to intervention in critical thinking. It also improves the use of critical thinking with metacognitive strategies in interventions done with psychology students at universities ( Ossa et al., 2016 ; Rivas et al., 2022 ). Significant and positive relations have also been found between critical thinking and metacognitive consciousness among medical students, although not for regulation and knowledge tasks ( de la Portilla Maya et al., 2022 ).

In this way, we can observe a relative influence on the way that people think about thinking, since metacognition supports decision making and final evaluation about strategies to resolve problems ( Rivas et al., 2022 ).

Some authors also indicate the presence of another non-cognitive component in critical thinking, which is disposition or motivation ( Facione et al., 2000 ; Saiz and Rivas, 2008 ; Marin and Halpern, 2011 ; Valenzuela et al., 2014 ; Halpern and Dunn, 2023 ). This component is fundamental to achieve this skill, since even when the indicated cognitive functions are available, if people either lack the desire to apply critical thinking or deem it inconvenient to do so, critical thinking will not be adequately manifested ( Valenzuela and Nieto, 2008 ; Valenzuela et al., 2014 ).

This non-cognitive element is based on human attitudes or motivations which complement the use of critical thinking, allowing it to be better developed, since they drive personal improvement ( Boonsathirakul and Kerdsomboon, 2021 ). The factors presented as facets of a disposition toward critical thinking include seeking truth, open-mindedness, being analytical, systematicity, curiosity, self-confidence and maturity (Facione, in Boonsathirakul and Kerdsomboon, 2021 ).

However, considering these non-cognitive elements as dispositions of a being also involves assuming certain personality traits or dimensions of values which cannot always be adequately measured. They should thus be considered more as motivational aspects, since they could be better defined and with a greater possibility of modification, given that they are more related with behavioral and perceptual elements ( Valenzuela et al., 2014 , 2023 ). From this perspective, we understand that non-cognitive components are based on the expectations and value given to the task. In this way, we establish a direct and causal relation between motivation and critical thinking, where the former explains critical thinking development by between 8 and 17%, according to the instrument used to measure it ( Valenzuela et al., 2023 ).

In this way, promoting motivational aspects is a relevant factor for developing cognitive and metacognitive processes, since complex processes are exhausting and require a high and constant investment of cognitive and emotional factors ( Valenzuela and Nieto, 2008 ; Valenzuela and Saiz, 2010 ; Gaviria, 2019 ; Nieto-Márquez et al., 2021 ).

Finally, a relative relation has been noted between motivational processes and metacognitive strategies. Correa et al. (2019) performed an evaluation among Chilean high school students about the use of metacognitive strategies and motivation to critical thinking in bias recognition. They found a positive, significant, and medium-intensity correlation ( r  = 0.50, p  < 0.001) between both variables, which indicates that cognitive and non-cognitive factors have a relevant link for human thought.

With the aforementioned background, we can hypothesize the existence of a significant and positive relation between critical thinking, metacognitive strategies, and motivation to think critically; that motivation to think directly affects critical thinking; and those metacognitive strategies are related with both variables.

In this article it will be showed preliminary results from this relation, presenting a relational model based on structural equations which would allow for establishing direct and mediated relations between said variables.

A correlational study was done via structural equations.

Participants

435 students from pedagogy majors at three Chilean universities participated in the study. Of these, 88 were male (20.2%), 333 were female (76.6%), 7 were students of unidentified gender (1.6%), and 7 did not respond (1.6%). Students’ ages fell between 18 and 51 years ( M  = 21, SD = 3.09). The careers to which the students belong are in the area of pedagogy, in specialties of mathematics (22%), history (8%), science (15%), special education (15%), and early childhood education (40%).

Instruments

For this study, a battery with three instruments was applied:

1. Metacognitive strategy questionnaire from O’Neil and Abedi, adapted into Spanish by Martínez (2007) . This measure metacognitive strategies applied to different academic tasks. There are 20 items organized into three dimensions: self-knowledge (referring to metacognitive consciousness), self-regulation (referring to metacognitive control), and evaluation (referring to global task evaluation). Results are recorded with a Likert-type scale of 5 choices (0 to 4 points). This instrument has been applied to Chilean university students and shown adequate reliability indicators. The global Cronbach’s α was 0.87, and for the dimensions it was between 0.62 and 0.65 ( Correa et al., 2019 ).

2. Critical thinking motivation questionnaire from Valenzuela, measuring the intention of applying thinking to knowledge tasks, based on personal expectations and the value of the task. It contains 19 items organized into 5 dimensions: Expectation ( α  = 0.774), Importance ( α  = 0.770), Cost ( α  = 0.775), Utility ( α  = 0.790) and Interest ( α  = 0.724). Its results are recorded based on a Likert-type scale with 5 alternatives (0–4 points). It has been applied to Chilean university students with strong reliability indicators. The global Cronbach’s α was 0.92, and the values for its dimensions ranged from 0.69 to 0.83 ( Valenzuela and Nieto, 2008 ; Correa et al., 2019 ).

3. Critical thinking task test from Miranda, adapted by Palma Luengo et al. (2021) . This measured the capacity to apply cognitive critical thinking processes to socio-scientific topics. It contains 15 items organized into three dimensions: inquiry (referring to identifying useful information), analysis (referring to the decision to use pertinent and reliable data), and arguing (referring to providing arguments with useful and reliable data). Its results are recorded with a sequence of scores ranging from 0 to 3 points, based on a performance rubric. It has been applied to a sample of Chilean university students with moderately adequate reliability indicators. The overall Cronbach’s α was 0.67, with moderately low values in its dimensions ranging from 0.47 to 0.60 ( Palma Luengo et al., 2021 ).

Three metacognition questions were incorporated into this instrument to reflect on the tasks being done, one for each dimension (e.g., How are you so confident about knowing how to do the activity? ). Two questions about motivation to thinking were also included, in the middle and at the end of the test, seeking to analyze whether there was a disposition to answer a question in a more voluntary form (e.g., Do you want to finish the test here or do you want to continue to delve deeper into the topic? ). The overall Cronbach’s α was 0.78 (five dimensions), and the values were moderately adequate within these dimensions (0.54 for metacognition and 0.73 for motivation).

We made contact with the directors of the pedagogy majors at three different universities, coordinating the process and determining the courses to consider. After this, a talk was carried out in each course, inviting students to participate in the study. Written informed consent was incorporated into the survey, indicating the study objectives and describing the anonymous and voluntary nature of participation. Open consultations were made about participation in applying the surveys, applying the battery of instruments only to those who wished to participate.

After answering the instruments, the data was emptied into a digital database and analyzed with SPSS v.27 and RStudio software. For data analysis, we used inferential and multivariate statistics. For all inference effects, a 5% significance threshold has been considered. In the structural models, we applied formats from Partial Least Squares (SEM-PLS).

We present an application of structural equations based on partial least squares (PLS), designed to model behavioral situations and social sciences. According to Wold (1980) it is fairly flexible, since it is useful for small sample sizes and also does not require distributional assumptions for the variables, along with being useful for predictive analysis as well as theoretical confirmation. With the PLS format, there are three methodological considerations which are relevant for application: (i) choosing variable with items that effectively belong, (ii) valuing items’ reliability and validity, and (iii) properly interpreting the coefficients.

As indicated in this type of modeling, there are two sections. The first is the measurement model, where each dimension is formatively related with its items: i.e., the item contributes to the variable with a certain coefficient called weight ( w ). This factorial weight represents the weighting of the dimension regarding the latent variable which it intends to measure, so that we can expect it to have sufficient magnitude to be statistically significant.

To begin, for the Metacognition variable, the scores for Self-Knowledge ( w  = 0.67, p  < 0.001, 95% IC: 0.41; 0.97) and Evaluation ( w  = 0.34, p  < 0.01, 95% IC: 0.12; 0.56) are relevant for generating the latent indicator. For the Motivation variable, the scores for Expectations ( w  = 0.21, p  < 0.05, 95% IC: 0.25; 0.62), Importance ( w  = 0.43, p  < 0.001, 95% IC: 0.14; 0.60), and Usefulness ( w  = 0.39, p  < 0.001, 95% IC: 0.19; 0.25) are representative when generating this indicator. For Critical Thinking, only the Metacognition indicator ( w  = 0.71, p  < 0.05, 95% IC: 0.56; 0.86) turned out to be appropriate.

The second section of this type of models is called the structural model. It shows the causality relations between the latent variables. Schematically, we consider that a variable X is the cause of another variable Y, and an arrow will go from X to Y. For this study, the relational schematic between variables is given by the following hypothesis set:

H1 : There is a positive effect of the Metacognition Strategy (ME) on Critical Thinking Motivation (MO). H2 : There is a positive effect of Metacognition Strategy (ME) on Critical Thinking (PC). H3 : There is a positive effect of Critical Thinking Motivation (MO) on Critical Thinking (PC).

Figure 1 shows the hypotheses combined with their respective variables, indicating the measurement and structural models.

www.frontiersin.org

Figure 1 . Schematic of hypothesis and effects expected. Structural equation model. Source: authors.

The empirical results from the model appear in Table 1 with their significance level.

www.frontiersin.org

Table 1 . Structural equation model results.

Finally, in the structural model ( Figure 2 ), we can see the fulfillment of hypothesis H1 ( B  = 0.56, p  < 0.001, 95% IC: 0.49; 0.63) where a greater perception of Metacognition leads to a greater level of Critical Thinking Motivation. There is also fulfillment for hypothesis H3 ( B  = 0.21, p  < 0.01, 95% IC: 0.06; 0.34) indicating that greater levels of Critical Thinking Motivation lead to a greater level of Critical Thinking.

www.frontiersin.org

Figure 2 . Results schematic. Structural equations model. Source: authors.

Our preliminary study results show ties between the three variables, as indicated both in theory ( Facione et al., 2000 ; Valenzuela and Nieto, 2008 ; Tamayo-Alzate et al., 2019 ) and in other studies ( Correa et al., 2019 ; Rivas et al., 2022 ; Valenzuela et al., 2023 ). However, we found some disparate data with regards to the latter points.

For the structural models, hypotheses H1 and H3 have been fulfilled, reporting statistically significant evidence that greater perceived Metacognition explains a greater level of Critical Thinking Motivation, a greater level of Critical Thinking Motivation implies a higher level of Critical Thinking.

One important aspect here is that a significant relation was found between motivation and critical thinking skills, which is supported by Valenzuela et al. (2023) . While the value of the relation is moderate, it can be related, as presented in the aforementioned study, and may be due to the type of instrument used to measure critical thinking. One notable aspect is that the motivation question incorporated into the critical thinking task instrument had little weight within this instrument. However, this could be explained because the questions sought to consider effort for the task. Reviewing the components of the critical thinking motivation survey, the dimensions with the strongest ties were those oriented towards expectations, usefulness and importance, not effort or energy costs.

It is possible that the relationship between metacognition and motivation to think is established because, from the theoretical model used ( Valenzuela and Nieto, 2008 ; Valenzuela et al., 2014 ), the expectation of the task, and its assessment of usefulness (aspects motivation), require an evaluation process (metacognitive aspect); However, this idea must be deepened and reviewed in more detail.

Considering metacognition, no direct relation was observed between the instrument used in this study to measure the metacognitive strategies of self-knowledge, self-regulation and evaluation on one hand, and critical thinking on the other. This situation goes against other studies’ findings ( de la Portilla Maya et al., 2022 ; Rivas et al., 2022 ), and may be explained by the type of instrument used, which may not be sensitive to the critical thinking tasks measured by the test from Palma Luengo et al. (2021) .

The relation discovered about metacognition supporting critical thinking motivation, in order to thus achieve better critical thinking, is one of the key relevant findings in this study. It implies that reflecting on oneself and tasks can generate greater expectations and evaluation for the task, which can drive better performance. These results still need more breadth and depth from further research.

This study is only a preliminary report of results, to account for the relationship between the aforementioned variables and propose that critical thinking benefits from metacognitive and motivational work. Its limitations are the fact that its objective was only empirical, in order to account for the relationship raised in studies ( Valenzuela and Nieto, 2008 ), so the theoretical depth was less. On the other hand, there was a limited number of participating students, and only from some university majors. Likewise, it is considered that the critical thinking test that was used presents adequate reliability values overall, but with less powerful values in some of its dimensions (specifically, inquiry and motivation). It is considered necessary to replicate the study with another instrument and a larger sample to more fully support the results found.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by Pedro Labraña Research Unit of Bio-Bio University. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author contributions

CO: Conceptualization, Methodology, Project administration, Writing – original draft. SR: Investigation, Supervision, Writing – review & editing. CS: Conceptualization, Methodology, Writing – review & editing.

The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This article was developed in part with funding from FONDECYT Project 11220056, from the Chilean National Research and Development Agency (ANID).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Boonsathirakul, J., and Kerdsomboon, C. (2021). The investigation of critical thinking disposition among Kasetsart University students. High. Educ. Stud. 11, 224–232. doi: 10.5539/hes.v11n2p224

CrossRef Full Text | Google Scholar

Correa, J., Ossa, C., and Sanhueza, P. (2019). Sesgo en razonamiento, metacognicion y motivación al pensamiento crítico en estudiantes de primer año medio de un establecimiento de Chillán. Rev. Estud. Exp. Educ. 18, 61–77. doi: 10.21703/rexe.20191837correa8

Daniel, M., and Auriac, E. (2012). Philosophy, critical thinking and philosophy for children. Educ. Philos. Theory 43, 415–435. doi: 10.1111/j.14695812.2008.00483.x

de la Portilla Maya, S. R., Duque Dussán, A. M., Landínez Martínez, D. A., Montoya Londoño, D. M., and Gutiérrez De Blume, A. P. (2022). Pensamiento crítico y conciencia metacognitiva en una muestra de estudiantes de Medicina. Latinoamericana de Estudios Educativos 18, 145–168. doi: 10.17151/rlee.2022.18.1.8

Díaz, C., Ossa, C., Palma, M., Lagos, N., and Boudon, J. (2019). El concepto de pensamiento crítico según estudiantes chilenos de pedagogía. Sophia 26, 267–288.

Google Scholar

Drigas, A., and Mitsea, E. (2020). The 8 pillars of metacognition. Int. J. Emerg. Technol. Learn. 15, 162–178. doi: 10.3991/ijet.v15i21.14907

Drigas, A., and Mitsea, E. (2021). 8 pillars X 8 layers model of metacognition: educational strategies, exercises and trainings. Int. J. Online Biomed. Eng. 17, 115–134. doi: 10.3991/ijoe.v17i08.23563

Facione, P. A., Facione, N. C., and Giancarlo, C. A. (2000). The disposition toward critical thinking: its character, measurement, and relationship to critical thinking. Informal Logic 20, 61–84. doi: 10.22329/il.v20i1.2254

García, E. J. A. (2022). Motivación, pensamiento crítico y metacognición:¿ esenciales para aprender? Reflexiones sobre calidad educativa. Rev. Dialog. 7, 79–88. doi: 10.37594/dialogus.v1i7.527

Gaviria, C. (2019). Pensar la Historia con el Deseo: Metacognición, Motivación y Comprensión Histórica. Rev. Colomb. Psicol. 28, 147–164. doi: 10.15446/rcp.v28n1.70763

Halpern, D. F., and Dunn, D. (2021). Critical thinking: a model of intelligence for solving real-world problems. J. Intelligence 9:22. doi: 10.3390/jintelligence9020022

PubMed Abstract | CrossRef Full Text | Google Scholar

Halpern, D. F., and Dunn, D. (2023). Thought and knowledge. An introduction to critical thinking . 6ª Edn. New york: Taylor and Francis.

López, G. (2012). Pensamiento crítico en el aula. Docencia e Investigación XXXVII, 41–60.

Marin, L., and Halpern, D. (2011). Pedagogy for developing critical thinking in adolescents: explicit instruction produces greatest gains. Think. Skills Creat. 6, 1–13. doi: 10.1016/j.tsc.2010.08.002

Martínez, J. (2007). Concepción de aprendizaje y estrategias metacognitivas en estudiantes universitarios de psicología. Anal. Psicol. 23, 7–16.

Montero, M. (2010). Crítica, autocrítica y construcción de teoría en la psicología social latinoamericana. Rev. Colomb. Psicol. 19, 177–191.

Muñoz, C., and Ruiz, A. (2022). Programa estratégico lector para desarrollar el pensamiento crítico-creativo en estudiantes de secundaria. Revista Innova Educación 4, 159–175. doi: 10.35622/j.rie.2022.02.010.es

Nieto-Márquez, N., García-Sinausía, S., and Pérez Nieto, M. (2021). Relaciones de la motivación con la metacognición y el desempeño en el rendimiento cognitivo en estudiantes de educación primaria. Anales de psicología 37, 51–60. doi: 10.6018/analesps.383941

Ossa, C., and Díaz, A. (2017). Enfoques intraindividual e interindividual en programas de pensamiento crítico. Psicol. Esc. Educ. 21, 593–600. doi: 10.1590/2175-353920170213111121

Ossa, C., Lepe, N., Díaz, A., Merino, J., and Larraín, A. (2018). Programas de pensamiento crítico en la formación de docentes Iberoamericanos. Profesorado 22, 443–462. doi: 10.30827/profesorado.v22i4.8432

Ossa, C., Rivas, S.F., and Saiz, C. (2016). Estrategias metacognitivas en el desarrollo del análisis argumentativo. En:J. Casanova and C. Bisinoto Y L. Almeida. IV Seminário Internacional Cognição, aprendizagem e desempenho. Livro de atas Braga: Livro de atas (pp. 30–47).

Palma Luengo, M., Ossa Cornejo, C., Ahumada Gutiérrez, H., Moreno Osorio, L., and Miranda Jaña, C. (2021). Adaptación y validación del test Tareas de Pensamiento Crítico en estudiantes universitarios. Rev. Estud. Exp. Educ. 20, 199–212. doi: 10.21703/rexe.20212042palma12

Paul, R., and Elder, L. (2003). La mini-guía para el Pensamiento crítico . Conceptos y herramientas. Ed. Fundación para el Pensamiento Crítico. Available at: http://www.criticalthinking.org

Rivas, S. F., and Saiz, C. (2012). Validación y propiedades psicométricas de la prueba de pensamiento crítico PENCRISAL. Rev. Electrón. Metodol. Aplic. 17, 18–34.

Rivas, S. F., Saiz, C., and Ossa, C. (2022). Metacognitive strategies and development of critical thinking in higher education. Front. Psychol. 13:913219. doi: 10.3389/fpsyg.2022.913219

Saiz, C. (2017). Pensamiento Crítico y Cambio . Madrid: Pirámide.

Saiz, C., and Rivas, S. F. (2008). Evaluación en pensamiento crítico: una propuesta para diferenciar formas de pensar. Ergo. Nueva Época 22-23, 25–66.

Saiz, C., and Rivas, S. F. (2011). Evaluation of the ARDESOS programs: an initiative to improve critical thinking skills. J. Scholarsh. Teach. Learn. 11, 34–51.

Saiz, C., and Rivas, S. F. (2012). Pensamiento crítico y aprendizaje basado en problemas cotidianos. Rev. Docen. Universit. 10, 325–346. doi: 10.4995/redu.2012.6026

Saiz, C., and Rivas, S. F. (2016). New teaching techniques to improve critical thinking. DIAPROVE. Methodol. 40, 3–36.

Santana, L. M. Q., Cedeño, B. J. B., Atoche, C. B., Torres, C. V. G., Preciado, M. P. U., and Quito, C. R. M. (2022). Estrategias metacognitivas y pensamiento crítico en docentes. Ciencia Latina Revista Científica Multidisciplinar 6, 649–675. doi: 10.37811/cl_rcm.v6i1.1529

Scott, C. L. (2015). El futuro del aprendizaje 2 ¿Qué tipo de aprendizaje se necesita en el siglo XXI? Investigación y Prospectiva en Educación UNESCO, París. [Documentos de Trabajo ERF, No. 14]. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000242996_spa

Sierra, J., Carpintero, E., and Pérez, L. (2010). Pensamiento crítico y capacidad intelectual. Faísca 15, 98–110.

Silva Pacheco, C. (2019). El desarrollo del pensamiento crítico en la propuesta curricular de la educación del arte en Chile. Estud. Pedagóg. 45, 79–92. doi: 10.4067/S0718-07052019000300079

Tamayo-Alzate, O., Cadavid-Alzate, V., and Montoya-Londoño, D. (2019). Análisis metacognitivo en estudiantes de básica, durante la resolución de dos situaciones experimentales en la clase de ciencias Naturales. Rev. Colomb. Educ. 1, 117–141. doi: 10.17227/rce.num76-4188

Valenzuela, J., and Nieto, A.M. (2008). Motivación y Pensamiento Crítico: Aportes para el estudio de esta relación. Available at: http://reme.uji.es/articulos/numero28/article3/article3.pdf

Valenzuela, J., Nieto, A. M., and Muñoz, C. (2014). Motivación y disposiciones: enfoques alternativos para explicar el desempeño de habilidades de pensamiento crítico. Rev Electrón. Investig. Educat. 16, 16–32.

Valenzuela, J., Nieto, A., Ossa, C., Sepúlveda, S., and Muñoz, C. (2023). Relaciones entre factores motivacionales y pensamiento crítico. Eur. J. Educat. Psychol. 16, 1–18. doi: 10.32457/ejep.v16i1.2077

Valenzuela, J., and Saiz, C. (2010). Percepción sobre el coste de pensar críticamente en universitarios chilenos y españoles. Electron. J. Res. Educ. Psychol. 8, 689–706.

Wold, H. (1980). “Model construction and evaluation when theoretical knowledge is scarce,” in Evaluation of econometric models . eds. J. Kmnta and J. Ramsey (Cambridge: Academic press).

Yang, S. C., and Chung, T. Y. (2009). Experimental study of teaching critical thinking in civic education in Taiwanese junior high school. Br. J. Educ. Psychol. 79, 29–55. doi: 10.1348/000709907X238771

Keywords: critical thinking, structural models, cognition, motivation, pedagogy

Citation: Ossa CJ, Rivas SF and Saiz C (2023) Relation between metacognitive strategies, motivation to think, and critical thinking skills. Front. Psychol . 14:1272958. doi: 10.3389/fpsyg.2023.1272958

Received: 04 August 2023; Accepted: 13 November 2023; Published: 04 December 2023.

Reviewed by:

Copyright © 2023 Ossa, Rivas and Saiz. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Carlos J. Ossa, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

Kristen A. Carter MS

Metacognition’s Role in Decision Making

Metacognition can help us to think outside the box..

Posted February 14, 2024 | Reviewed by Hara Estroff Marano

  • Research tells us that making creative decisions is not necessarily related to intelligence.
  • We can use metacognition to draw from a wide range of problem-solving strategies.
  • Metacognition is a cognitive skill that can be taught and nurtured.

Source: ismagilov | iStock

We make decisions all day long. Some of them are based on careful consideration, some are based on past experiences, and some just seem to come without much thought. Decisions come in all forms. Results can be good, bad, or unclear.

Research tells us that our decision-making ability is not necessarily linked to intelligence but rather to personality , motivation , and willingness to learn. We all have goals and we want to find a way to reach them.

More complex decisions require problem-solving, strategies, re-framing, creative thinking , and possibly seeking advice from others. In addition, there often is the matter of evaluating the difficulty of the task. Is it within or beyond a person’s perceived capabilities?

There is another key player in the mix when it comes to making effective decisions and following up with appropriate actions. It has to do with being able to reflect on one’s thinking and make adjustments that bring about the desired outcome.

The intricacies of how we make decisions are directly related to our facility of metacognition . Metacognition is often referred to as the ability to “think about our thinking.” It includes knowledge about oneself and the ability to select effective strategies, as well as being able to evaluate task performance. Importantly, it includes knowledge about oneself as a learner. Can the person trust their abilities to evaluate all phases of the decision-making process?

An everyday example

Let’s bring this down to what this can look like when making daily decisions that may affect health and well-being,

There is a person whose goal is to eat healthier and lose weight. He/she has decided that there is a specific number on the scale that matters, and a restrictive diet has been chosen. Let’s say that person is then confronted with making choices at a dinner buffet. They can select a small plate of items that are part of the program or a large plate filled with favorites as well as a plan to go back for dessert. It’s decision time!

The person can say to themselves, “Well, just this once, I am going to go for it. I will be better with my eating tomorrow.” Or they may say, “OK, I am going to garner all my willpower and do the right thing here.”

Alternatively, using metacognition would look like this: The person says, “Uh oh, this is a situation that is challenging for me. I can reframe this and come up with a creative solution. I do not have to think about this as my last chance to pig out. I can be more selective and choose what will please me most, using reasonable portion size as a guide. That way, I can enjoy the experience and still reach my goals.” This person is evaluating and shifting their thoughts in order to achieve their goal, rather than trying to following some rules.

This is a fairly simplistic example, but it describes a scenario that is fairly common.

Learning about metacognition

Why don’t people take advantage of metacognition more often, in this context and others?

In spite of that simplistic example, there are some complexities here. Metacognition is a vital part of being able to think creatively, as seen in the example. Research by Akcaoglu, Mor and Kulekci (2023) indicates that, “As a skill, metacognitive awareness is one of the core components of self-regulated learning.” The example above shows how metacognition links to self-regulated learning via creative thinking, curiosity, and willingness to learn.

Notice that the expression is metacognitive awareness. Not everyone has that awareness. This came to the attention of John Flavell in the 1970s when he was first formulating the concept of metacognition. His focus at the time was educational psychology.

Broadly speaking, metacognition is a skill like other cognitive skills in that some people have more of it than others. For many people, developing the skill comes from having been exposed to the concept and learning how to use it.

Flavell envisioned an educational system that describes metacognition, and supports development of it. He indicated that metacognition includes awareness that a person’s beliefs about themselves affect their learning process. Additionally, metacognitions may not be correct. Being able to evaluate the thought process and results is important. According to Flavell, these features can become part of the educational process.

Stimulating metacognition

In line with Flavell’s observations, there are ways to encourage metacognition by asking certain questions. Here are some examples:

  • Are you aware that we all have habits around how we think?
  • What are your beliefs about how difficult this task is going to be?
  • Have you used some creative strategies for this challenge in the past? What were they?
  • Sometimes, we evaluate our mistakes so that we can learn from them. Could you do that with this process?

human thinking and problem solving

The point here is to stimulate new thoughts that are outside the usual box. Also to see the potential to do so, and the benefits. Ultimately, the goal is nurture metacognition skills when devising solutions to a problem, whether it is healthier eating or something else.

Flavell, J. (1979) Theories of Learning in Educational Psychology. American Psychologist. 34: 906-911.

Akcaoglu, M.O.. Mor, E., Kulekci, E. (2023). The mediating role of metacognitive awareness in the relationship between critical thinking and self-regulation. Thinking Skills and Creativity. 47:101187.

Basu, S & Dixit, S. (2022). Role of metacognition in explaining decision-making styles: A study of knowledge about cognition and regulation of cognition. Personality and Individual Differences. 185:111318.

Kristen A. Carter MS

Kristen Carter, M.S., is an exercise physiologist and the author of The End of Try Try Again: Overcome Your Weight Loss and Exercise Struggles for Good.

  • Find a Therapist
  • Find a Treatment Center
  • Find a Support Group
  • International
  • New Zealand
  • South Africa
  • Switzerland
  • Asperger's
  • Bipolar Disorder
  • Chronic Pain
  • Eating Disorders
  • Passive Aggression
  • Personality
  • Goal Setting
  • Positive Psychology
  • Stopping Smoking
  • Low Sexual Desire
  • Relationships
  • Child Development
  • Therapy Center NEW
  • Diagnosis Dictionary
  • Types of Therapy

January 2024 magazine cover

Overcome burnout, your burdens, and that endless to-do list.

  • Coronavirus Disease 2019
  • Affective Forecasting
  • Neuroscience

Thank you for visiting nature.com. You are using a browser version with limited support for CSS. To obtain the best experience, we recommend you use a more up to date browser (or turn off compatibility mode in Internet Explorer). In the meantime, to ensure continued support, we are displaying the site without styles and JavaScript.

  • View all journals
  • My Account Login
  • Explore content
  • About the journal
  • Publish with us
  • Sign up for alerts
  • Open access
  • Published: 09 June 2023

Impact of artificial intelligence on human loss in decision making, laziness and safety in education

  • Sayed Fayaz Ahmad 1 ,
  • Heesup Han   ORCID: orcid.org/0000-0001-6356-3001 2 ,
  • Muhammad Mansoor Alam 3 ,
  • Mohd. Khairul Rehmat 4 ,
  • Muhammad Irshad 5 ,
  • Marcelo Arraño-Muñoz 6 &
  • Antonio Ariza-Montes   ORCID: orcid.org/0000-0002-5921-0753 7  

Humanities and Social Sciences Communications volume  10 , Article number:  311 ( 2023 ) Cite this article

81k Accesses

5 Citations

71 Altmetric

Metrics details

  • Information systems and information technology

A Correction to this article was published on 29 June 2023

This article has been updated

This study examines the impact of artificial intelligence (AI) on loss in decision-making, laziness, and privacy concerns among university students in Pakistan and China. Like other sectors, education also adopts AI technologies to address modern-day challenges. AI investment will grow to USD 253.82 million from 2021 to 2025. However, worryingly, researchers and institutions across the globe are praising the positive role of AI but ignoring its concerns. This study is based on qualitative methodology using PLS-Smart for the data analysis. Primary data was collected from 285 students from different universities in Pakistan and China. The purposive Sampling technique was used to draw the sample from the population. The data analysis findings show that AI significantly impacts the loss of human decision-making and makes humans lazy. It also impacts security and privacy. The findings show that 68.9% of laziness in humans, 68.6% in personal privacy and security issues, and 27.7% in the loss of decision-making are due to the impact of artificial intelligence in Pakistani and Chinese society. From this, it was observed that human laziness is the most affected area due to AI. However, this study argues that significant preventive measures are necessary before implementing AI technology in education. Accepting AI without addressing the major human concerns would be like summoning the devils. Concentrating on justified designing and deploying and using AI for education is recommended to address the issue.

Introduction

Artificial intelligence (AI) is a vast technology used in the education sector. Several types of AI technology are used in education (Nemorin et al., 2022 ). Majorly includes Plagiarism Detection, Exam Integrity (Ade-Ibijola et al., 2022 ), Chatbots for Enrollment and Retention (Nakitare and Otike, 2022 ), Learning Management Systems, Transcription of Faculty Lectures, Enhanced Online Discussion Boards, Analyzing Student Success Metrics, and Academic Research (Nakitare and Otike, 2022 ). Nowadays, Education Technology (EdTech) companies are deploying emotional AI to quantify social and emotional learning (McStay, 2020 ). Artificial intelligence, affective computing methods, and machine learning are collectively called “emotional AI” (AI). Artificial intelligence (AI) shapes our future more powerfully than any other century’s invention. Anyone who does not understand it will soon feel left behind, waking up in a world full of technology that feels more and more like magic (Maini and Sabri, 2017 ). Undoubtedly, AI technology has significant importance, and its role has been witnessed in the recent pandemic. Many researchers agree it can be essential in education (Sayed et al., 2021 ). but this does not mean it will always be beneficial and free from ethical concerns (Dastin, 2018 ). Due to this, many researchers focus on its development and use but keep their ethical considerations in mind (Justin and Mizuko, 2017 ). Some believe that although the intentions behind AI in education may be positive, this may not be sufficient to prove it ethical (Whittaker and Crawford, 2018 ).

There is a severe need to understand the meaning of being “ethical” in the context of AI and education. It is also essential to find out the possible unintended consequences of the use of AI in education and the main concerns of AI in education, and other considerations. Generally, AI’s ethical issues and concerns are innovation cost, consent issues, personal data misuse, criminal and malicious use, freedom and autonomy loss, and the decision-making loss of humans, etc. (Stahl B. C., 2021 a, 2021 b). Although, technology also enhances organizational information security (Ahmad et al., 2021 ) and competitive advantage (Sayed and Muhammad, 2015 ) and enhances customer relationships (Rasheed et al., 2015 ). Researchers are afraid that by 2030 the AI revolution will focus on enhancing benefits and social control but will also raise ethical concerns, and there is no consensus among them. A clear division regarding AI’s positive impact on life and moral standing (Rainie et al., 2021 ).

It is evident from the literature on the ethics of AI that besides its enormous advantages, many challenges also emerge with the development of AI in the context of moral values, behavior, trust, and privacy, to name a few. The education sector faces many ethical challenges while implementing or using AI. Many researchers are exploring the area further. We divide AI in education into three levels. First, the technology itself, its manufacturer, developer, etc. The second is its impact on the teacher, and the third is on the learner or student.

Foremost, there is a need to develop AI technology for education, which cannot be the basis of ethical issues or concerns (Ayling and Chapman, 2022 ). The high expectations of AI have triggered worldwide interest and concern, generating 400+ policy documents on responsible AI. Intense discussions over ethical issues lay a helpful foundation, preparing researchers, managers, policymakers, and educators for constructive discussions that will lead to clear recommendations for building reliable, safe, and trustworthy systems that will be a commercial success (Landwehr, 2015 ). But the question is, is it possible to develop an AI technology for education that will never cause an ethical concern? Maybe the developer or the manufacturer has dishonest gain from the AI technology in education. Maybe their intentions are not towards the betterment and assistance of education. Such questions come to mind when someone talks about the impact of AI in Education. Even if the development of AI technology is clear from any ethical concerns from the developer or manufacturer, there is no guarantee for the opposite view. The risk of ethical considerations will also rely upon the technical quality. Higher quality will minimize the risk but is it possible for all educational institutions to implement expensive technology of higher quality? (Shneiderman, 2021 ). Secondly, many issues may arise when teachers use AI technology (Topcu and Zuck, 2020 ). It may be security, usage, implementation, etc. Questions about security, bias, affordability, trust, etc., come to mind (IEEE, 2019 ). Thirdly, privacy, trust, safety, and health issues exist at the user level. To address such questions, a robust regulatory framework and policies are required. Still, unfortunately, no framework has been devised, no guidelines have been agreed upon, no policies have been developed, and no regulations have been enacted to address the ethical issues raised by AI in education (Rosé et al., 2018 ).

It is evident that AI technology has many concerns (Stahl B. C., 2021 a, 2021 b), and like other sectors, the education sector is also facing challenges (Hax, 2018 ). If not all the issues/problems directly affect education and learning, most directly or indirectly impact the education process. So, it is difficult to decide whether AI has a positive ethical impact on education or negative or somewhat positive or negative. The debate on ethical concerns about AI technology will continue from case to case and context to context (Petousi and Sifaki, 2020 ). This research is focused on the following three moral fears of AI in education:

Security and privacy

Loss of human decision-making

Making humans lazy.

Although many other concerns about AI exist in education, these three are the most common and challenging in the current era. Additionally, no researcher can broaden the study beyond the scope.

Theoretical discussion

Ai in education.

Technology has impacted almost every sector; reasonably, it also needs time (Leeming, 2021 ). From telecommunication to communication and health to education, it plays a significant role and assists humanity in one way or another (Stahl A., 2021 a, 2021 b). No one can deny its importance and applications for life, which provides a solid reason for its existence and development. One of the most critical technologies is artificial intelligence (AI) (Ross, 2021 ). AI has applications in many sectors, and education is one. Many AI applications in education include tutoring, educational assistance, feedback, social robots, admission, grading, analytics, trial and error, virtual reality, etc. (Tahiru, 2021 ).

AI is based on computer programming or computational approaches; questions can be raised on the process of data analysis, interpretation, sharing, and processing (Holmes et al., 2019 ) and how the biases should be prevented, which may impact the rights of students as it is believed that design biases may increase with time and how it will address concerns associated with gender, race, age, income inequality, social status, etc. (Tarran, 2018 ). Like any other technology, there are also some challenges related to AI and its application in education and learning. This paper focuses on the ethical concerns of AI in education. Some problems are related to privacy, data access, right and wrong responsibility, and student records, to name a few (Petousi and Sifaki, 2020 ). In addition, data hacking and manipulation can challenge personal privacy and control; a need exists to understand the ethical guidelines clearly (Fjelland, 2020 ).

Perhaps the most important ethical guidelines for developing educational AI systems are well-being, ensuring workplace safety, trustworthiness, fairness, honoring intellectual property rights, privacy, and confidentiality. In addition, the following ten principles were also framed (Aiken and Epstein, 2000 ).

Ensure encouragement of the user.

Ensure safe human–machine interaction and collaborative learning

Positive character traits are to be ensured.

Overloading of information to be avoided

Build an encouraging and curious learning environment

Ergonomics features to be considered

Ensure the system promotes the roles and skills of a teacher and never replaces him

Having respect for cultural values

Ensure diversity accommodation of students

Avoid glorifying the system and weakening the human role and potential for growth and learning.

If the above principles are discussed individually, many questions arise while using AI technology in education. From its design and planning to use and impact, at every stage, ethical concerns arise and are there. It’s not the purpose for which AI technology is developed and designed. Technology is advantageous for one thing but dangerous for another, and the problem is how to disintegrate the two (Vincent and van, 2022 ).

In addition to the proper framework and principles not being followed during the planning and development of AI for Education, bias, overconfidence, wrong estimates, etc., are additional sources of ethical concerns.

Security and privacy issues

Stephen Hawking once said that success in creating AI would be the most significant event in human history. Unfortunately, it might also be the last unless we learn to avoid the risks. Security is one of the major concerns associated with AI and learning (Köbis and Mehner, 2021 ). Trust-worthy artificial intelligence (AI) in education: Promises and challenges (Petousi and Sifaki, 2020 ; Owoc et al., 2021 ). Most educational institutions nowadays use AI technology in the learning process, and the area attracted researchers and interests. Many researchers agree that AI significantly contributes to e-learning and education (Nawaz et al. 2020 ; Ahmed and Nashat, 2020 ). Their claim is practically proved by the recent COVID-19 pandemic (Torda, 2020 ; Cavus et al., 2021 ). But AI or machine learning also brought many concerns and challenges to the education sector, and security and privacy are the biggest.

No one can deny that AI systems and applications are becoming a part of classrooms and education in one form or another (Sayantani, 2021 ). Each tool works according to its way, and the student and teacher use it accordingly. It creates an immersive learning experience using voices to access information and invites potential privacy and security risks (Gocen and Aydemir, 2020 ). While answering a question related to privacy concerns focuses on student safety as the number one concern of AI devices and usage. The same may go for the teacher’s case as well.

Additionally, teachers know less about the rights, acts, and laws of privacy and security, their impact and consequences, and any violations cost to the students, teachers, and country (Vadapalli, 2021 ). Machine learning or AI systems are purely based on data availability. Without data, it is nothing, and the risk is unavoidable of its misuse and leaks for a lousy purpose (Hübner, 2021 ).

AI systems collect and use enormous data for making predictions and patterns; there is a chance of biases and discrimination (Weyerer and Langer, 2019 ). Many people are now concerned with the ethical attributes of AI systems and believe that the security issue must be considered in AI system development and deployment (Samtani et al., 2021 ). The Facebook-Cambridge Analytica scandal is one of the significant examples of how data collected through technology is vulnerable to privacy concerns. Although much work has been done, as the National Science Foundation recognizes, much more is still necessary (Calif, 2021 ). According to Kurt Markley, schools, colleges, and universities have big banks of student records comprising data related to their health, social security numbers, payment information, etc., and are at risk. It is necessary that learning institutions continuously re-evaluate and re-design the security practices to make the data secure and prevent any data breaches. The trouble is even more in remote learning environments or when information technology is effective (Chan and Morgan, 2019 ).

It is also of importance and concern that in the current era of advanced technology, AI systems are getting more interconnected to cybersecurity due to the advancement of hardware and software (Mengidis et al., 2019 ). This has raised significant concerns regarding the security of various stakeholders and emphasizes the procedures the policymakers must adopt to prevent or minimize the threat (ELever and Kifayat, 2020 ). It is also important to note that security concerns increase with network and endpoints in remote learning. One problem is that protecting e-learning technology from cyber-attacks is neither easy nor requires less money, especially in the education sector, with a limited budget for academic activities (Huls, 2021 ). Another reason this severe threat exists is because of very few technical staff in an educational institution; hiring them is another economic issue. Although, to some extent, using intelligent technology of AI and machine learning, the level and threat of security decrease, again, the issue is that neither every teacher is a professional and trained enough to use the technology nor able to handle the common threats. And as the use of AI in education increases, the danger of security concerns also increases (Taddeo et al., 2019 ). No one can run from the threat of AI concerning cybersecurity, and it behaves like a double-edged sword (Siau and Wang, 2020 ).

Digital security is the most significant risk and ethical concern of using AI in education systems, where criminals hack machines and sell data for other purposes (Venema, 2021 ). We alter our safety and privacy (Sutton et al., 2018 ). The question remains: whether our privacy is secured, and when will AI systems become able to keep our confidentiality connected? The answer is beyond human knowledge (Kirn, 2007 ).

Human interactions with AI are increasing day by day. For example, various AI applications, like robots, chatbots, etc., are used in e-learning and education. Many will learn human-like habits one day, but some human attributes, like self-awareness, consciousness, etc., will remain a dream. AI still needs data and uses it for learning patterns and making decisions; privacy will always remain an issue (Mhlanga, 2021 ). On the one hand, it is a fact that AI systems are associated with various human rights issues, which can be evaluated from case to case. AI has many complex pre-existing impacts regarding human rights because it is not installed or implemented against a blank slate but as a backdrop of societal conditions. Among many human rights that international law assures, privacy is impacted by it (Levin, 2018 ). From the discussed review, we draw the following hypothesis.

H 1 : There is a significant impact of artificial intelligence on the security and privacy issues

AI is a technology that significantly impacts Industry 4.0, transforming almost every aspect of human life and society (Jones, 2014 ). The rising role of AI in organizations and individuals feared the persons like Elon Musk and Stephen Hawking. Who thinks it is possible when AI reaches its advanced level, there is a risk it might be out of control for humans (Clark et al., 2018 ). It is alarming that research increased eight times compared to the other sectors. Most firms and countries invest in capturing and growing AI technologies, skills, and education (Oh et al., 2017 ). Yet the primary concern of AI adoption is that it complicates the role of AI in sustainable value creation and minimizes human control (Noema, 2021 ).

When the usage and dependency of AI are increased, this will automatically limit the human brain’s thinking capacity. This, as a result, rapidly decreases the thinking capacity of humans. This removes intelligence capacities from humans and makes them more artificial. In addition, so much interaction with technology has pushed us to think like algorithms without understanding (Sarwat, 2018 ). Another issue is the human dependency on AI technology in almost every walk of life. Undoubtedly, it has improved living standards and made life easier, but it has impacted human life miserably and made humans impatient and lazy (Krakauer, 2016 ). It will slowly and gradually starve the human brain of thoughtfulness and mental efforts as it gets deep into each activity, like planning and organizing. High-level reliance on AI may degrade professional skills and generate stress when physical or brain measures are needed (Gocen and Aydemir, 2020 ).

AI is minimizing our autonomous role, replacing our choices with its choices, and making us lazy in various walks of life (Danaher, 2018 ). It is argued that AI undermines human autonomy and responsibilities, leading to a knock-out effect on happiness and fulfilment (C. Eric, 2019 ). The impact will not remain on a specific group of people or area but will also encompass the education sector. Teachers and students will use AI applications while doing a task/assignment, or their work might be performed automatically. Progressively, getting an addiction to AI use will lead to laziness and a problematic situation in the future. To summarize the review, the following hypothesis is made:

H 2 : There is a significant impact of artificial intelligence on human laziness

Technology plays an essential role in decision-making. It helps humans use information and knowledge properly to make suitable decisions for their organization and innovations (Ahmad, 2019 ). Humans are producing large volumes of data, and to make it efficient, firms are adopting and using AI and kicking humans out of using the data. Humans think they are getting benefits and saving time by using AI in their decisions. But it is overcoming the human biological processors through lowing cognition capabilities (Jarrahi, 2018 ).

It is a fact that AI technologies and applications have many benefits. Still, AI technologies have severe negative consequences, and the limitation of their role in human decision-making is one of them. Slowly and gradually, AI limits and replaces the human role in decision-making. Human mental capabilities like intuitive analysis, critical thinking, and creative problem-solving are getting out of decision-making (Ghosh et al., 2019 ). Consequently, this will lead to their loss as there is a saying, use it or lose it. The speed of adaptation of AI technology is evident from the usage of AI in the strategic decision-making processes, which has increased from 10 to 80% in five years (Sebastian and Sebastian, 2021 ).

Walmart and Amazon have integrated AI into their recruitment process and make decisions about their product. And it’s getting more into the top management decisions (Libert, 2017 ). Organizations use AI to analyze data and make complex decisions effectively to obtain a competitive advantage. Although AI is helping the decision-making process in various sectors, humans still have the last say in making any decision. It highlights the importance of humans’ role in the process and the need to ensure that AI technology and humans work side by side (Meissner and Keding, 2021 ). The hybrid model of the human–machine collaboration approach is believed to merge in the future (Subramaniam, 2022 ).

The role of AI in decision-making in educational institutions is spreading daily. Universities are using AI in both academic and administrative activities. From students searching for program admission requirements to the issuance of degrees, they are now assisted by AI personalization, tutoring, quick responses, 24/7 access to learning, answering questions, and task automation are the leading roles AI plays in the education sector (Karandish, 2021 ).

In all the above roles, AI collects data, analyzes it, and then responds, i.e., makes decisions. It is necessary to ask some simple but essential questions: Does AI make ethical choices? The answer is AI was found to be racist, and its choice might not be ethical (Tran, 2021 ). The second question is, does AI impact human decision-making capabilities? While using an intelligent system, applicants may submit their records directly to the designer and get approval for admission tests without human scrutiny. One reason is that the authorities will trust the system; the second may be the laziness created by task automation among the leaders.

Similarly, in keeping the records of students and analyzing their data, again, the choice will be dependent on the decision made by the system, either due to trust or due to the laziness created by task automation among the authorities. Almost in every task, the teachers and other workers lose the power of cognition while making academic or administrative decisions. And their dependency increases daily on the AI systems installed in the institution. To summarize the review, in any educational organization, AI makes operations automatic and minimizes staff participation in performing various tasks and making decisions. The teachers and other administrative staff are helpless in front of AI as the machines perform many of their functions. They are losing the skills of traditional tasks to be completed in an educational setting and consequently losing the reasoning capabilities of decision-making.

H 3 : There is a significant impact of artificial intelligence with the loss of human decision making

Conceptual framework

figure 1

The impact of artificial intelligence on human loss in decision making, laziness, and safety in education.

Methodology

Research design.

The research philosophy focuses on the mechanism of beliefs and assumptions regarding knowledge development. It is precisely what the researcher works on while conducting research and mounting expertise in a particular area. In this research, the positivist philosophy of analysis is used. Positivism focuses on an observable social reality that produces the laws, just like generalizations. This philosophy uses the existing theory for hypotheses development in this study.

Furthermore, this philosophy is used because this study is about measurable and quantifiable data. The quantitative method is followed for data collection and analysis in this research. The quantitative practice focuses on quantifiable numbers and provides a systematic approach to assessing incidences and their associations. Moreover, while carrying out this study, the author evaluated the validity and reliability tools to ensure rigor in data. The primary approach is used because the data collected in this research is first-hand, which means it is collected directly from the respondents.

Sample and sampling techniques

The purposive sampling technique was used in this study for the primary data collection. This technique is used because it targets a small number of participants to participate in the survey, and their feedback shows the entire population (Davies and Hughes, 2014 ). Purposive sampling is a recognized non-probabilistic sampling technique because the author chose the participants based on the study’s purpose. The respondents of this study were students at different universities in Pakistan and China. Following the ethical guidelines, consent was taken from the participants. After that, they were asked to give their responses through a questionnaire. The number of participants who took part in the study was 285. This data collection was around two months, from 4 July 2022 to 31 August 2022.

The survey instrument is divided into two parts. The initial portion of the questionnaire comprised demographic questions that included gender, age, country, and educational level. The second portion of the instrument had the Likert scale questions of the latent variables. This study model is composed of four latent variables. All four latent variables are measured through their developed Likert scale questions. All five measures of the latent variables are adopted from the different past studies that have developed and validated these scales. The measures of artificial intelligence are composed of seven items adopted from the study of Suh and Ahn ( 2022 ). The loss measures in decision-making consist of five items adopted from the study of Niese ( 2019 ). The measures of safety and security issues are composed of five items adopted from the study of Youn ( 2009 ). The measure of human laziness comprises four items adopted from the study of Dautov ( 2020 ). All of them are measured on the Likert scale of five, one for the least level of agreement and five for the highest level of agreement. Table 1 shows the details of the items of each construct.

Common method bias

CMB is a major problem faced by the researcher working on the primary survey data research. There are many causes for this dilemma. The primary reason is the response tendency, in which the respondents of the research rate equally to all questions (Jordan and Troth, 2020 ). A model’s VIF values are not limited to multi-collinearity diagnostics but also indicate the common method bias (Kock, 2015 ). If the VIF values of the individual items present in the model are equal to or <3.3, then it is considered that the model is free from the common method bias. Table 2 shows that all the VIF values are <3.3, which indicates that the data collected by the primary survey is almost free from the issues of common bias.

Reliability and validity of the data

Reliability and validity confirm the health of the instrument and survey data for further analysis. Two tools are used in structural equation modeling for reliability: item reliability and construct reliability. The outer loading of each item gauges the item’s reliability. Its threshold value is 0.706, but in some cases, even 0.5 is also acceptable if the basic assumption of the convergent validity is not violated (Hair and Alamer, 2022 ). Cronbach’s Alpha and composite reliability are the most used tools to measure construct reliability. The threshold value is 0.7 (Hair Jr et al., 2021 ). Table 3 shows that all the items of each construct have outer loading values greater than 0.7. Only one item of artificial intelligence and one item of decision making is below 0.7 but within the minimum limit of 0.4, and both AVE values are also good. While each construct Cronbach’s alpha and composite reliability values are >0.7, both measures of reliability, item reliability, and construct reliability are established. For the validity of the data, there are also two measures used one is convergent validity, and the other is discriminant validity. For convergent validity, AVE values are used. The threshold value for the AVE is 0.5 (Hair and Alamer, 2022 ). From the table of reliability and validity, all the constructs have AVE values >0.5, indicating that all the constructs are convergently valid.

In Smart-PLS, three tools are used to measure discriminant validity: the Farnell Larker criteria, HTMT ratios, and the cross-loadings of the items. The threshold value for the Farnell Licker criteria is that the diagonal values of the table must be greater than the values of its corresponding rows and columns. Table 4 shows that all the diagonal values of the square root of the AVE are greater than their corresponding values of both columns and rows. The threshold value for the HTMT values is 0.85 or less (Joe F. Hair Jr et al., 2020 ). Table 5 shows that all the values are less than 0.85. Table 6 shows that they must have self-loading with their construct values greater than the cross-loading with other constructs. Table 6 shows that all the self-loadings are greater than the cross-loadings. All three above measures of discriminant validity show that the data is discriminately valid.

Results and discussion

Demographic profile of the respondents.

Table 7 shows the demographic characteristics of the respondents. Among 285 respondents, 164 (75.5%) are male, while 121 (42.5%) are female. The data was collected from different universities in China and Pakistan. The table shows that 142 (50.2%) are Chinese students, and 141 (49.8%) are Pakistani students. The age group section shows that the students are divided into three age groups, <20 years, 20–25 years, and 26 years and above. Most students belong to the age group 20–25 years, which is 140 (49.1%), while 26 (9.1%) are <20 years old and 119 (41.8%) are 26 years and above. The fourth and last section of the table shows the program of the student’s studies. According to this, 149 (52.3%) students are undergraduates, 119 (41.8%) are graduates, and 17 (6%) are post-graduates.

Structural model

The structural model explains the relationships among study variables. The proposed structural model is exhibited in Fig. 2 .

figure 2

Results model for the Impact of artificial intelligence on human loss in decision-making, laziness, and safety in education.

Regression analysis

Table 8 shows the total direct relationships in the model. The first direct relationship is between artificial intelligence to loss in human decision-making, with a beta value of 0.277. The beta value shows that one unit increase in artificial intelligence will lose human decision-making by 0.277 units among university students in Pakistan and China. This relationship having the t value of 5.040, greater than the threshold value of 1.96, and a p -value of 0.000, <0.05, shows that the relationship is statistically significant. The second relationship is between artificial intelligence the human laziness. The beta value for this relationship is 0.689, which shows that one unit increase in artificial intelligence will make the students of Pakistan and China universities lazy by 0.689 units. The t -value for the relationship is 23.257, which is greater than the threshold value of 1.96, and a p -value of 0.000, which is smaller than the threshold value of 0.05, which shows that this relationship is also statistically significant. The third and last relationship is from artificial intelligence to security and privacy issues of Pakistani and Chinese university students. The beta value for this relationship is 0.686, which shows that a one-unit increase in artificial intelligence will increase security and privacy issues by 0.686. The t -value for the relationship is 17.105, which is greater than the threshold value of 1.96, and the p -value is 0.000, which is smaller than a threshold value of 0.05, indicating that this relationship is also statistically significant.

Hypothesis testing

Table 8 also indicates that the results support all three hypotheses.

Model fitness

Once the reliability and validity of the measurement model are confirmed, the structural model fitness must be assessed in the next step. For the model fitness, several measures are available in the SmartPLS, like SRMR, Chi-square, NFI, etc., but most of the researcher recommends the SRMR for the model fitness in the PLS-SEM. When applying PLS-SEM, a value <0.08 is generally considered a good fit (Hu and Bentler, 1998 ). However, the table of model fitness shows that the SRMR value is 0.06, which is less than the threshold value of 0.08, which indicates that the model is fit.

Predictive relevance of the model

Table 9 shows the model’s prediction power, as we know that the model has total dependent variables. Then there are three predictive values for the model for each variable. The threshold value for predicting the model power is greater than zero. However, Q 2 values of 0.02, 0.15, and 0.35, respectively, indicate that an independent variable of the model has a low, moderate, or high predictive relevance for a certain endogenous construct (Hair et al., 2013 ). Human laziness has the highest predictive relevance, with a Q 2 value of 0.338, which shows a moderate effect. Safety and security issues have the second largest predictive relevance with the Q 2 value of 0.314, which also show a moderate effect. The last and smallest predictive relevance in decision-making with a Q 2 value of 0.033 which shows a low effect. A greater Q 2 value shows that the variable or model has the highest prediction power.

Importance performance matrix analysis (IPMA)

Table 10 shows the importance and performance of each independent variable for the dependent variables. We see that artificial intelligence has the same performance of 68.78% for all three variables: human laziness, decision-making, safety, and security. While the importance of artificial intelligence, human laziness is 68.9%, loss in decision-making is 25.1%, and safety and security are 74.6%. This table shows that safety and privacy have the highest importance, and their performance is recommended to be increased to meet the important requirements. Figures 3 – 5 also show all three variables’ importance compared to performance with artificial intelligence.

figure 3

Importance-performance map—human loss in decision making and artificial intelligence.

figure 4

Importance-performance map—human laziness and artificial intelligence.

figure 5

Importance-performance map—safety and privacy and artificial intelligence.

Multi-group analysis (MGA)

Multigroup analysis is a technique in structural equation modeling that compares the effects of two classes of categorical variables on the model’s relationships. The first category is gender, composed of male and female subgroups or types. Table 10 shows the gender comparison for all three relationships. The data record shows that there were 164 males and 121 females. The p -values of all three relationships are >0.05, which shows that gender is not moderate in any of the relationships. Table 10 shows the country-wise comparison for all three relationships in the model. The p -values of all three relationships are >0.05, indicating no moderating effect of the country on all three relationships. The data records show 143 Pakistanis and 142 Chinese based on the country’s origin.

AI is becoming an increasingly important element of our lives, with its impact felt in various aspects of our daily life. Like any other technological advancement, there are both benefits and challenges. This study examined the association of AI with human loss in decision-making, laziness and safety and privacy concerns. The results given Tables 11 and 12 show that AI has a significant positive relationship with all these variables. The findings of this study also support that the use of AI technologies is creating problems for users related to security and privacy. Previous research has also shown similar results (Bartoletti, 2019 ; Saura et al., 2022 ; Bartneck et al., 2021 ). Using AI technology in an educational organization also leads to security and privacy issues for students, teachers, and institutions. In today’s information age, security and privacy are critical concerns of AI technology use in educational organizations (Kamenskih, 2022 ). Skills specific to using AI technology are required for its effective use. Insufficient knowledge about the use will lead to security and privacy issues (Vazhayil and Shetty, 2019 ). Mostly, educational firms do not have AI technology experts in managing it, which again increases its vulnerability in the context of security and privacy issues. Even if its users have sound skills and the firms have experienced AI managers, no one can deny that any security or privacy control could be broken by mistake and could lead to serious security and privacy problems. Moreover, the fact that people with different levels of skills and competence interact in educational organizations also leads to the hacking or leaking of personal and institutional data (Kamenskih, 2022 ). AI is based on algorithms and uses large data sets to automate instruction (Araujo et al., 2020 ). Any mistake in the algorithms will create serious problems, and unlike humans, it will repeat the same mistake in making its own decisions. It also increases the threat to institutional and student data security and privacy. The same challenge is coming from the student end. They can be easily victimized as they are not excellently trained to use AI (Asaro, 2019 ). With the increase in the number of users, competence division and distance, safety and privacy concerns increase (Lv and Singh, 2020 ). The consequences depend upon the nature of the attack and the data been leaked or used by the attackers (Vassileva, 2008 ).

The findings show that AI-based products and services are increasing the human laziness factor among those relying more on AI. However, there were not too many studies conducted on this factor by the researcher in the past, but the numerous researchers available in the literature also endorse the findings of this study (Farrow, 2022 ; Bartoletti, 2019 ). AI in education leads to the creation of laziness in humans. AI performs repetitive tasks in an automated manner and does not let humans memorize, use analytical mind skills, or use cognition (Nikita, 2023 ). It leads to an addiction behavior not to use human capabilities, thus making humans lazy. Teachers and students who use AI technology will slowly and gradually lose interest in doing tasks themselves. This is another important concern of AI in the education sector (Crispin Andrews). The teachers and students are getting lazy and losing their decision-making abilities as much of the work is assisted or replaced by AI technology (BARON, 2023 ). Posner and Fei-Fei ( 2020 ) suggested it is time to change AI for education.

The findings also show that the access use of AI will gradually lead to the loss of human decision-making power. The results also endorsed the statement that AI is one of the major causes of the human loss of decision-making power. Several researchers from the past have also found that AI is a major cause responsible for the gradual loss of people’s decision-making (Pomerol, 1997 ; Duan et al., 2019 ; Cukurova et al., 2019 ). AI performs repetitive tasks in an automated manner and does not let humans memorize, use analytical mind skills, or use cognition, leading to the loss of decision-making capabilities (Nikita, 2023 ). An online environment for education can be a good option (VanLangen, 2021 ), but the classroom’s physical environment is the prioritized education mode (Dib and Adamo, 2014 ). In a real environment, there is a significant level of interaction between the teacher and students, which develop the character and civic bases of the students, e.g., students can learn from other students, ask teachers questions, and even feel the education environment. Along with the curriculum, they can learn and adopt many positive understandings (Quinlan et al., 2014 ). They can learn to use their cognitive power to choose options, etc. But unfortunately, the use of AI technology minimizes the real-time physical interaction (Mantello et al., 2021 ) and the education environment between students and teachers, which has a considerable impact on students’ schooling, character, civic responsibility, and their power to make decisions, i.e., use their cognition. AI technology reduces the cognitive power of humans who make their own decisions (Hassani and Unger, 2020 ).

AI technology has undoubtedly transformed or at least affected many fields (IEEE, 2019 ; Al-Ansi and Al-Ansi, 2023 ). Its applications have been developed for the benefit of humankind (Justin and Mizuko, 2017 ). As technology assists employees in many ways, they must be aware of the pros and cons of the technology and must know its applications in a particular field (Nadir et al., 2012 ). Technology and humans are closely connected; the success of one is strongly dependent on the other; therefore, there is a need to ensure the acceptance of technology for human welfare (Ho et al., 2022 ). Many researchers have discussed the user’s perception of a technology (Vazhayil and Shetty, 2019 ), and many have emphasized its legislative and regulatory issues (Khan et al., 2014 ). Therefore, careful selection is necessary to adopt or implement any technology (Ahmad and Shahid, 2015 ). Once imagined in films, AI now runs a significant portion of the technology, i.e., health, transport, space, and business. As AI enters the education sector, it has been affected to a greater extent (Hübner, 2021 ). AI further strengthened its role in education, especially during the recent COVID-19 pandemic, and invaded the traditional way of teaching by providing many opportunities to educational institutions, teachers, and students to continue their educational processes (Štrbo, 2020 ; Al-Ansi, 2022 ; Akram et al., 2021 ). AI applications/technology like chatbots, virtual reality, personalized learning systems, social robots, tutoring systems, etc., assist the educational environment in facing modern-day challenges and shape education and learning processes (Schiff, 2021 ). In addition, it is also helping with administrative tasks like admission, grading, curriculum setting, and record-keeping, to name a few (Andreotta and Kirkham, 2021 ). It can be said that AI is likely to affect, enter and shape the educational process on both the institutional and student sides to a greater extent (Xie et al., 2021 ). This phenomenon hosts some questions regarding the ethical concerns of AI technology, its implementation, and its impact on universities, teachers, and students.

The study has similar findings to the report published by the Harvard Kennedy School, where AI concerns like privacy, automation of tasks, and decision-making are discussed. It says that AI is not the solution to government problems but helps enhance efficiency. It is important to note that the report does not deny the role of AI but highlights the issues. Another study says that AI-based and human decisions must be combined for more effective decisions. i.e., the decisions made by AI must be evaluated and checked, and the best will be chosen by humans from the ones recommended by AI (Shrestha et al., 2019 ). The role of AI cannot be ignored in today’s technological world. It assists humans in performing complex tasks, providing solutions to many complex problems, assisting in decision-making, etc. But on the other hand, it is replacing humans, automating tasks, etc., which creates challenges and demands for a solution (Duan et al., 2019 ). People are generally concerned about risks and have conflicting opinions about the fairness and effectiveness of AI decision-making, with broad perspectives altered by individual traits (Araujo et al. 2020 ).

There may be many reasons for these controversial findings, but the cultural factor was considered one of the main factors (Elliott, 2019 ). According to researchers, people with high cultural values have not adopted the AI problem, so this cultural constraint remains a barrier for the AI to influence their behaviors (Di Vaio et al., 2020 ; Mantelero, 2018 ). The other thing is that privacy is a term that has a different meaning from culture to culture (Ho et al., 2022 ). In some cultures, people consider minimal interference in personal life a big privacy issue, while in some cultures, people even ignore these types of things (Mantello et al., 2021 ). The results are similar to Zhang et al. ( 2022 ), Aiken and Epstein ( 2000 ), and Bhbosale et al. ( 2020 ), which focus on the ethical issues of AI in education. These studies show that AI use in education is the reason for laziness among students and teachers. In short, the researchers are divided on the AI concerns in education, just like in other sectors. But they agree on the positive role AI plays in education. AI in education leads to laziness, loss of decision-making capabilities, and security or privacy issues. But all these issues can be minimized if AI is properly implemented, managed, and used in education.

Implications

The research has important implications for technology developers, the organization that adopts the technology, and the policymakers. The study highlights the importance of addressing ethical concerns during AI technology’s development and implementation stage. It also provides guidelines for the government and policymakers regarding the issues arising with AI technology and its implementation in any organization, especially in education. AI can revolutionize the education sector, but it has some potential drawbacks. Implications suggest that we must be aware of the possible impact of AI on laziness, decision-making, privacy, and security and that we should design AI systems that have a very minimal impact.

Managerial Implications

Those associated with the development and use of AI technology in education need to find out the advantages and challenges of AI in this sector and balance these advantages with the challenges of laziness, decision-making, and privacy or security while protecting human creativity and intuition. AI systems should be designed to be transparent and ethical in all manners. Educational organizations should use AI technology to assist their teachers in their routine activities, not to replace them.

Theoretical Implications

A loss of human decision-making capacity is one of the implications of AI in education. Since AI systems are capable of processing enormous amounts of data and producing precise predictions, there is a risk that humans would become overly dependent on AI in making decisions. This may reduce critical thinking and innovation for both students and teachers, which could lower the standard of education. Educators should be aware of how AI influences decision-making processes and must balance the benefits of AI with human intuition and creativity. AI may potentially affect school security. AI systems can track student behavior, identify potential dangers, and identify situations where children might require more help. There are worries that AI could be applied to unjustly target particular student groups or violate students’ privacy. Therefore, educators must be aware of the potential ethical ramifications of AI and design AI systems that prioritize security and privacy for users and educational organizations. AI makes people lazier is another potential impact on education. Teachers and learners may become more dependent on AI systems and lose interest in performing activities or learning new skills or methodologies. This might lead to a decline in educational quality and a lack of personal development among people. Therefore, teachers must be aware of the possible detrimental impacts of AI on learners’ motivation and should create educational environments that motivate them to participate actively in getting an education.

AI can significantly affect the education sector. Though it benefits education and assists in many academic and administrative tasks, its concerns about the loss of decision-making, laziness, and security may not be ignored. It supports decision-making, helps teachers and students perform various tasks, and automates many processes. Slowly and gradually, AI adoption and dependency in the education sector are increasing, which invites these challenges. The results show that using AI in education increases the loss of human decision-making capabilities, makes users lazy by performing and automating the work, and increases security and privacy issues.

Recommendations

The designer’s foremost priority should be ensuring that AI will not cause any ethical concerns in education. Realistically, it is impossible, but at least severe ethical problems (both individual and societal) can be minimized during this phase.

AI technology and applications in education need to be backed by solid and secure algorithms that ensure the technology’s security, privacy, and users.

Bias behavior of AI must be minimized, and issues of loss of human decision-making and laziness must be addressed.

Dependency on AI technology in decision-making must be reduced to a certain level to protect human cognition.

Teachers and students should be given training before using AI technology.

Future work

Research can be conducted to study the other concerns of AI in education which were not studied.

Description and enumeration of the documents under analysis.

Procedure for the analysis of documents. Discourse analysis and categorization.

Similar studies can be conducted in other geographic areas and countries.

Limitations

This study is limited to three basic ethical concerns of AI: loss of decision-making, human laziness, and privacy and security. Several other ethical concerns need to be studied. Other research methodologies can be adopted to make it more general.

Data availability

The data set generated during and/or analyzed during the current study is submitted as supplementary file and can also be obtained from the corresponding author upon reasonable request.

Change history

29 june 2023.

A Correction to this paper has been published: https://doi.org/10.1057/s41599-023-01842-4

Ahmad (2019) Knowledge management as a source of innovation in public sector. Indian J Nat Sci 9(52):16908–16922

Google Scholar  

Ade-Ibijola A, Young K, Sivparsad N, Seforo M, Ally S, Olowolafe A, Frahm-Arp M(2022) Teaching Students About Plagiarism Using a Serious Game (Plagi-Warfare): Design and Evaluation Study. JMIR Serious Games 10(1):e33459. https://doi.org/10.2196/33459

Article   PubMed   PubMed Central   Google Scholar  

Ahmad SF, Shahid MK (2015) Factors influencing the process of decision making in telecommunication sector. Eur J Bus Manag 7(1):106–114

Ahmad SF, Shahid MK (2015) Role of knowledge management in achieving customer relationship and its strategic outcomes. Inf Knowl Manag 5(1):79–87

Ahmad SF, Ibrahim M, Nadeem AH (2021) Impact of ethics, stress and trust on change management in public sector organizations. Gomal Univ J Res 37(1):43–54

Article   Google Scholar  

Ahmed S, Nashat N (2020) Model for utilizing distance learning post COVID-19 using (PACT)™ a cross sectional qualitative study. Research Square, pp. 1–25. https://doi.org/10.21203/rs.3.rs-31027/v1

Aiken R, Epstein R (2000) Ethical guidelines for AI in education: starting a conversation. Int J Artif Intell Educ 11:163–176

Akram H, Yingxiu Y, Al-Adwan AS, Alkhalifah A (2021) Technology Integration in Higher Education During COVID-19: An Assessment of Online Teaching Competencies Through Technological Pedagogical Content Knowledge Model. Frontiers in Psychol, 12. https://doi.org/10.3389/fpsyg.2021.736522

Al-Ansi A (2022) Investigating Characteristics of Learning Environments During the COVID-19 Pandemic: A Systematic Review. Canadian Journal of Learning and Technology, 48(1). https://doi.org/10.21432/cjlt28051

Al-Ansi AM, Al-Ansi A-A (2023) An Overview of Artificial Intelligence (AI) in 6G: Types, Advantages, Challenges and Recent Applications. Buletin Ilmiah Sarjana Teknik Elektro, 5(1)

Andreotta AJ, Kirkham N (2021) AI, big data, and the future of consent. AI Soc. https://doi.org/10.1007/s00146-021-01262-5

Araujo T, Helberger N, Kruikemeier S, Vreese C (2020) In AI we trust? Perceptions about automated decision-making by artificial intelligence. AI Soc 35(6). https://doi.org/10.1007/s00146-019-00931-w

Asaro PM (2019) AI ethics in predictive policing: from models of threat to an ethics of care. IEEE Technol Soc Mag 38(2):40–53. https://doi.org/10.1109/MTS.2019.2915154

Article   MathSciNet   Google Scholar  

Ayling J, Chapman A (2022) Putting AI ethics to work: are the tools fit for purpose? AI Eth 2:405–429. https://doi.org/10.1007/s43681-021-00084-x

BARON NS (2023) Even kids are worried ChatGPT will make them lazy plagiarists, says a linguist who studies tech’s effect on reading, writing and thinking. Fortune. https://fortune.com/2023/01/19/what-is-chatgpt-ai-effect-cheating-plagiarism-laziness-education-kids-students/

Bartneck C, Lütge C, Wagner A, Welsh S (2021) Privacy issues of AI. In: An introduction to ethics in robotics and AI. Springer International Publishing, pp. 61–70

Bartoletti I (2019) AI in healthcare: ethical and privacy challenges. In: Artificial Intelligence in Medicine: 17th Conference on Artificial Intelligence in Medicine, AIME 2019. Springer International Publishing, Poznan, Poland, pp. 7–10

Bhbosale S, Pujari V, Multani Z (2020) Advantages and disadvantages of artificial intellegence. Aayushi Int Interdiscip Res J 77:227–230

Calif P (2021) Education industry at higher risk for IT security issues due to lack of remote and hybrid work policies. CISION

Cavus N, Mohammed YB, Yakubu MN (2021) Determinants of learning management systems during COVID-19 pandemic for sustainable education. Sustain Educ Approaches 13(9):5189. https://doi.org/10.3390/su13095189

Chan L, Morgan I, Simon H, Alshabanat F, Ober D, Gentry J, ... & Cao R (2019) Survey of AI in cybersecurity for information technology management. In: 2019 IEEE technology & engineering management conference (TEMSCON). IEEE, Atlanta, pp. 1–8

Clark J, Shoham Y, Perrault R, Erik Brynjolfsson, Manyika J, Niebles JC, Lyons T, Etchemendy J, Grosz B and Bauer Z (2018) The AI Index 2018 Annual Report, AI Index Steering Committee, Human-Centered AI Initiative, Stanford University, Stanford, CA

Cukurova M, Kent C, Luckin R (2019) Artificial intelligence and multimodal data in the service of human decision‐making: a case study in debate tutoring. Br J Educ Technol 50(6):3032–3046. https://doi.org/10.1111/bjet.12829

Danaher J (2018) Toward an ethics of AI assistants: an initial framework. Philos Technol 31(3):1–15. https://doi.org/10.1007/s13347-018-0317-3

Dastin J (2018) Amazon scraps secret AI recruiting tool that showed bias against women. (Reuters) Retrieved from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

Dautov D (2020) Procrastination and laziness rates among students with different academic performance as an organizational problem. In: E3S web of conferences, pp. 1–10

Davies MB, Hughes N (eds) (2014) Doing a Successful Research Project: Using Qualitative or Quantitative Methods, 2nd edn. Palgrave MacMillan, Basingstoke, Hampshire, p 288

Di Vaio A, Palladino R, Hassan R, Escobar O (2020) Artificial intelligence and business models in the sustainable development goals perspective: a systematic literature review. J Bus Res 121:283–314. https://doi.org/10.1016/j.jbusres.2020.08.019

Dib H, Adamo N (2014) An interactive virtual environment to improve undergraduate students’ competence in surveying mathematics. In: 2014 International conference on computing in civil and building engineering

Duan Y, Edwards JS, Dwivedi YK (2019) Artificial intelligence for decision making in the era of Big Data—evolution, challenges and research agenda. Int J Inf Manag 48:63–71. https://doi.org/10.1016/j.ijinfomgt.2019.01.021

ELever K, Kifayat K (2020) Identifying and mitigating security risks for secure and robust NGI networks. Sustain Cities Soc 59. https://doi.org/10.1016/j.scs.2020.102098

Elliott A (2019) The culture of AI: everyday life and the digital revolution. Routledge

Eric C (2019) What AI-driven decision making looks like. Harv Bus Rev. https://hbr.org/2019/07/what-ai-driven-decision-making-looks-like

Farrow E (2022) Determining the human to AI workforce ratio—exploring future organisational scenarios and the implications for anticipatory workforce planning. Technol Soc 68(101879):101879. https://doi.org/10.1016/j.techsoc.2022.101879

Fjelland R (2020) Why general artificial intelligence will not be realized. Humanit Soc Sci Commun 7(10):1–9. https://doi.org/10.1057/s41599-020-0494-4

Ghosh B, Daugherty PR, Wilson HJ (2019) Taking a systems approach to adopting AI. Harv Bus Rev. https://hbr.org/2019/05/taking-a-systems-approach-to-adopting-ai

Gocen A, Aydemir F (2020) Artificial intelligence in education and schools. Res Educ Media 12(1):13–21. https://doi.org/10.2478/rem-2020-0003

Hair J, Alamer A (2022) Partial Least Squares Structural Equation Modeling (PLS-SEM) in second language and education research: guidelines using an applied example. Res Methods Appl Linguist 1(3):100027. https://doi.org/10.1016/j.rmal.2022.100027

Hair Jr JF, Ringle CM, Sarstedt M (2013) Partial least squares structural equation modeling: rigorous applications, better results and higher acceptance. Long Range Plan 46(1–2):1–12. https://doi.org/10.1016/j.lrp.2013.01.001

Hair Jr JF, Howard MC, Nitzl C (2020) Assessing measurement model quality in PLS-SEM using confirmatory composite analysis. J Bus Res 109:101–110. https://doi.org/10.1016/j.jbusres.2019.11.069

Hair JF Jr, Hult GTM, Ringle CM, Sarstedt M, Danks NP, Ray S, ... & Ray S (2021) An introduction to structural equation modeling. Partial Least Squares Structural Equation Modeling (PLS-SEM) Using R: A Workbook. In: Classroom companion: business. Springer International Publishing, pp. 1–29

Hassani H, Unger S (2020) Artificial intelligence (AI) or intelligence augmentation (IA): what is the future? AI 1(2). https://doi.org/10.3390/ai1020008

Hax C (2018) Carolyn Hax live chat transcripts from 2018. Washington, Columbia, USA. Retrieved from https://www.washingtonpost.com/advice/2018/12/31/carolyn-hax-chat-transcripts-2018/

Ho M-T, Mantello P, Ghotbi N, Nguyen M-H, Nguyen H-KT, Vuong Q-H (2022) Rethinking technological acceptance in the age of emotional AI: surveying Gen Z (Zoomer) attitudes toward non-conscious data collection. Technol Soc 70(102011):102011. https://doi.org/10.1016/j.techsoc.2022.102011

Holmes W, Bialik M, Fadel C (2019) Artificial intelligence in education. Promise and implications for teaching and learning. Center for Curriculum Redesign

Hu L-T, Bentler PM (1998) Fit indices in covariance structure modeling: Sensitivity to underparameterized model misspecification. Psychol Methods 3(4):424–453. https://doi.org/10.1037/1082-989x.3.4.424

Hübner D (2021) Two kinds of discrimination in AI-based penal decision-making. ACM SIGKDD Explor Newsl 23:4–13. https://doi.org/10.1145/3468507.3468510

Huls A (2021) Artificial intelligence and machine learning play a role in endpoint security. N Y Times. Retrieved from https://www.nastel.com/artificial-intelligence-and-machine-learning-play-a-role-in-endpoint-security/

IEEE (2019) A vision for prioritizing human well-being with autonomous and intelligent systems. https://standards.ieee.org/wp-content/uploads/import/documents/other/ead_v2.pdf

Jarrahi MH (2018) Artificial intelligence and the future of work: human–AI symbiosis in organizational decision making. Bus Horiz 61(4):1–15. https://doi.org/10.1016/j.bushor.2018.03.007

Jones RC (2014) Stephen Hawking warns artificial intelligence could end mankind. BBC News

Jordan P, Troth A (2020) Common method bias in applied settings: the dilemma of researching in organizations. Aust J Manag 45(1):2–14. 10.1177%2F0312896219871976

Justin R, Mizuko I (2017) From good intentions to real outcomes: equity by design in. Digital Media and Learning Research Hub, Irvine, https://clalliance.org/wp-content/uploads/2017/11/GIROreport_1031.pdf

Kamenskih A(2022) The analysis of security and privacy risks in smart education environments. J Smart Cities Soc 1(1):17–29. https://doi.org/10.3233/SCS-210114

Karandish D (2021) 7 Benefits of AI in education. The Journal. https://thejournal.com/Articles/2021/06/23/7-Benefits-of-AI-in-Education.aspx

Khan GA, Shahid MK, Ahmad SF (2014) Convergence of broadcasting and telecommunication technology regulatory framework in Pakistan. Int J Manag Commer Innov 2(1):35–43

Kirn W (2007) Here, there and everywhere. NY Times. https://www.nytimes.com/2007/02/11/magazine/11wwlnlede.t.html

Köbis L, Mehner C (2021) Ethical questions raised by AI-supported mentoring in higher education. Front Artif Intell 4:1–9. https://doi.org/10.3389/frai.2021.624050

Kock N (2015) Common method bias in PLS-SEM: a full collinearity assessment approach. Int J e-Collab 11(4):1–10. https://doi.org/10.4018/ijec.2015100101

Krakauer D (2016) Will AI harm us? Better to ask how we’ll reckon with our hybrid nature. Nautilus. http://nautil.us/blog/will-ai-harm-us-better-to-ask-how-well-reckon-withour-hybrid-nature . Accessed 29 Nov 2016

Landwehr C (2015) We need a building code for building code. Commun ACM 58(2):24–26. https://doi.org/10.1145/2700341

Leeming J (2021) How AI is helping the natural sciences. Nature 598. https://www.nature.com/articles/d41586-021-02762-6

Levin K (2018) Artificial intelligence & human rights: opportunities & risks. Berkman Klein Center for Internet & Society Research Publication. https://dash.harvard.edu/handle/1/38021439

Libert B, Beck M, Bonchek M (2017) AI in the boardroom: the next realm of corporate governance. MIT Sloan Manag Rev. 5. https://sloanreview.mit.edu/article/ai-in-the-boardroom-the-next-realm-of-corporate-governance/

Lv Z, Singh AK (2020) Trustworthiness in industrial IoT systems based on artificial intelligence. IEEE Trans Ind Inform 1–1. https://doi.org/10.1109/TII.2020.2994747

Maini V, Sabri S (2017) Machine learning for humans. https://medium.com/machine-learning-for-humans

Mantelero A (2018) AI and Big Data: a blueprint for a human rights, social and ethical impact assessment. Comput Law Secur Rep 34(4):754–772. https://doi.org/10.1016/j.clsr.2018.05.017

Mantello P, Ho M-T, Nguyen M-H, Vuong Q-H (2021) Bosses without a heart: socio-demographic and cross-cultural determinants of attitude toward Emotional AI in the workplace. AI Soc 1–23. https://doi.org/10.1007/s00146-021-01290-1

McStay A (2020) Emotional AI and EdTech: serving the public good? Learn Media Technol 45(3):270–283. https://doi.org/10.1080/17439884.2020.1686016

Meissner P, Keding C (2021) The human factor in AI-based decision-making. MIT Sloan Rev 63(1):1–5

Mengidis N, Tsikrika T, Vrochidis S, Kompatsiaris I (2019) Blockchain and AI for the next generation energy grids: cybersecurity challenges and opportunities. Inf Secur 43(1):21–33. https://doi.org/10.11610/isij.4302

Mhlanga D (2021) Artificial intelligence in the industry 4.0, and its impact on poverty, innovation, infrastructure development, and the sustainable development goals: lessons from emerging economies? Sustainability 13(11):57–88. https://doi.org/10.3390/su13115788

Nadir K, Ahmed SF, Ibrahim M, Shahid MK (2012) Impact of on-job training on performance of telecommunication industry. J Soc Dev Sci 3(2):47–58

Nakitare J, Otike F (2022). Plagiarism conundrum in Kenyan universities: an impediment to quality research. Digit Libr Perspect. https://doi.org/10.1108/dlp-08-2022-0058

Nawaz N, Gomes AM, Saldeen AM (2020) Artificial intelligence (AI) applications for library services and resources in COVID-19 pandemic. J Crit Rev 7(18):1951–1955

Nemorin S, Vlachidis A, Ayerakwa HM, Andriotis P (2022) AI hyped? A horizon scan of discourse on artificial intelligence in education (AIED) and development. Learn Media Technol 1–14. https://doi.org/10.1080/17439884.2022.2095568

Niese B (2019) Making good decisions: an attribution model of decision quality in decision tasks. Kennesaw State University. https://digitalcommons.kennesaw.edu/cgi/viewcontent.cgi?article=1013&context=phdba_etd

Nikita (2023) Advantages and Disadvantages of Artificial Intelligence. Simplilearn. https://www.simplilearn.com/advantages-and-disadvantages-of-artificial-intelligence-article

Noema (2021) AI makes us less intelligent and more artificial. https://www.noemamag.com/Ai-Makes-Us-Less-Intelligent-And-More-Artificial/

Oh C, Lee T, Kim Y (2017) Us vs. them: understanding artificial intelligence technophobia over the Google DeepMind challenge match. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 2523–2534

Owoc ML, Sawicka A, Weichbroth P (2021) Artificial intelligence technologies in education: benefits, challenges and strategies of implementation. In: Artificial Intelligence for Knowledge Management: 7th IFIP WG 12.6 International Workshop, AI4KM 2019, Held at IJCAI 2019, Macao, China, August 11, 2019, Revised Selected Papers. Springer International Publishing, Cham, pp. 37–58

Petousi V, Sifaki E (2020) Contextualizing harm in the framework of research misconduct. Findings from discourse analysis of scientific publications. Int J Sustain Dev 23(3-4):149–174. https://doi.org/10.1504/IJSD.2020.10037655

Pomerol J-C (1997) Artificial intelligence and human decision making. Eur J Oper Res 99(1):3–25. https://doi.org/10.1016/s0377-2217(96)00378-5

Article   MATH   Google Scholar  

Posner T, Fei-Fei L (2020) AI will change the world, so it’s time to change A. Nature S118–S118. https://doi.org/10.1038/d41586-020-03412-z

Quinlan D, Swain N, Cameron C, Vella-Brodrick D (2014) How ‘other people matter’ in a classroom-based strengths intervention: exploring interpersonal strategies and classroom outcomes. J Posit Psychol 10(1):1–13. https://doi.org/10.1080/17439760.2014.920407

Rainie L, Anderson J, Vogels EA (2021) Experts doubt ethical AI design will be broadly adopted as the norm within the next decade. Pew Research Center, Haley Nolan

Rasheed I, Subhan I, Ibrahim M, Sayed FA (2015) Knowledge management as a strategy & competitive advantage: a strong influence to success, a survey of knowledge management case studies of different organizations. Inf Knowl Manag 5(8):60–71

Rosé CP, Martínez-Maldonado R, Hoppe HU, Luckin R, Mavrikis M, Porayska-Pomsta K, ... & Du Boulay B (Eds.) (2018) Artificial intelligence in education. 19th International conference, AIED 2018. Springer, London, UK

Ross J (2021) Does the rise of AI spell the end of education? https://www.timeshighereducation.com/features/does-rise-ai-spell-end-education

Samtani S, Kantarcioglu M, Chen H (2021) A multi-disciplinary perspective for conducting artificial intelligence-enabled privacy analytics: connecting data, algorithms, and systems. ACM Trans Manag Inf Syst 12:1–18. https://doi.org/10.1145/3447507

Sarwat (2018) Is AI making humans lazy? Here’s what UAE residents say. Khaleej Times. https://www.khaleejtimes.com/nation/dubai/Is-AI-making-humans-lazy-Here-what-UAE-residents-say

Saura JR, Ribeiro-Soriano D, Palacios-Marqués D (2022) Assessing behavioral data science privacy issues in government artificial intelligence deployment. Government Inf Q 39(4):101679. https://doi.org/10.1016/j.giq.2022.101679

Sayantani (2021) Is artificial intelligence making us lazy and impatient? San Jose, CA, USA. Retrieved from https://industrywired.com/Is-Artificial-Intelligence-Making-Us-LazyAnd-Impatient/

Sayed FA, Muhammad KS (2015) Impact of knowledge management on security of information. Inf Knowl Manag 5(1):52–59

Sayed FA, Mohd KR, Muhammad SM, Muhammad MA, Syed IH (2021) Artificial intelligence and its role in education. Sustainability 13:1–11. https://doi.org/10.3390/su132212902

Schiff D (2021) Out of the laboratory and into the classroom: the future of artificial intelligence in education. AI Soc. https://doi.org/10.1007/s00146-020-01033-8

Sebastian R, Sebastian K (2021) Artificial intelligence and management: the automation–augmentation paradox. Acad Manag Rev 46(1):192–210. https://doi.org/10.5465/amr.2018.0072

Shneiderman B (2021) Responsible AI: bridging from ethics to practice. Commun ACM 64(8):32–35. https://doi.org/10.1145/3445973

Shrestha YR, Ben-Menahem SM, von Krogh G(2019) Organizational Decision-Making Structures in the Age of Artificial Intelligence. California Manag Rev 61(4):66–83. https://doi.org/10.1177/0008125619862257

Siau K, Wang W (2020) Artificial intelligence (AI) ethics: ethics of AI and ethical AI. J Database Manag 31(2):74–87. https://doi.org/10.4018/JDM.2020040105

Stahl BC (2021b) Artificial intelligence for a better future. Springer, Leicester, UK

Book   Google Scholar  

Stahl A (2021a) How AI will impact the future of work and life. https://www.forbes.com/sites/ashleystahl/2021/03/10/how-ai-will-impact-the-future-of-work-and-life/?sh=3c5c1a5279a3

Štrbo M (2020) AI based smart teaching process during the Covid-19 pandemic. In: 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS). IEEE, Thoothukudi, pp. 402–406

Subramaniam M (2022) How smart products create connected customers. MIT Sloan Manag Rev. https://sloanreview.mit.edu/

Suh W, Ahn S (2022) Development and validation of a scale measuring student attitudes toward artificial intelligence. Sage Open 1–12. https://doi.org/10.1177/21582440221100463

Sutton S, Arnold V, Holt M (2018) How much automation is too much? Keeping the human relevant in knowledge work. J Emerg Technol Account 15(2). https://doi.org/10.2308/jeta-52311

Taddeo M, McCutcheon T, Floridi L (2019) Trusting artificial intelligence in cybersecurity is a double-edged sword. Nat Mach Intell 1(12). https://doi.org/10.1038/s42256-019-0109-1

Tahiru F (2021) AI in education: a systematic literature review. J Cases Inf Technol 23(1):1–20. https://doi.org/10.4018/JCIT.2021010101

Tarran B (2018) What can we learn from the Facebook–Cambridge Analytica scandal? Significance 15(3):4–5. https://doi.org/10.1111/j.1740-9713.2018.01139.x

Topcu U, Zuck L (2020) Assured autonomy: path toward living with autonomous systems we can trust. Computers and Society. Cornell University. https://doi.org/10.48550/arXiv.2010.14443

Torda A (2020) How COVID-19 has pushed us into a medical education revolution. Intern Med J 50:1150–1153. https://doi.org/10.1111/imj.14882

Article   CAS   PubMed   PubMed Central   Google Scholar  

Tran TH (2021) Scientists built an AI to give ethical advice, but it turned out super racist. Retrieved from Futurism: https://futurism.com/delphi-ai-ethics-racist

Vadapalli P (2021) Top 7 challenges in artificial intelligence in 2022 (Upgrade, Producer). Retrieved from Challenges in AI. https://www.upgrad.com/blog/top-challenges-in-artificial-intelligence/

VanLangen K (2021) Viability of virtual skills-based assessments focused on communication. Am J Pharm Educ 85(7):8378. https://doi.org/10.5688/ajpe8378

Vassileva J (2008) Social learning environments: new challenges for AI in education. In: International conference on intelligent tutoring systems. Springer, Berlin

Vazhayil A, Shetty R (2019) Focusing on teacher education to introduce AI in schools: perspectives and illustrative findings. In: 2019 IEEE 10th international conference on Technology for Education (T4E). IEEE, Goa, https://doi.org/10.1109/T4E.2019.00021

Venema L (2021) Defining a role for AI ethics in national security. Nat Mach Intell 3:370–371. https://doi.org/10.1038/s42256-021-00344-9

Vincent S, van R (2022) Trustworthy artificial intelligence (AI) in education. Organization for Economic Co-operation and Development, pp. 1–17

Weyerer J, Langer P (2019) Garbage in, garbage out: the vicious cycle of ai-based discrimination in the public sector. In: 20th Annual international conference on digital government research. ACM Digital Library, pp. 509–511

Whittaker M, Crawford K (2018) AI now report 2018. New York University, New York

Xie H, Hwang G-J, Wong T-L(2021) From conventional AI to modern AI in education: re-examining AI and analytic techniques for teaching and learning. Educ Technol Soc 24(3):85–88

Youn S (2009) Determinants of online privacy concern and its influence on privacy protection behaviors among young adolescents. J Consum Aff 43(3):389–418

Zhang H, Lee I, Ali S, DiPaola D, Cheng Y, Breazeal C (2022) Integrating ethics and career futures with technical learning to promote AI literacy for middle school students: An exploratory study. Int J Artif Intell Educ 1–35

Download references

Acknowledgements

The research is funded by the Universiti Kuala Lumpur, Kuala Lumpur 50250, Malaysia, under the agreement, UniKL/Cori/iccr/04/21.

Author information

Authors and affiliations.

Institute of Business Management, Karachi, Pakistan

Sayed Fayaz Ahmad

Sejong University, Seoul, Korea

Riphah International University, Islamabad, Pakistan

Muhammad Mansoor Alam

Universiti Kuala Lumpur, Kuala Lumpur, Malaysia

Mohd. Khairul Rehmat

University of Gwadar, Gwadar, Pakistan

Muhammad Irshad

Universidad de Playa Ancha, Valparaiso, Chile

Marcelo Arraño-Muñoz

Universidad Loyola Andalucía, Córdoba, Spain

Antonio Ariza-Montes

You can also search for this author in PubMed   Google Scholar

Contributions

The authors confirm their contribution to the paper as follows: Introduction: SFA and HH; Materials: MMA and MKH, MI; Methods: MI and SFA; Data collection: SFA, MMA, MKR, MI, MMA, and AAM; Data analysis and interpretation SFA, HH, MI, and AAM; Draft preparation: MMA, MKR, MI, MMA, and AAM; Writing and review: SFA, HH, MMA, MKR, MI, MMA, and AAM. All authors read, edited, and finalized the manuscript.

Corresponding authors

Correspondence to Sayed Fayaz Ahmad or Heesup Han .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Ethical approval

The evaluation survey questionnaire and methodology were examined, approved, and endorsed by the research ethics committee of the University of Gwadar on 1 March 2021 (see supplementary information). The study meets the requirements of the National Statement on Ethical Conduct in Human Research (2007). The procedures used in this study adhere to the tents of the declaration of Helsinki.

Informed consent

Informed consent was obtained from all participants before the data was collected. We informed each participant of their rights, the purpose of the study and to safeguard their personal information.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary data, ethics committee approval, rights and permissions.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

Ahmad, S.F., Han, H., Alam, M.M. et al. Impact of artificial intelligence on human loss in decision making, laziness and safety in education. Humanit Soc Sci Commun 10 , 311 (2023). https://doi.org/10.1057/s41599-023-01787-8

Download citation

Received : 03 October 2022

Accepted : 23 May 2023

Published : 09 June 2023

DOI : https://doi.org/10.1057/s41599-023-01787-8

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

human thinking and problem solving

Cart

  • SUGGESTED TOPICS
  • The Magazine
  • Newsletters
  • Managing Yourself
  • Managing Teams
  • Work-life Balance
  • The Big Idea
  • Data & Visuals
  • Reading Lists
  • Case Selections
  • HBR Learning
  • Topic Feeds
  • Account Settings
  • Email Preferences

Find the AI Approach That Fits the Problem You’re Trying to Solve

  • George Westerman,
  • Sam Ransbotham,
  • Chiara Farronato

human thinking and problem solving

Five questions to help leaders discover the right analytics tool for the job.

AI moves quickly, but organizations change much more slowly. What works in a lab may be wrong for your company right now. If you know the right questions to ask, you can make better decisions, regardless of how fast technology changes. You can work with your technical experts to use the right tool for the right job. Then each solution today becomes a foundation to build further innovations tomorrow. But without the right questions, you’ll be starting your journey in the wrong place.

Leaders everywhere are rightly asking about how Generative AI can benefit their businesses. However, as impressive as generative AI is, it’s only one of many advanced data science and analytics techniques. While the world is focusing on generative AI, a better approach is to understand how to use the range of available analytics tools to address your company’s needs. Which analytics tool fits the problem you’re trying to solve? And how do you avoid choosing the wrong one? You don’t need to know deep details about each analytics tool at your disposal, but you do need to know enough to envision what’s possible and to ask technical experts the right questions.

  • George Westerman is a Senior Lecturer in MIT Sloan School of Management and founder of the Global Opportunity Forum  in MIT’s Office of Open Learning.
  • SR Sam Ransbotham is a Professor of Business Analytics at the Boston College Carroll School of Management. He co-hosts the “Me, Myself, and AI” podcast.
  • Chiara Farronato is the Glenn and Mary Jane Creamer Associate Professor of Business Administration at Harvard Business School and co-principal investigator at the Platform Lab at Harvard’s Digital Design Institute (D^3). She is also a fellow at the National Bureau of Economic Research (NBER) and the Center for Economic Policy Research (CEPR).

Partner Center

IMAGES

  1. Problem Thinking Inspiration Idea and Success. Human Head Problem

    human thinking and problem solving

  2. Brain Idea and Problem Solving Stock Vector

    human thinking and problem solving

  3. Problem Solving and Decision Making

    human thinking and problem solving

  4. Problem solving concept thinking man and Vector Image

    human thinking and problem solving

  5. Introduction to Problem Solving Skills

    human thinking and problem solving

  6. Problem-Solving Process in 6 Steps

    human thinking and problem solving

VIDEO

  1. “The human organism needs problem solving”

  2. Shocking Facts about the Power of Human Thinking

  3. Critical Thinking & Problem Solving Stress Management- Erica O'Neal

  4. The Cognitive Kaleidoscope: Understanding the Diversity of Human Thinking

  5. True Word's About Human Thinking🙌🏻 #ishacartoonworld🎎❤️

  6. thinking

COMMENTS

  1. (PDF) Thinking and problem solving

    Problem Solving Thinking and problem solving January 2002 Authors: Peter Frensch Humboldt-Universität zu Berlin Joachim Funke Universität Heidelberg

  2. Cognitive Psychology: The Science of How We Think

    What Is Cognitive Psychology? The Science of How We Think By Kendra Cherry, MSEd Updated on December 05, 2022 Medically reviewed by Steven Gans, MD MaskotOwner/Getty Images Table of Contents View All Topics in Cognitive Psychology History Current Research Cognitive Approach in Practice Careers in Cognitive Psychology

  3. 7 Module 7: Thinking, Reasoning, and Problem-Solving

    Metacognition (7.1) Characteristics of critical thinking: skepticism; identify biases, distortions, omissions, and assumptions; reasoning and problem solving skills (7.1)

  4. Cognition in Psychology: Definition, Types, Effects, and Tips

    Cognition is a term referring to the mental processes involved in gaining knowledge and comprehension. Some of the many different cognitive processes include thinking, knowing, remembering, judging, and problem-solving. These are higher-level functions of the brain and encompass language, imagination, perception, and planning.

  5. The Problem-Solving Process

    Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the issue. The best strategy for solving a problem depends largely on the unique situation.

  6. Thinking, Language, and Problem Solving

    Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory.

  7. Cognitive Psychology: How Scientists Study the Mind

    The cognitive perspective in psychology focuses on how the interactions of thinking, emotion, creativity, and problem-solving abilities affect how and why you think the way you do. Cognitive ...

  8. 7.1 What Is Cognition?

    Cognitive psychology is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes.

  9. Thinking and Problem Solving

    Thinking and Problem-Solving presents a comprehensive and up-to-date review of literature on cognition, reasoning, intelligence, and other formative areas specific to this field. Written for advanced undergraduates, researchers, and academics, this volume is a necessary reference for beginning and established investigators in cognitive and ...

  10. Complex cognition: the science of human reasoning, problem-solving, and

    Under the headline "complex cognition", we summarize mental activities such as thinking, reasoning, problem - solving, and decision - making that typically rely on the combination and interaction of more elementary processes such as perception, learning, memory, emotion, etc. (cf. Sternberg and Ben-Zeev 2001 ).

  11. Ch 8: Thinking and Language

    Thinking and Problem-Solving Learning Objectives Distinguish between concepts and prototypes Explain the difference between natural and artificial concepts Describe problem solving strategies, including algorithms and heuristics Explain some common roadblocks to effective problem solving What is Cognition? Cognition

  12. What Is Cognition?

    Cognitive psychology is the field of psychology dedicated to examining how people think. It attempts to explain how and why we think the way we do by studying the interactions among human thinking, emotion, creativity, language, and problem solving, in addition to other cognitive processes.

  13. PDF THINKING AND PROBLEM SOLVING

    Human thinking, and in particular, the human ability to solve complex, real-life problems contributes more than any other human ability to the development of human culture and the growth and development of human life on earth.

  14. 7.1 What is Cognition?

    cognition: thinking, including perception, learning, problem solving, judgment, and memory. cognitive psychology: field of psychology dedicated to studying every aspect of how people think. concept: category or grouping of linguistic information, objects, ideas, or life experiences.

  15. Thinking and Intelligence

    Simply put, cognition is thinking, and it encompasses the processes associated with perception, knowledge, problem solving, judgment, language, and memory.

  16. PDF CHAPTER 1 Thinking and Reasoning: A Reader's Guide

    Keith J. Holyoak Robert G. Morrison. "Cogito, ergo sum," the French philosopher Ren ́e Descartes famously declared, "I think, therefore I am.". Every normal human adult shares a sense that the ability to think, to rea-son, is a part of their fundamental identity. A person may be struck blind or deaf, yet we still recognize his or her ...

  17. Thinking & Reasoning

    Thinking & Reasoning is dedicated to the understanding of human thought processes, with particular emphasis on studies on reasoning, decision-making, and problem-solving. Whilst the primary focus is on psychological studies of thinking, contributions are welcome from philosophers, artificial intelligence researchers and other cognitive scientists whose work bears upon the central concerns of ...

  18. Computer Simulation of Human Thinking and Problem Solving

    sense, of the corresponding human processes. These theories are testable in a number of ways, among them, by comparing the symbolic behavior of a computer so programmed with the symbolic behavior of a human subject when both are performing the same problem-solving or thinking tasks. THE GENERAL PROBLEM SOLVER

  19. Frontiers

    It also implies effective communication and the development of problem solving skills (Saiz and Rivas, 2008, 2012, 2016). Critical thinking is characterized by generating higher-level cognitive processing in people, centered on the skills of reflecting, comprehension, evaluation and creation. It therefore requires high intellectual development.

  20. Metacognition's Role in Decision Making

    Metacognition can help us to think outside the box. More complex decisions require problem-solving, strategies, re-framing, creative thinking, and possibly seeking advice from others.In addition ...

  21. Thinking and problem solving: An introduction to human cognition and

    Citation. Mayer, R. E. (1977). Thinking and problem solving: An introduction to human cognition and learning. Scott, Foresman. Abstract. Text: book; for ...

  22. Computer Simulation of Human Thinking

    Human Thinking A theory of problem solving expressed as a computer program permits simulation of thinking processes. Allen Newell and Herbert A. Simon The path of scientific investigation in any field of knowledge records a response to two opposing pulls. On the one side, a powerful attraction is ex-erted by "good problems"-questions

  23. Thinking and problem solving: An introduction to human cognition and

    2016. TLDR. This paper provides a detailed account of how a complex network of pedagogies, theories, and technologies were brought together to design and develop the GDL program, with the purpose of teaching students basics of computer programming, and to give them hands-on experiences in game-design, and teach them complex problem-solving skills.

  24. Impact of artificial intelligence on human loss in decision making

    6 Citations 70 Altmetric Metrics A Correction to this article was published on 29 June 2023 This article has been updated Abstract This study examines the impact of artificial intelligence (AI) on...

  25. Human Thinking and Problem Solving

    Problem solving is a mental process and considered to be one of the higher levels of cognitive processes. In 1966, Gary A. Davis stated in one of his studies that research in human problem solving has gained a good reputation for being the most disorderly of all recognizable categories of human learning.

  26. PDF Fostering Problem Solving and Critical Thinking in Mathematics Through

    We will show how problem solving and critical thinking continue to be pivotal in solving mathematical problems, even if this is performed with the aid of AI, since the AI can often help the user up to a certain ... Wang, Y. and Chiew, V., 2010. On the cognitive process of human problem solving. Cognitive Systems Research, 11(1), pp. 81-92, DOI ...

  27. Find the AI Approach That Fits the Problem You're Trying to Solve

    Summary. AI moves quickly, but organizations change much more slowly. What works in a lab may be wrong for your company right now. If you know the right questions to ask, you can make better ...

  28. Meridith May Curtis on Instagram: "Unlocking creativity is like opening

    9 likes, 0 comments - the_curtis_mom on February 5, 2024: "Unlocking creativity is like opening a door to endless possibilities. Today, let's talk..."