Self-deception is a process of denying or rationalizing away the relevance, significance, or importance of opposing evidence and logical argument. Self-deception involves convincing oneself of a truth (or lack of truth) so that one does not reveal any self-knowledge of the deception.
While Freudian analysis of the conscious and unconscious minds dominated the field, psychological scientists in the 1970s became increasingly curious about how those two seemingly separate worlds could work together. Because this line of research lacked mechanistic models, however, the debate was put on pause. The focus later shifted to vision-related research in social psychology.
The traditional paradigm of self-deception is modeled after interpersonal deception, in which A intentionally gets B to believe some proposition p, all the while knowing or truly believing ¬p (not p). Such deception is intentional and requires the deceiver to know or believe ¬p and the deceived to believe p. On this traditional model, self-deceivers must (1) hold contradictory beliefs and (2) intentionally get themselves to hold a belief they know or truly believe to be false.
The process of rationalization, however, can obscure the intent of self-deception. Brian McLaughlin illustrates that in certain circumstances such rationalizations permit the phenomenon: when a person who disbelieves p intentionally tries to make himself believe, or continue believing, p, and as a result unintentionally misleads himself into believing or continuing to believe p via biased thinking, he deceives himself in a way appropriate for self-deception. No deceitful intention is required for this.
Self-deception calls into question the nature of the individual, specifically in a psychological context and with respect to the nature of the "self". Irrationality is the foundation from which the argued paradoxes of self-deception stem, and it has been argued that not everyone has the "special talents" and capacities for self-deception. Rationalization, however, is influenced by a myriad of factors, including socialization, personal biases, fear, and cognitive repression. Such rationalization can be manipulated in both positive and negative fashions, convincing a person to perceive a negative situation optimistically, and vice versa. Yet rationalization alone cannot fully explain the dynamics of self-deception, as reason is just one adaptive form that mental processes can take.
The works of philosopher Alfred R. Mele have provided insight into some of the more prominent paradoxes regarding self-deception. Two of these concern the self-deceiver's state of mind and the dynamics of self-deception, termed the "static" paradox and the "dynamic/strategic" paradox, respectively.
Mele formulates an example of the "static" paradox as the following:
If ever a person A deceives a person B into believing that something, p, is true, A knows or truly believes that p is false while causing B to believe that p is true. So when A deceives A (i.e., himself) into believing that p is true, he knows or truly believes that p is false while causing himself to believe that p is true. Thus, A must simultaneously believe that p is false and believe that p is true. But how is this possible?
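Mele's reasoning above can be sketched in doxastic-logic notation; here the operator B_A(p), abbreviating "A believes that p", is an illustrative assumption for exposition and not Mele's own formalism:

```latex
% Illustrative sketch of the "static" paradox.
% B_A(p) abbreviates "A believes that p" (notation assumed, not Mele's own).
\begin{align*}
\text{Interpersonal deception:}\quad & B_A(\neg p) \;\wedge\; A \text{ causes } B_B(p)\\
\text{Self-deception } (B = A):\quad & B_A(\neg p) \;\wedge\; A \text{ causes } B_A(p)\\
\text{Hence the paradoxical state:}\quad & B_A(\neg p) \,\wedge\, B_A(p)
\end{align*}
```

The last line makes the tension explicit: the same agent must simultaneously hold a belief and its negation.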
Mele then describes the "dynamic/strategic" paradox:
In general, A cannot successfully employ a deceptive strategy against B if B knows A's intention and plan. This seems plausible as well when A and B are the same person. A potential self-deceiver's knowledge of his intention and strategy would seem typically to render them ineffective. On the other hand, the suggestion that self-deceivers typically successfully execute their self-deceptive strategies without knowing what they are up to may seem absurd; for an agent's effective execution of his plans seems generally to depend on his cognizance of them and their goals. So how, in general, can an agent deceive himself by employing a self-deceptive strategy?
These models call into question how one can simultaneously hold contradictory beliefs ("static" paradox) and deceive oneself without rendering one's intentions ineffective ("dynamic/strategic" paradox). Attempts at a resolution to these have created two schools of thought: one that maintains that paradigmatic cases of self-deception are intentional and one that denies the notion—intentionalists and non-intentionalists, respectively.
Intentionalists tend to agree that self-deception is intentional, but divide over whether it requires the holding of contradictory beliefs. This school of thought incorporates elements of temporal partitioning (extended over time to benefit the self-deceiver, increasing the chance of forgetting the deception altogether) and psychological partitioning (incorporating various aspects of the "self").
Non-intentionalists, in contrast, tend to believe that cases of self-deception are not necessarily accidental, but motivated by desire, anxiety, or some other emotion regarding p or related to p. This notion distinguishes self-deception from misunderstanding. Furthermore, "wishful thinking" is distinguished from self-deception in that the self-deceivers recognize evidence against their self-deceptive belief or possess, without recognizing, greater counterevidence than wishful thinkers.
Numerous questions and debates remain in play with respect to the paradoxes of self-deception, and a consensus paradigm has yet to emerge.
It has been theorized that humans are susceptible to self-deception because most people have emotional attachments to beliefs, which in some cases may be irrational. Some evolutionary biologists, such as Robert Trivers, have suggested that deception plays a significant role in human behavior, and in animal behavior more generally. One deceives oneself into trusting something that is not true so as to better convince others of that truth: a person who is convinced of the untrue thing better masks the signs of deception. Trivers, along with two colleagues (Daniel Kriegman and Malcolm Slavin), applied his theory of "self-deception in the service of deception" to explain how Donald Trump was able to employ the "big lie" with such great success.
This notion is based on the following logic: deception is a fundamental aspect of communication in nature, both between and within species, and it has evolved so that one organism can gain an advantage over another. From alarm calls to mimicry, animals use deception to further their survival, and those who are better able to perceive deception are more likely to survive. As a result, self-deception evolved to better mask deception from those who perceive it well or, as Trivers puts it, "hiding the truth from yourself to hide it more deeply from others." In humans, awareness of the fact that one is acting deceptively often produces tell-tale signs of deception, such as flaring nostrils, clammy skin, changes in the quality and tone of voice, eye movement, or excessive blinking. Therefore, if self-deception enables individuals to believe their own distortions, they will not present such signs of deception and will therefore appear to be telling the truth.
Self-deception can be used to make oneself appear either greater or lesser than one actually is. For example, one can act overconfident to attract a mate or under-confident to avoid a threat such as a predator. Individuals who can conceal their true feelings and intentions well are more likely to deceive others successfully.
It may also be argued that the ability to deceive, or to self-deceive, is not the selected trait but rather a by-product of a more primary trait: abstract thinking. Abstract thinking allows many evolutionary advantages, such as more flexible, adaptive behaviors, leading to innovation. Since a lie is an abstraction, the mental process of creating it can occur only in animals with enough brain complexity to permit abstract thinking. Moreover, self-deception lowers cognitive cost: once one is convinced that the false thing is true, it is less complicated to behave and think accordingly, since the mind need not constantly juggle the true thing and the false thing but is simply convinced that the false thing is true.
Because deceit exists, there is strong selection pressure to recognize when deception occurs. As a result, self-deception evolves so as to better hide the signs of deception from others: humans deceive themselves in order to better deceive others and thus gain an advantage over them. In the three decades since Trivers introduced his adaptive theory of self-deception, there has been an ongoing debate over the genetic basis of such behavior.
The explanation of deception and self-deception as innate characteristics may be true, but there are many other explanations for this pattern of behavior. It is possible that the ability to self-deceive is not innate but a learned trait, acquired through experience. For example, a person might be caught being deceitful because they revealed their knowledge of information they were trying to hide: their nostrils flared, indicating that they were lying, and they did not get what they wanted. The next time, to better achieve success, the person will more actively deceive themselves about having the knowledge, the better to hide the signs of deception. People could therefore have the capacity to learn self-deception. However, the fact that something is learned does not mean that it is not innate; what is learned and what is innate work in conjunction, as many introductory textbooks in evolutionary psychology outline. For example, the concept of preparedness in learning explains why some behaviors are more easily learned than others, and evolutionary psychologists argue that there are evolved mechanisms that allow such learning to occur.
Self-deception has a prominent role in several medical conditions, such as borderline personality disorder, narcissistic personality disorder, and histrionic personality disorder.
Simple instances of self-deception include common occurrences such as the alcoholic who is self-deceived in believing that his drinking is under control, the husband who is self-deceived in believing that his wife is not having an affair, and the jealous colleague who is self-deceived in believing that her colleague's greater professional success is due to ruthless ambition.
An example of self-deception was provided by Robert Trivers and Huey P. Newton, who published an analysis of the role of flight-crew self-deception in the crash of Air Florida Flight 90.
The claim that being unconscious of one's deception would reduce the body-language signs of lying has been criticized as incompatible with the unconscious nature of body language, since body language gives away precisely such non-conscious processes. It has also been criticized for failing to explain why evolutionary selection for lying would permit a body language that gives lying away at all, rather than simply selecting against such signals.
The notion that non-conscious deception would be less costly than conscious deception has also been criticized, on the grounds that a non-conscious lie followed by the creation of a conscious confabulation would amount to more, not fewer, brain processes than simply making up a conscious lie.
The concept of self-deception has been criticized for being able to classify any criticism of the notion as itself self-deception, which removes its falsifiability and thereby makes it unscientific. It has also been called an obstacle to science in general, since it can classify anything as self-deception in a self-confirming way that is not self-correcting.
The assumption that individuals who derive pleasure from hurting others would self-deceive into believing that their victims were not hurt has been criticized for contradicting its own premise: if the individual did enjoy knowing that the victim was hurt, such self-deception would reduce, not increase, the pleasure.