The moralistic fallacy is the informal fallacy of assuming that an aspect of nature cannot exist because it would have socially unpleasant consequences. Its typical form is "if X were true, then Z would follow", where Z is something morally, socially, or politically undesirable; what ought to be moral is assumed a priori to be naturally occurring. The moralistic fallacy is sometimes presented as the inverse of the naturalistic fallacy, though it can also be seen as a variation of the same underlying error. The difference between them is pragmatic, depending on the intentions of the person committing it: the naturalistic fallacy justifies existing social practices on the grounds that they are natural, while the moralistic fallacy combats existing social practices by denying that they are natural.
Steven Pinker writes that "[t]he naturalistic fallacy is the idea that what is found in nature is good. It was the basis for social Darwinism, the belief that helping the poor and sick would get in the way of evolution, which depends on the survival of the fittest. Today, biologists denounce the naturalistic fallacy because they want to describe the natural world honestly, without people deriving morals about how we ought to behave (as in: If birds and beasts engage in adultery, infanticide, cannibalism, it must be OK)." Pinker goes on to explain that "[t]he moralistic fallacy is that what is good is found in nature. It lies behind the bad science in nature-documentary voiceovers: lions are mercy-killers of the weak and sick, mice feel no pain when cats eat them, dung beetles recycle dung to benefit the ecosystem and so on. It also lies behind the romantic belief that humans cannot harbor desires to kill, rape, lie, or steal because that would be too depressing or reactionary."
Sometimes basic scientific findings or interpretations are rejected, or their discovery, development or acknowledgement is opposed or restricted, through assertions of potential misuse or harmfulness.
In the late 1970s, Bernard Davis applied the term moralistic fallacy in its present sense, responding to growing political and public calls to restrict basic research (as opposed to applied research) amid criticisms of dangerous knowledge (as opposed to dangerous applications). (The term had appeared as early as 1957, though with at least somewhat different import.)
In natural science, the moralistic fallacy can result in the rejection or suppression of basic science, whose goal is understanding the natural world, on account of its potential misuse in applied science, whose goal is the development of technology or technique. This conflates scientific assessment, which is the province of the natural sciences (such as physics and biology), with significance assessment, which is weighed in the social sciences (such as social psychology, sociology, and political science) and the behavioral sciences (such as psychology).
Davis asserted that in basic science, the descriptive, explanatory, and thus predictive power of information is primary, not its origin or its applications, since knowledge cannot be insured against misuse, and misuse cannot falsify knowledge. Both the misuse of scientific work and the suppression of scientific knowledge can have undesirable effects. The development of quantum physics in the early 20th century made the atomic bomb possible by mid-century; yet without quantum physics, much of the technology of communications and imaging might have been impossible.
Scientific theories with abundant research support can still be discarded in public debate, where general agreement is decisive but may be utterly mistaken. The obligation of basic scientists to inform the public, however, can be stymied by contrary claims from others, some rousing alarm and some touting assurances of public protection. Davis argued that greater and clearer public familiarity with the uses and limitations of science would more effectively prevent the misuse of knowledge and its harms.
Natural science can help humans understand the natural world, but it cannot make policy, moral, or behavioral decisions. Questions involving values—what people should do—are more effectively addressed through discourse in social sciences, not by restriction of basic science. Misunderstanding of the potential of science, and misplaced expectations, have resulted in moral and decisionmaking impediments, but suppressing science is unlikely to resolve these dilemmas.
The Seville Statement on Violence was adopted, in Seville, Spain, on 16 May 1986, by an international meeting of scientists convened by the Spanish National Commission for UNESCO. UNESCO adopted the statement, on 16 November 1989, at the twenty-fifth session of its General Conference. The statement purported to refute "the notion that organized human violence is biologically determined".
Some, including Steven Pinker, have criticized the Seville Statement as an example of the moralistic fallacy. Research in evolutionary psychology and neuropsychology suggests that human violence has biological roots.