Richard C. Jeffrey
Born: August 5, 1926
Died: November 9, 2002
Alma mater: Princeton University
Era: 20th-century philosophy
Region: Western philosophy
School: Analytic philosophy
Main interests: Decision theory, epistemology
Notable ideas: Radical probabilism, Jeffrey conditioning, truth tree method for syllogism testing[1]

Richard Carl Jeffrey (August 5, 1926 – November 9, 2002) was an American philosopher, logician, and probability theorist. He is best known for developing and championing the philosophy of radical probabilism and the associated heuristic of probability kinematics, also known as Jeffrey conditioning.

Life and career

Born in Boston, Massachusetts, Jeffrey served in the U.S. Navy during World War II. As a graduate student he studied under Rudolf Carnap and Carl Hempel.[2] He received his M.A. from the University of Chicago in 1952 and his Ph.D. from Princeton in 1957. After holding academic positions at MIT, City College of New York, Stanford University, and the University of Pennsylvania, he joined the faculty of Princeton in 1974 and became a professor emeritus there in 1999. He was also a visiting professor at the University of California, Irvine.[3]

Jeffrey, who died of lung cancer at the age of 76, was known for his sense of humor, which often came through in his breezy writing style. In the preface of his posthumously published Subjective Probability, he refers to himself as "a fond foolish old fart dying of a surfeit of Pall Malls".[4]

Philosophical work

As a philosopher, Jeffrey specialized in epistemology and decision theory. He is perhaps best known for defending and developing the Bayesian approach to probability.

Jeffrey also wrote, or co-wrote, two widely used and influential logic textbooks: Formal Logic: Its Scope and Limits, a basic introduction to logic, and Computability and Logic, a more advanced text dealing with, among other things, the famous negative results of twentieth-century logic such as Gödel's incompleteness theorems and Tarski's indefinability theorem.

Radical probabilism

Main article: Radical probabilism

In frequentist statistics, Bayes' theorem provides a useful rule for updating a probability when new frequency data becomes available. In Bayesian statistics, the theorem itself plays a more limited role: it connects probabilities that are held simultaneously, but it does not tell the learner how to update probabilities when new evidence becomes available over time. This subtlety was first made explicit by Ian Hacking in 1967.[5]

It is tempting, however, to adapt Bayes' theorem and adopt it as a rule of updating. Suppose that a learner forms probabilities Pold(A & B) = p and Pold(B) = q. If the learner subsequently learns that B is true, nothing in the axioms of probability or the results derived therefrom tells him how to behave. He might nevertheless, by analogy with Bayes' theorem, set his Pnew(A) = Pold(A | B) = p/q.
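The update described above can be made concrete with a short sketch; the numbers here are illustrative assumptions, not values from the text:

```python
# The learner's prior opinions, as in the text: P_old(A & B) = p and
# P_old(B) = q. The values below are purely illustrative.
p = 0.3   # P_old(A & B)
q = 0.5   # P_old(B)

# On learning that B is certainly true, the learner adopts Bayes' rule of
# updating by analogy: P_new(A) = P_old(A | B) = P_old(A & B) / P_old(B).
p_new_A = p / q
print(p_new_A)  # 0.6
```

Note that this step presupposes the new evidence B is learned with certainty; that presupposition is exactly what the rest of the section questions.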

In fact, that step, Bayes' rule of updating, can be justified as both necessary and sufficient through a dynamic Dutch book argument over and above the arguments used to justify the axioms. This argument was first put forward by David Lewis in the 1970s, though he never published it.[6]

That works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain".[7] There must, on Lewis' account, be some certain facts on which probabilities were conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, can ever be certain, if that. Jeffrey famously rejected Lewis' dictum and quipped, "It's probabilities all the way down." He called this position radical probabilism.

When the new evidence is itself uncertain, Bayes' rule cannot capture a mere subjective change in the probability of some critical fact. The new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was.[8]

Pnew(A) = Pold(A | B)Pnew(B) + Pold(A | not-B)Pnew(not-B)

Adopting such a rule is sufficient to avoid a Dutch book but not necessary.[9] Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
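The rule above can be sketched for the two-cell partition {B, not-B}; the probabilities used here are illustrative assumptions, not values from the text:

```python
# A minimal sketch of Jeffrey conditioning (probability kinematics).
# The learner's old conditional opinions (illustrative values):
p_old_A_given_B = 0.9      # P_old(A | B)
p_old_A_given_notB = 0.2   # P_old(A | not-B)

# Experience shifts the learner's probability of B without making it
# certain -- the new evidence need not be articulable as a proposition:
p_new_B = 0.7              # P_new(B)

# Jeffrey's rule: mix the old conditional probabilities by the new weights.
p_new_A = p_old_A_given_B * p_new_B + p_old_A_given_notB * (1 - p_new_B)
print(p_new_A)  # 0.69
```

In the limiting case Pnew(B) = 1, the mixture collapses to Pold(A | B), so Jeffrey conditioning reduces to ordinary Bayesian conditioning when the evidence is certain.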

It is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Brian Skyrms' principle of reflection.

Jeffrey conditioning can be generalized from partitions to arbitrary condition events by giving it a frequentist semantics.[10]



  1. ^ Richard Jeffrey, John P. Burgess (editor), Formal Logic: Its Scope and Limits (4th ed.), Hackett Publishing, 2006, p. 21; cf. Wayne Grennan, Informal Logic: Issues and Techniques, McGill-Queen's University Press, 1997, p. 108.
  2. ^ Jeffrey, Richard. "A Proposal to the National Science Foundation for Support of Research on Carnap's Inductive Logic" (PDF). Richard Jeffery's Papers. Special Collections Department, University of Pittsburgh. Retrieved September 17, 2013.
  3. ^ Princeton University Department of Philosophy. "Richard C. Jeffrey". Retrieved July 11, 2017.
  4. ^ Jeffrey, Subjective Probability, p. xii.
  5. ^ Hacking, Ian (1967). "Slightly more realistic personal probability". Philosophy of Science. 34 (4): 311–325. doi:10.1086/288169. S2CID 14344339.
  6. ^ Skyrms, Brian (1987). "Dynamic coherence and probability kinematics". Philosophy of Science. 54: 1–20. doi:10.1086/289350. S2CID 120881078.
  7. ^ Lewis, C. I. (1946). An Analysis of Knowledge and Valuation. La Salle, Illinois: Open Court. p. 186.
  8. ^ Jeffrey, Richard (1987). "Alias Smith and Jones: The testimony of the senses". Erkenntnis. 26 (3): 391–399. doi:10.1007/bf00167725. S2CID 121478331.
  9. ^ Skyrms (1987)
  10. ^ Draheim, Dirk (2017). "Generalized Jeffrey Conditionalization (A Frequentist Semantics of Partial Conditionalization)". Springer. Retrieved December 19, 2017.