The alphabet of human thought (Latin: alphabetum cogitationum humanarum) is a concept originally proposed by Gottfried Wilhelm Leibniz: a universal way to represent and analyze ideas and relationships by breaking them down into their constituent pieces. On this view, all ideas are compounded from a small number of simple ideas, each of which can be represented by a unique character.
Logic was Leibniz's earliest philosophical interest, going back to his teens. René Descartes had suggested that the lexicon of a universal language should consist of primitive elements; the systematic combination of these elements, according to syntactical rules, would generate the unbounded variety of structures required to represent human language. In this way Descartes and Leibniz were precursors to computational linguistics as defined by Noam Chomsky.
In the early 18th century, Leibniz outlined his characteristica universalis, an artificial language in which grammatical and logical structure would coincide, which would allow reasoning to be reduced to calculation. Leibniz acknowledged the work of Ramon Llull, particularly the Ars generalis ultima (1305), as one of the inspirations for this idea. The basic elements of his characteristica would be pictographic characters representing unambiguously a limited number of elementary concepts. Leibniz called the inventory of these concepts "the alphabet of human thought." There are quite a few mentions of the characteristica in Leibniz's writings, but he never set out any details save for a brief outline of some possible sentences in his Dissertation on the Art of Combinations.
His main interest was in what modern logic calls classification and composition. In modern terms, Leibniz's alphabet was a proposal for an automated theorem prover or an ontology-classification reasoner, conceived centuries before the technology existed to implement either.
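The idea of classification by calculation can be made concrete with the prime-number encoding Leibniz experimented with in his logical papers: assign each simple concept a distinct prime, let a compound concept be the product of its parts' primes, and "every A is a B" reduces to divisibility. The following is a minimal illustrative sketch, not Leibniz's actual notation; the particular primitives and concept names are hypothetical.

```python
# Hypothetical primitives: each simple concept gets a distinct prime.
SIMPLE = {"animal": 2, "rational": 3, "winged": 5}

def compound(*parts):
    """Characteristic number of a concept compounded from simple parts."""
    n = 1
    for p in parts:
        n *= SIMPLE[p]
    return n

def is_a(a, b):
    """Classification by calculation: A falls under B iff B's number divides A's."""
    return a % b == 0

human = compound("animal", "rational")  # 2 * 3 = 6
bird = compound("animal", "winged")     # 2 * 5 = 10

print(is_a(human, SIMPLE["animal"]))  # True: every human is an animal
print(is_a(human, bird))              # False: humans do not fall under bird
```

Deduction here is exactly the "natural consequence" of composition that the closing quotation below describes: once concepts are built by multiplying classified parts, subsumption checks require no further reasoning, only arithmetic.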
John Giannandrea, co-founder and CTO of Metaweb Technologies, acknowledged in a speech that Freebase was, if not an implementation of the alphabet of human thought, at least connected to it.
he offered a new vision of the natural world that continues to shape our thought today: a world of matter possessing a few fundamental properties and interacting according to a few universal laws.
I mentioned that modern generative grammar has sought to address concerns that animated the tradition; in particular, the Cartesian idea that "the true distinction" (Descartes 1649/1927: 360) between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without any finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice.
his main emphasis... was on classification, deduction was a natural consequence of combining classified items into new classes.