The alphabet of human thought (Latin: alphabetum cogitationum humanarum) is a concept originally proposed by Gottfried Wilhelm Leibniz that provides a universal way to represent and analyze ideas and relationships by breaking them down into their component pieces.[1] All ideas are compounded from a small number of simple ideas, each of which can be represented by a unique character.[2][3]
Overview
Logic was Leibniz's earliest philosophical interest, going back to his teens. René Descartes had suggested that the lexicon of a universal language should consist of primitive elements.[4] The systematic combination of these elements, according to syntactical rules, would generate the unbounded range of structures required to represent human language. In this way Descartes and Leibniz were precursors to computational linguistics as defined by Noam Chomsky.[5]
In the early 18th century, Leibniz outlined his characteristica universalis, an artificial language in which grammatical and logical structure would coincide, allowing reasoning to be reduced to calculation. Leibniz acknowledged the work of Ramon Llull, particularly the Ars generalis ultima (1305), as one of the inspirations for this idea. The basic elements of his characteristica would be pictographic characters unambiguously representing a limited number of elementary concepts. Leibniz called the inventory of these concepts "the alphabet of human thought." There are quite a few mentions of the characteristica in Leibniz's writings, but he never set out any details save for a brief outline of some possible sentences in his Dissertation on the Art of Combinations.
His main interest was what is known in modern logic as classification and composition. In modern terminology, Leibniz's alphabet was a proposal for an automated theorem prover or ontology classification reasoner written centuries before the technology to implement them.[6]
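Leibniz did sketch one concrete mechanism for such a calculus: assigning "characteristic numbers" to concepts, with primitive concepts given primes and compound concepts given the product of their primitives, so that containment of one concept in another reduces to divisibility. The sketch below illustrates that idea in modern code; the particular concept names and the function names are illustrative choices, not Leibniz's own notation.

```python
from math import prod

# Hypothetical primitive concepts, each assigned a distinct prime
# (Leibniz's "characteristic numbers"; the names here are illustrative).
PRIMITIVES = {"rational": 2, "animal": 3, "featherless": 5, "biped": 7}

def characteristic(concepts):
    """A compound concept's number is the product of its primitives' primes."""
    return prod(PRIMITIVES[c] for c in concepts)

def contains(subject, predicate):
    """'Every S is P' holds iff P's number divides S's number."""
    return subject % predicate == 0

human = characteristic(["rational", "animal"])  # 2 * 3 = 6
animal = characteristic(["animal"])             # 3

print(contains(human, animal))  # True: every human is an animal
print(contains(animal, human))  # False: not every animal is human
```

Checking a categorical proposition thus becomes an arithmetic operation, which is the sense in which the alphabet anticipates mechanical reasoning.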
Semantic web implementation
John Giannandrea, co-founder and CTO of Metaweb Technologies, acknowledged in a 2008 talk that Freebase was at least inspired by, if not an implementation of, the alphabet of human thought.[7]
References
- ^ Leibniz, De Alphabeto cogitationum humanorum (April 1679 to April 1681 (?)), Akademie VI.4, p. 270.
- ^ Geiger, Richard A.; Rudzka-Ostyn, Brygida, eds. (1993). Conceptualizations and mental processing in language. International Cognitive Linguistics Conference (1 : 1989 : Duisburg). Walter de Gruyter. pp. 25–26. ISBN 978-3-11-012714-0.
- ^ Bunnin, Nicholas; Jiyuan Yu (2004). The Blackwell Dictionary of Western Philosophy. Blackwell Publishing. p. 715. ISBN 978-1-4051-0679-5.
- ^ Hatfield, Gary (3 December 2008). "René Descartes, The Stanford Encyclopedia of Philosophy (Summer 2014 Edition)". plato.stanford.edu. Stanford University. Retrieved 12 July 2014.
he offered a new vision of the natural world that continues to shape our thought today: a world of matter possessing a few fundamental properties and interacting according to a few universal laws.
- ^ Chomsky, Noam (13 April 2000). New Horizons in the Study of Language and Mind (Kindle ed.). Cambridge University Press. pp. 425–428. ISBN 0521658225.
I mentioned that modern generative grammar has sought to address concerns that animated the tradition; in particular, the Cartesian idea that "the true distinction" (Descartes 1649/1927: 360) between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without any finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice.
- ^ Russell, L.J. (1985). "Leibniz, Gottfried Wilhelm". In Paul Edwards (ed.). The Encyclopedia of Philosophy Volumes 3 and 4. Macmillan Publishing. pp. 422–423. ASIN B0017IMQME.
his main emphasis... was on classification, deduction was a natural consequence of combining classified items into new classes.
- ^ "PARCForum Presentation by Giannandrea, J." YouTube. 12 September 2012. min 37+. Retrieved 30 October 2015.