"In principle you could use combinators," some footnote might say. But the implication tends to be "But you probably don't want to." And, yes, combinators are deeply abstract-and in many ways hard to understand. But tracing their history over the hundred years since they were invented, I've come to realize just how critical they've actually been to the development of our modern conception of computation-and indeed my own contributions to it.

The idea of representing things in a formal, symbolic way has a long history. In antiquity there was Aristotle's logic and Euclid's geometry. By the 1400s there was algebra, and in the 1840s Boolean algebra. Each of these was a formal system that allowed one to make deductions purely within the system. But each, in a sense, ultimately viewed itself as being set up to model something specific. Logic was for modeling the structure of arguments, Euclid's geometry the properties of space, algebra the properties of numbers; Boolean algebra aspired to model the "laws of thought".

But was there perhaps some more general and fundamental infrastructure: some kind of abstract system that could ultimately model or represent anything? Today we understand that's what computation is. And it's becoming clear that the modern conception of computation is one of the single most powerful ideas in all of intellectual history-whose implications are only just beginning to unfold.

But how did we finally get to it? Combinators had an important role to play, woven into a complex tapestry of ideas stretching across more than a century.

The main part of the story begins in the 1800s. Through the course of the 1700s and 1800s mathematics had developed a more and more elaborate formal structure that seemed to be reaching ever further. But what really was mathematics? Was it a formal way of describing the world, or was it something else-perhaps something that could exist without any reference to the world?

Developments like non-Euclidean geometry, group theory and transfinite numbers made it seem as if meaningful mathematics could indeed be done just by positing abstract axioms from scratch and then following a process of deduction. But could all of mathematics actually just be a story of deduction, perhaps even ultimately derivable from something seemingly lower level-like logic?

But if so, what would things like numbers and arithmetic be? Somehow they would have to be "constructed out of pure logic". Today we would recognize these efforts as "writing programs" for numbers and arithmetic in a "machine code" based on certain "instructions of logic". But back then, everything about this and the ideas around it had to be invented.

What Is Mathematics-and Logic-Made Of?

Before one could really dig into the idea of "building mathematics from logic" one had to have ways to "write mathematics" and "write logic".

At first, everything was just words and ordinary language. But by the end of the 1600s mathematical notation like +, =, > had been established. For a while new concepts-like Boolean algebra-tended to just piggyback on existing notation. By the end of the 1800s, however, there was a clear need to extend and generalize how one wrote mathematics.

In addition to algebraic variables like x, there was the notion of symbolic functions f, as in f(x). In logic, there had long been the idea of letters (p, q, …) standing for propositions ("it is raining now"). In addition, in analogy to symbolic functions in mathematics, there were symbolic logical predicates: not just explicit statements like x > y but also ones like p(x, y) for symbolic p. But now there needed to be notation for quantifiers ("for all x such-and-such", or "there exists x such that…").
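The idea of numbers "constructed out of pure logic"-programs for arithmetic written in a "machine code" of pure symbolic functions-can be glimpsed in modern terms through Church numerals. The following sketch (my illustration, not part of the original text; the names `zero`, `succ`, etc. are my own) represents the number n as a function that applies another function n times, so that arithmetic becomes pure function application:

```python
# A sketch of "numbers from pure logic": a Church numeral n is a
# function that applies f to x exactly n times.

zero = lambda f: lambda x: x                      # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application of f

def church_add(m, n):
    # m + n: apply f n times, then m more times
    return lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode a Church numeral back into an ordinary integer
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(church_add(two, two)))  # prints 4
```

Everything here is built from function abstraction and application alone-no built-in numbers are needed except to decode the result for display.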