I discovered Geometric Algebra (GA) back in 2003, and it caught my attention immediately. In all my years as a student, engineer, researcher, and teacher, I have never met a symbolic mathematical system so beautifully close to geometric abstractions. In this post, I try to explain how Geometric Algebra can express, unify, and generalize many of the geometric abstractions we use as engineers and computer scientists.
In the previous post we saw how our languages and ideas about the computational universe are best expressed either in a general-purpose language with balanced human and computational aspects, like Python, or in a domain-specific language carefully crafted for our particular domain of abstractions. In this post, we will focus our attention on a specific domain of computation, geometric computing, and on the design of geometry-specific programming languages.
Computing is done by programming computers; programming requires programming languages, and programming languages come in many forms and flavors. The creative process of software development, in general, is intimately tied to language, thought, and imagination. For geometric modeling and geometric processing applications, choosing the right programming language is absolutely fundamental.
There are three kinds of science: the experimental, the theoretical, and the simulated. The third kind of scientific activity appeared only recently, about 75 years ago, when the first electronic computers were built, effectively creating the “human computational universe” and upgrading our scientific methods to a whole new level. The idea behind this third kind of science is to computationally and visually investigate our theoretical mathematical models, encoded as computer programs and executed on various sets of inputs, to obtain new patterns, ideas, and “virtual” discoveries that can later be verified experimentally, or that at least provide grounds for new abstractions, theories, and practical applications. This third kind of science, the science and art of computer simulation, is now unavoidable in all scientific research and education. All of this is made possible using only the two numbers 1 and 0, a.k.a. true and false.
After our journey with classic numbers in part one and geometric numbers in part two, in this final part of our functional history of numbers we will take a look at a third kind of number: computational numbers.
In part one of this functional history of numbers we saw the development of the various number systems most of us are familiar with. In this part, we will see the development of several number systems that are important for our modern scientific needs, both geometrically and computationally. The sad fact about these developments is that today we use and teach less effective number systems because of a “series of unfortunate events” that took place during the grand drama of humanity’s development of modern mathematics.
The main goal of this post is to link Geometric Algebra to mathematics at the fundamental level of numbers. Here I briefly describe the history of numbers, with emphasis on their functional role in mathematics, science, and engineering, to put into perspective the computational role GA can play.
I’ve been dealing with mathematical abstractions most of my life, first as a student and then as a software engineer and faculty member, in various forms and on various levels. My experience is like trying to find a safe path through a big forest of data and ideas that keeps growing and changing every day.