
Look, ma, no matrices

rhelz
19 replies
23h55m

Geometric Algebra was a complete mystery to me until I finally realized: it is just polynomial multiplication, but with some quantities for which the order of multiplication matters, and which have a weird multiplication table: i*i = 1, i*j = -j*i. That's it. Most intros present the geometric product of two vectors:

(x1*i + y1*j) * (x2*i + y2*j)

as some deep mysterious thing, but it's just the same FOIL polynomial multiplication you learned in freshman algebra:

(x1*i + y1*j)(x2*i + y2*j) = x1*x2*i*i + x1*y2*i*j + y1*x2*j*i + y1*y2*j*j

= (x1*x2 + y1*y2) + (x1*y2 - y1*x2)*i*j

The quantity in the first parenthesis, above, is our old familiar dot product. The quantity in the second parenthesis is our old friend the cross product, but expressed in a new dimension whose basis is i*j, and which--unlike the cross product--generalizes to any number of dimensions. In GA it's called the "wedge product".

Once you "get" that, you find that doing things like deriving rotation formulas, etc, become easy, because you can apply all the skills you developed in algebra to solving geometric problems.

skhunted
7 replies
23h46m

Your comment is interesting to me. A few days ago someone asked why math classes don't teach the how and the why and instead tend to just present the formulas for the operations and tell people to compute. Here you are focusing on what the operations do rather than on why they work. It's an interesting contrast and one that teachers of mathematics have to balance. The two questions,

How does it work?

Why does it work?

can't always both be answered well in a given course.

rhelz
1 replies
23h10m

// focusing on what the operations do rather than why //

YMMV, of course, but in general, I always found it easier to understand the why once I understood the what and the how, rather than the other way around.

skhunted
0 replies
20h23m

Yes. Usually, though, when someone complains about the teaching of mathematics they say we focus too much on how to do the operations and not enough on why they work the way they do. I agree it is much easier to understand why after knowing how.

Nevermark
1 replies
17h9m

How does it work?

Why does it work?

Where can I use it?

ducttapecrown
0 replies
16h34m

This is because "how does it work" and "why does it work" are Fourier transforms of each other!

aleph_minus_one
0 replies
8h20m

A few days ago someone asked why math classes don't teach the how and the why and instead tend to just present the formulas for the operations and tell people to compute.

Lecturers typically know the how and the why quite well, but teaching these points takes a lot of time (you also have to explain a lot about the practical application before you can explain why a given mathematical structure is helpful for the problem).

Since lecturers are typically very short of time in lectures, they teach the mathematics in a concise way and trust the students to be adults, capable of going to the library and reading textbooks about the how and why by themselves if they are interested. At least in Germany, there is the clear mentality that if you are not capable of doing this, a university is the wrong place for you; you would be better off getting vocational training instead.

ajkjk
0 replies
14h11m

I think there's a big difference between "presenting a formula" and "presenting rules for doing calculations". People are very good at extrapolating complicated results from a few simple rules: that's why taking derivatives and doing (elementary) integrals is fairly easy and also fairly easy to remember a long time after you've taken a calculus course. On the other hand, a bunch of literal miscellaneous formulas is very hard to hold on to --- for instance that's how introductory physics is taught, a bunch of disjoint relationships that you have to make sense of in your mind to make any use of.

In fact all anyone really wants for vector calculus is a bunch of "tools" they can use that will generally give the right answer if applied mindlessly. I think that's why GA is relatively popular, because it says how to do basic geometric operations (rotations, reflections, etc) without any thought.

noqc
4 replies
22h39m

One of the things that it takes the longest to learn in mathematics is that most things are defined in the dumbest way possible.

In particular, if (over a vector space V) you want to define a bilinear product m:V x V -> V, this is exactly the same thing as deciding on m just over pairs of basis vectors. If I were to call that "the universal property of the tensor product", you'd probably say "uh huh".

ndriscoll
3 replies
21h15m

It's annoyingly one of those things that once you understand, you can't see how you didn't understand it before. e.g. the tensor algebra (aka free algebra) over a vector space is "just" the dumbest possible multiplication: if i and j are basis vectors, then i*j = i⊗j. No further simplification possible. j*i*i*j = j⊗i⊗i⊗j, etc. with associativity and distributivity and linearity: (5i+3j)*k = 5i⊗k + 3j⊗k, etc.

Then, if you have a dot product, a Clifford algebra is "just" the tensor algebra with a single simple extra reduction rule: For a vector v, v*v = v⋅v. So now e.g. `kjiijl = kj(i⋅i)jl = kjjl = k(j⋅j)l = kl = k⊗l`, which can't be reduced further.

The real magic is that it turns out you can prove[0] that ij=-ji for two basis vectors i,j, so e.g. (ij)^2 = ijij = -ijji = -ii = -1. So `ij` is a square root of -1, as is `ik` and `jk` (but they're different square roots of -1), and you get things like complex numbers or quaternions for free, and suddenly a ton of geometry appears (with `ij` corresponding to a chunk of plane in the same way a basis vector `i` is a chunk of line/arrow. `ijk` becomes a chunk of volume. etc.).

But it's all "just" the dumbest possible multiplication plus v*v = v⋅v.

[0] Proof: (i+j)^2 = i^2+ij+ji+j^2 = 1+ij+ji+1 = 2+ij+ji. But also (i+j)^2 = (i+j)⋅(i+j) = i⋅(i+j) + j⋅(i+j) = 2. So 2 = 2+ij+ji, so ij=-ji.
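As a throwaway illustration of how little machinery this needs, here is a toy Python sketch (assuming an orthonormal basis; the function name is mine) that reduces a product of basis vectors written as a string using exactly those two moves: cancel equal neighbours (v*v = 1) and flip the sign when swapping distinct neighbours (ij = -ji):

```python
def reduce_blade(word):
    """Reduce a product of orthonormal basis vectors written as a string.

    Rules: ee = 1 for any basis letter e (that's v*v = v.v), and swapping
    two distinct neighbours flips the sign (ij = -ji). Returns (sign, word)
    with the surviving letters in sorted (canonical) order.
    """
    letters, sign = list(word), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(letters) - 1):
            if letters[k] == letters[k + 1]:          # v*v = 1
                del letters[k:k + 2]
                changed = True
                break
            if letters[k] > letters[k + 1]:           # anticommute: flip sign
                letters[k], letters[k + 1] = letters[k + 1], letters[k]
                sign = -sign
                changed = True
                break
    return sign, "".join(letters)

print(reduce_blade("kjiijl"))  # (1, 'kl')  -- the worked example above
print(reduce_blade("ijij"))    # (-1, '')   -- (ij)^2 = -1
```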

Nevermark
1 replies
16h54m

In your proof, doesn't i⋅(i+j) + j⋅(i+j) = (1+ij) + (ji+1) = 2 + ij + ji, not 2?

What am I missing? I think ij = -ji is an independent axiom.

ndriscoll
0 replies
16h36m

It's dot products there, which are also distributive. So i⋅i + i⋅j + j⋅i + j⋅j = 1 + 0 + 0 + 1 = 2.

We got dot products from the fact that v^2 = v⋅v for any vector v (so in particular, i+j). Then dot products are linear so you can expand that out. So basically the proof compares using FOIL on geometric products to FOIL on dot products.

Note that `ij` is not a vector; it's a tensor (or "multivector" and in particular a "blade" in the GA lingo). The dot product reduction only applies to vectors from the original space. But i, j, and i+j are vectors.

For simplicity and practicality, this is all over real vector spaces. You can make similar definitions with other fields, but you have to be careful when working mod 2.

For simplicity I'm also using an orthonormal basis. You also need to be a bit more careful if working with e.g. special relativity where your timelike basis vector t has t⋅t=-1 (though you can see things still work).

ajkjk
0 replies
14h6m

Pedantic nitpick: it's not the dumbest possible multiplication; it's the dumbest possible bilinear multiplication. There are "freer" free products on vectors than the tensor product; the freest possible one is their Cartesian product as sets, which just makes a tuple out of them: ij = (i, j). If you regarded the resulting set as a vector space, you would find that i0 + 0j = (i, 0) + (0, j) ≠ (i, j), so the product would not respect the vector space structure. The tensor product is the freest product that is well-behaved as a vector space, in the sense that it respects the underlying scalar operations from its arguments.

simpaticoder
2 replies
22h0m

In another comment, they pointed out https://youtu.be/htYh-Tq7ZBI?si=lOmsCL2DoqUCQgh1&t=1540 in which Freya did an excellent job reducing the number of axioms to 1: If you define multiplying a vector by itself to be equal to the length of the vector, squared, then everything else falls out of simple polynomial multiplication. It's quite lovely.

nh23423fefe
1 replies
19h48m

"contraction axiom" for those looking to google

xelxebar
0 replies
13h18m

Or "Clifford algebra" for the generic mathematical structure.

ww520
1 replies
15h43m

The second term is not a cross product. It's the exterior product (its value is a bivector). The cross product only works in 3D; the exterior product works in any number of dimensions.

A cross product of two 3D vectors is another vector, perpendicular to the plane containing the two vectors. The exterior product is a 2-vector (hence "bivector") that sweeps out the parallelogram between the two vectors; it lies in the plane containing them. In 3D, the cross product vector is perpendicular to the bivector's plane.
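For the 3D case specifically, here is a quick numerical check (plain NumPy; wedge_3d is my name for it) showing that the three bivector components of u^v are the same numbers as the cross product components; only the interpretation differs:

```python
import numpy as np

def wedge_3d(u, v):
    """Bivector components of u ^ v on the basis (j*k, k*i, i*j)."""
    return np.array([
        u[1] * v[2] - u[2] * v[1],   # j^k component
        u[2] * v[0] - u[0] * v[2],   # k^i component
        u[0] * v[1] - u[1] * v[0],   # i^j component
    ])

u, v = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(wedge_3d(u, v))   # [-3.  6. -3.], same numbers as...
print(np.cross(u, v))   # ...the cross product, read as a plane element rather than an arrow
```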

lupire
0 replies
2h35m

The exterior product is the "correct" version of the cross product. I'm OK with the good guys seizing the name.

The cross product works in dimensions 2^k - 1 for k in 0, 1, 2, 3 (coming from the reals, complex numbers, quaternions, and octonions). It is trivial in 0D and 1D, defined uniquely in 3D, and unique up to some arbitrary negations in 7D.

https://en.m.wikipedia.org/wiki/Seven-dimensional_cross_prod...

cassepipe
0 replies
11h8m

Thanks for sharing that intuition

nox101
11 replies
23h28m

GA seems great! But ...

and modern formats like Khronos' glTF use quaternions for all their rotation needs. Fantastic for animations, and generally considered worth the cost of the unavoidable conversions to and from matrices.

Quaternions are bad for animation. Animate a clock going from 9am on Monday to 6pm on Friday. With Euler angles this might be expressed as going from 0 degrees to 1620 degrees. With quaternions, nope. This can't be expressed in glTF. It can be in Unreal and Unity, both of which default to using Euler angles for animation. In glTF you're required to bake it into smaller turns, all less than 180 degrees.

hamish_todd
4 replies
21h30m

Unity uses quats in its object transforms?

nox101
3 replies
20h21m

But not in its animation data, at least not by default.

You key Euler angles, it lerps Euler angles for the animation values, and it then generates a quaternion from the current value of the Euler angles. Same for Unreal, AFAIK.

hamish_todd
2 replies
20h16m

Where in the docs does it say that? Euler angle lerps, in the general case (e.g. none of the starts or ends are 0), look like complete shit.

You need the log-quat representation: exp(t * log(q1/q0)) * q0
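For anyone who wants that formula spelled out, here is a rough, self-contained NumPy sketch (helper names are mine; quaternions are [w, x, y, z] and assumed unit length; no shortest-path sign handling). It is one standard way to write slerp:

```python
import numpy as np

def q_mul(a, b):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    """Conjugate; for unit quaternions this is the inverse."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def q_log(q):
    """Log of a unit quaternion: a pure quaternion (0, theta * axis)."""
    w, v = q[0], q[1:]
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros(4)
    theta = np.arctan2(n, w)
    return np.concatenate(([0.0], theta * v / n))

def q_exp(p):
    """Exp of a pure quaternion (0, v): back to a unit quaternion."""
    v = p[1:]
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    return np.concatenate(([np.cos(n)], np.sin(n) * v / n))

def interpolate(q0, q1, t):
    """exp(t * log(q1 * q0^-1)) * q0, i.e. slerp between unit quaternions."""
    return q_mul(q_exp(t * q_log(q_mul(q1, q_conj(q0)))), q0)

# identity to a 90-degree turn about z, sampled halfway -> 45 degrees about z
q0 = np.array([1.0, 0.0, 0.0, 0.0])
q1 = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])
print(np.round(interpolate(q0, q1, 0.5), 4))
```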

hamish_todd
0 replies
9h52m

From that page: "When Euler Angles interpolation is used, Unity internally bakes the curves into the Quaternion representation used internally. This is similar to what happens when importing animation into Unity from external programs"

enkimute
2 replies
23h15m

For specifying animations, you should work in the quaternion Lie algebra, not in the group as you suggest. There you can represent 1620 degrees without any problem. Furthermore, in the quaternion Lie algebra (pure imaginary quaternions), and only in that space, you can take an arbitrary rotation key, multiply all 3 of its values by 10 and get 10 times that rotation without changing the axis.

If you rotate around just one axis, the Lie algebra feels just like Euler angles; in fact it's exactly the same thing. But if you rotate around more than one, it keeps working intuitively and usably, which Euler angles absolutely do not.
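A minimal sketch of the idea, using SciPy's rotation-vector representation as a stand-in for the Lie algebra (the clock numbers are just illustrative): the multi-turn angle lives happily in the rotation vector, and scaling that vector scales the rotation about the same axis, which is not true of a quaternion's components.

```python
import numpy as np
from scipy.spatial.transform import Rotation

axis = np.array([0.0, 0.0, 1.0])               # clock hand spins about z
full_turn = np.radians(1620.0) * axis          # 4.5 turns, stored as a rotation vector

# Interpolating the rotation vector keeps the winding; the endpoint quaternion
# alone cannot distinguish 1620 degrees from 180 degrees.
for t in np.linspace(0.0, 1.0, 5):
    q = Rotation.from_rotvec(t * full_turn).as_quat()   # SciPy returns [x, y, z, w]
    angle = np.degrees(t * np.linalg.norm(full_turn))
    print(f"t={t:.2f}  angle={angle:7.1f} deg  quat={np.round(q, 3)}")
```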

hn8305823
1 replies
23h10m

Also, the use case for quaternions depends on how many times you will be applying the same rotation. If it's a few or dozens of times, then maybe not the most efficient. If it's millions or billions, then you are going to want to use quaternions.

This is mainly due to the cost of converting to and from the rotation vector.

xeonmc
0 replies
21h23m

i.e. keep it in vector form if you're combining them a lot, convert to polar form when you want to work with angles

bluescrn
1 replies
17h57m

Most animations have more than 2 keyframes per week

nox101
0 replies
10h27m

There are lots of examples of a spinning clock showing time passing quickly. Lots of other things spin as well.

There's a reason Unity and Unreal (and Maya, and Blender, and 3DSMax) all use Euler angles as their default animation representation.

hgomersall
0 replies
10h31m

In exponentiated form, a GA rotor can be specified to spin as many times as you like, rotating continuously in the plane by the angle you specify. Think of rotations in the complex plane.

contravariant
9 replies
23h16m

To be honest I've never really liked how GA results in all kinds of mixed elements if you're not careful what you multiply with what. Requiring up to 2^n terms for what was an n-dimensional space seems a bit hard to deal with.

It seems like it should be better able to deal with geometry (i.e. inner products), but I've never really found a good argument why you wouldn't just use the wedge product and the Hodge star (or musical isomorphisms).

Even something 'magic' like turning a bivector "u^v" into a rotation in that plane "e^(u^v)t" is essentially just using the musical isomorphism to turn the 2-form u^v into a linear automorphism, allowing you to make sense of "e^(u^v)t" as a matrix exponential.
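As a quick illustration of that last point (orthonormal u and v assumed; this is plain linear algebra, not a GA library), the generator for the u-v plane can be exponentiated directly:

```python
import numpy as np
from scipy.linalg import expm

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

# u^v lowered to a linear map: a skew-symmetric generator that sends
# u -> v, v -> -u, and kills everything orthogonal to the plane.
B = np.outer(v, u) - np.outer(u, v)

theta = np.pi / 2
R = expm(theta * B)          # matrix exponential = rotation by theta in the u-v plane
print(np.round(R @ u, 6))    # -> v, i.e. [0. 1. 0.]
```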

Another example that often gets mentioned is the ability to turn Maxwell's equations into a single equation, but since the use of differential forms already makes it possible to summarize them as two equations which hold for very different reasons, I never understood the utility of combining them into one equation.

rhelz
6 replies
22h51m

// Requiring up to 2^n terms for what was an n-dimensional space//

Sometimes the economy is illusory, e.g. normal vectors transform differently than position vectors do. Sure, you can, if you want, use the same data structure to represent both of them, but you'll still have to have some way of keeping track of what kind of vector it is holding, as well as sprinkling special cases throughout your code to handle each one differently.

GA takes the bull by the horns by using one basis (i, j, k) for ordinary vectors and another basis (j*k, k*i, i*j) for the normal-like quantities.
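For anyone who hasn't been bitten by this, here is a tiny NumPy illustration of the classical version of the problem (not GA, just the usual inverse-transpose rule for normals): under a non-uniform scale, treating a normal as an ordinary vector breaks its perpendicularity to the surface.

```python
import numpy as np

M = np.diag([2.0, 1.0, 1.0])           # non-uniform scale
tangent = np.array([1.0, 1.0, 0.0])    # direction lying in a surface
normal  = np.array([1.0, -1.0, 0.0])   # perpendicular to that direction

t2 = M @ tangent
print(np.dot(M @ normal, t2))                    # 3.0: transformed "like a vector", no longer perpendicular
print(np.dot(np.linalg.inv(M).T @ normal, t2))   # 0.0: transformed like a covector/normal, still perpendicular
```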

// never understood the utility of combining them into one equation //

This is a good example of how having a higher-dimensional space actually gives you better economy of storage than a lower dimensional space does: one equation is better than two, or four :-)

And electric fields are different from magnetic fields in quite the same way as vectors are different from bivectors. You can either "special case" them by using a different equation for Electric and Magnetic fields, or you can treat them uniformly with one.

chombier
3 replies
22h9m

A somewhat simpler way of keeping track in the case of normals is to use row vectors for, well, covectors, which is what normals are anyways.

What GA brings is the ability to express linear combinations of scalars, vectors, bi-vectors ... Whether this is actually useful/desirable in practice is another story though.

rhelz
1 replies
21h41m

Yeah, but the original commenter's objection was that it seems weird to, e.g., use a 6-dimensional space to represent 3-dimensional quantities.

Doing it with vectors and covectors still requires you to keep track of 6 degrees of freedom, i.e., 6 dimensions. Eventually everybody has to pay the piper :-)

chombier
0 replies
29m

Yes, you need to keep track of which is which (most likely using the type system) but you don't risk adding vectors to covectors without converting explicitly. Each of vectors/covectors is 3 dimensions, but there is no 6-dimensional space in which vectors/covectors are allowed to mix.

IIUC this is unlike GA/exterior algebra where scalars/vectors/bi-vectors/... can be added together, just like one can add a scalar to a pure imaginary quaternion in the quaternion algebra.

jacobolus
0 replies
20h52m

The #1 thing that GA brings is the ability to divide by vectors, which makes working many things out on paper dramatically simpler.

contravariant
1 replies
19h25m

And electric fields are different from magnetic fields in quite the same way as vectors are different from bivectors. You can either "special case" them by using a different equation for Electric and Magnetic fields, or you can treat them uniformly with one.

What irks me is that the magnetic part of the Maxwell equations is 0 for geometrical reasons, whereas the electrical part is 0 for physical reasons (roughly speaking the curvature of the potential is proportional to the current). Putting them in one equation makes it seem as if you could have something other than 0 on the magnetic side, which is impossible without fundamentally changing the topology of spacetime.

Treating them uniformly is a mistake in my opinion.

adrian_b
0 replies
9h23m

You think this way because you do not pair the Maxwell equations in the right way. The two divergence equations do not belong together, and the same goes for the two curl equations. Pairing the equations in the wrong way can actually lead to errors in the relativistic case.

Two of the equations are intrinsic properties of the electromagnetic field because it is derived from a potential: the null condition for the divergence of the magnetic field, and Faraday's law of induction.

The other two equations are what you call "physical", i.e. they show the relationship between the electromagnetic field and its sources, i.e. electric current and electric charge.

Alternatively, if you use potentials to describe the electromagnetic field, which is better in my opinion, you just have the relations between potentials and their sources (together with the conservation law for the electric charge). With potentials, you can get rid of the "geometrical" relations (though the choice of potentials is not unique).

hamish_todd
1 replies
21h42m

The mixed elements are the important ones!

A quaternion with w=1, x,y,z=0 is just the identity.

A quaternion with w=0, x=1, or perhaps w=0, x=y=0.7, would only ever be a rotation by 180 degrees.

If you want arbitrary rotations, you need some combination of the two: "a little bit of 180 around this line, and a little bit of 0deg rotation/identity". That's what it means to have scalar and bivector.

If you "being careful" with wedge and inner to avoid mixtures you are doing it wrong. Geometric product is the boss, and makes excellent mixtures!

HelloNurse
0 replies
8h29m

These "mixed" elements are natural types and special cases of different objects and operations that are usually confused (e.g. "vectors" of three numbers to represent points and actual 3D vectors) or contorted (e.g. quaternions to represent rotations) or idiosyncratic (e.g. 3D cross product) in more traditional approaches.

blt
9 replies
1d1h

if the author is reading this: please define the acronym PGA when first using it!

at_compile_time
4 replies
1d

Projective geometric algebra for anyone wondering. A null basis vector is added to the basis vectors of the space you're working in. This allows the algebra to represent geometric objects that do not pass through the origin.

lupire
1 replies
1d

This is called an "affine transformation" in linear algebra language. (Linear algebra is stretches and rotations; affine adds translations.)

https://people.computing.clemson.edu/~dhouse/courses/401/not...

In 2 dimensions:

Rotation = multiplying by an imaginary unit.

Stretches = multiplying by a real number.

Translation = adding a complex number.

In higher dimensions, the analogy to complex numbers breaks down.

xeonmc
0 replies
21h33m

It is not affine transforms per se but rather the expansion into homogeneous coordinates that enables translation by treating it as if it's a shear that leaves the reciprocal dimension untouched.

Rotation = multiplying by an imaginary unit.

This is also not quite right.

Rotation is multiplying by a complex number with a magnitude of 1 (or perhaps you meant to say "raising a number to the power of i"?)
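To illustrate the homogeneous-coordinate point with a minimal NumPy sketch (names and numbers are mine): the 4x4 "translation" is really a shear that leaves the extra w coordinate untouched, which is why it moves points (w = 1) but not directions (w = 0).

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation: an identity with a sheared last column."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

p = np.array([1.0, 2.0, 3.0, 1.0])   # a point: w = 1
d = np.array([1.0, 2.0, 3.0, 0.0])   # a direction: w = 0, immune to translation

T = translation(10.0, 0.0, 0.0)
print(T @ p)   # [11.  2.  3.  1.]
print(T @ d)   # [ 1.  2.  3.  0.]
```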

epistasis
0 replies
1d

I've been working through a bunch of Geometric Algebra on the web and YouTube lectures in recent weeks. Though I guessed Projective Geometric Algebra, I still wasn't certain as it's the first time I can recall seeing the acronym!

Joker_vD
0 replies
6h38m

But... don't the traditional 4x4 transformation matrices already use projective space, essentially?

guhcampos
2 replies
22h5m

Yes! The use of FPGA for "Fast PGA" was particularly confusing.

enkimute
1 replies
20h56m

Apologies. Had that joke sitting around for waaaay too long. Not that great in retrospect :D

girvo
0 replies
14h15m

I got a giggle out of it!

enkimute
0 replies
23h32m

done. mea maxima culpa.

zoogeny
6 replies
1d

One of my favorite math/graphics YouTube creators, Freya Holmér, did an excellent intro to Geometric Algebra not that long ago [1]. If you have any interest in 3D graphics (especially but not limited to splines/Bezier curves) then be sure to check out all of their videos.

I personally have always struggled with linear algebra and I tend to find these Clifford Algebra approaches much more intuitive.

[1] https://www.youtube.com/watch?v=htYh-Tq7ZBI&ab_channel=Freya...

karmakaze
2 replies
22h47m

This is who I thought this was going to be. I enjoyed this along with the Splines & Beziers ones. Such great presentation, never feels rushed but gets to the point.

plagiarist
0 replies
19h21m

I knew the name looked familiar. Splines was a fantastic video.

simpaticoder
0 replies
22h4m

What a wonderful talk, thanks. It reminded me of https://enkimute.github.io/ganja.js/ which is actually a library by enkimute, the OP! (It's quite a remarkable library too, being a single-file, no-build script that supports N-dimensional algebras along with render support.)

Quekid5
0 replies
22h5m

I'd like to point out that the YT comments have some good (weird for YT, I know!) clarifications and questions about bits where she did go a bit fast or skip over things, e.g. the non-commutativity of products (in general) and such.

Great video.

3abiton
0 replies
18h52m

I never thought maths was that cool.

corysama
4 replies
1d1h

It's fun that there have been many approaches to interpolating rotations (geometric algebra, quaternions, even full-matrix interpolation [1]). But, after hand-optimizing the code, the final code ends up mostly the same for all approaches. The difference is in your understanding of the rules and capabilities.

From what little I know, GA seems like the most consistent and capable approach. It's unfamiliar. It's a bit much to take in getting started. But, people who clear that hurdle love it.

Alternatively, everybody uses quaternions while complaining they don't understand them and need a whole book to visualize them. (Visualizing Quaternions by Andrew J. Hanson, Steve Cunningham)

[1] https://www.gamedev.net/tutorials/programming/math-and-physi...

epistasis
1 replies
1d

I'm not a mathematician, and don't have a ton of use for geometry in my work, but was learning GA for fun, and have similarly tried to learn quaternions in the past. GA is fun, quaternions are not fun. I think I understand GA, but I knew I did not understand quaternions after working through lectures and problems. Now that I know some GA, I kind of feel like I know quaternions, finally.

segfaultbuserr
0 replies
12h30m

I think I understand GA, but I knew I did not understand quaternions after working through lectures and problems.

Most physicists stopped using them at the end of the 19th century for the same reason...

More than a third part of a century ago, in the library of an ancient town, a youth might have been seen tasting the sweets of knowledge to see how he liked them. He was of somewhat unprepossessing appearance, carrying on his brow the heavy scowl that the "mostly-fools" consider to mark a scoundrel. In his father's house were not many books, so it was like a journey into strange lands to go book-tasting. Some books were poison; theology and metaphysics in particular they were shut up with a bang. But scientific works were better; there was some sense in seeking the laws of God by observation and experiment, and by reasoning founded thereon. Some very big books bearing stupendous names, such as Newton, Laplace, and so on, attracted his attention. On examination, he concluded that he could understand them if he tried, though the limited capacity of his head made their study undesirable.

But what was Quaternions? An extraordinary name! Three books; two very big volumes called Elements, and a smaller fat one called Lectures. What could quaternions be? He took those books home and tried to find out. He succeeded after some trouble, but found some of the properties of vectors professedly proved were wholly incomprehensible. How could the square of a vector be negative? And Hamilton was so positive about it. After the deepest research, the youth gave it up, and returned the books. He then died, and was never seen again. He had begun the study of Quaternions too soon.

- Oliver Heaviside, Electromagnetic Theory

spenczar5
3 replies
23h38m

Are these algorithms efficient even given GPUs? I have the vague impression that GPUs are well-tuned for matrix work. Are those advantages lost when using Geometric Algebra formulations, so you actually don't come out ahead?

This is uninformed speculation, go ahead and correct me!

rhelz
0 replies
21h54m

When you are programming, you have to figure out:

1. What quantity you want to calculate, and
2. What the most efficient way to calculate it is.

PGA (once you spend the--alas--not insubstantial overhead to understand it!) is a really good way of doing #1. It's virtually always a good idea to first try out the simplest and easiest-to-code-up implementation anyway.

And what you get from using PGA to do #1 will certainly be good enough for you to prototype out the rest of your program enough to be able to benchmark it and find out where the real bottlenecks are. Happily, in most cases it will also either be the fastest way to calculate it, or close enough to not be the bottleneck.

And if it is a bottleneck, it gives you a deep understanding of the problem you are trying to solve--which, IMHO, is a good idea to have before you just start trying to shave off cycles in hopes of getting it fast enough.

hamish_todd
0 replies
21h40m

It's an extremely common misconception that because GPUs have matrix-matrix and matrix-vector products in the standard, GPU companies must be accelerating them.

In fact, because it is already SIMD across the shader cores, you can't necessarily do this. Some GPUs do, some don't.

buildartefact
0 replies
22h49m

This is exactly what the article is about. TL;DR: they can be roughly equivalent.

e4m2
0 replies
22h13m

The author of that talk, Eric Lengyel, also wrote the book "Foundations of Game Engine Development, Volume 1: Mathematics". Its 4th chapter focuses on the same topics.

lawrenceyan
2 replies
22h32m

This gives me PTSD

DrDroop
1 replies
21h53m

What! Why?

lawrenceyan
0 replies
6h7m

Working on point transformations. Not that bad, but still a bit of a pain.

ebolyen
2 replies
1d1h

The interpolation of animations at the bottom is really neat, but I can't help but wish the models were a little less _active_ on the rest of the page. Math is plenty hard without a small elephant cheerleader.

orangesite
1 replies
1d

Au contraire my friend, if it were not for the elephantine encouragement I would not have made it to the end of the page! <3

xeonmc
0 replies
18h36m

The come-hither looks were just a little bit distracting, though.

pasabagi
1 replies
22h39m

What a great article! Not an area of special interest of mine, but the piece was a joy to read.

enkimute
0 replies
20h17m

Thank you, appreciate that!

spintin
0 replies
5h36m

This is hair splitting at the end of progress:

The fact that 3D skeletal animation is still using 4x4 matrices in the GPU means the math developed for this around Half-Life 1 (on CPU?) is still the bleeding edge. 1998 -> 2024 = 26 years!

In 1000 years 3D animation will still be the same. End of story.

nimish
0 replies
18h23m

Someone ought to do a full Lie representation theory explanation of graphics operations.

andai
0 replies
1h10m

This article goes over my head, but the title reminded me of my experiments writing simple 3D renderers. After several failed attempts to learn linear algebra, I had the shower thought that a 3D rotation is just three 2D ones, and that I already know how to do those. Within an hour or so I had a wireframe 3D renderer, perspective and all!

I encourage everyone to try it.
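For anyone who wants to try it, here is a minimal sketch of that idea in Python (not the commenter's actual code, just the three-planar-rotations trick plus a naive perspective divide):

```python
import math

def rot2d(a, b, angle):
    """One 2D rotation applied to a pair of coordinates."""
    c, s = math.cos(angle), math.sin(angle)
    return a * c - b * s, a * s + b * c

def rotate3d(x, y, z, ax, ay, az):
    """A 3D rotation built from three 2D rotations, one per coordinate plane."""
    y, z = rot2d(y, z, ax)   # rotate in the y-z plane (about x)
    z, x = rot2d(z, x, ay)   # rotate in the z-x plane (about y)
    x, y = rot2d(x, y, az)   # rotate in the x-y plane (about z)
    return x, y, z

def project(x, y, z, d=3.0):
    """Naive perspective: divide by distance from the camera."""
    return x / (z + d), y / (z + d)

x, y, z = rotate3d(1.0, 1.0, 1.0, 0.3, 0.5, 0.1)
print(project(x, y, z))   # 2D screen coordinates of the rotated point
```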

__xor_eax_eax
0 replies
1h37m

Was that first paragraph even English? Man, that's thick.