It bothers me that the data science/computer science space thinks that tensors and “three dimensional matrices” are the same thing, because that’s just straight up wrong.
A tensor is just a generalized multilinear mapping. “3D matrices” can be tensors, but tensors can also be matrices, dot products, etc.
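To make that concrete: the ordinary dot product is a tensor because it's a bilinear map, linear in each argument separately, and a matrix is a tensor because it's a linear map between vector spaces. A quick numerical sketch (NumPy assumed; not part of the original thread):

```python
import numpy as np

# What makes something a tensor is multilinearity, not "being a 3D array."
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))
a, b = 2.0, -3.0

# The dot product is linear in its first argument:
# (au + bv) . w == a(u . w) + b(v . w)
lhs = np.dot(a * u + b * v, w)
rhs = a * np.dot(u, w) + b * np.dot(v, w)
assert np.isclose(lhs, rhs)

# A matrix is also a tensor: a linear map from vectors to vectors.
M = rng.standard_normal((4, 4))
assert np.allclose(M @ (a * u + b * v), a * (M @ u) + b * (M @ v))
```

The same check works in the second argument of the dot product, which is the "multi" in multilinear.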
*stapling a page labeled ‘TENSOR’ to the page labeled ‘MONAD’ on the list of words computer people stole from math to sound smart without really understanding the concept*
One thing about math is that it takes all those lessons from elementary school about how you can’t define a word by using that word, and punches that rule in the nose.

A vector is an element of a vector space.

https://twitter.com/passcod/status/1293790503423447041
You might think, “no Emily, a vector is a tuple of numbers or objects” and I’m here to tell you you’re wrong.
A vector space V is a set defined over a field F, equipped with two operations, scalar multiplication and vector addition, under which it is closed.

That is, for a in F and v in V, av is also in V.

For v, w in V, v + w is also in V.

This is not all, though.
A vector space must have a zero vector, and the scalars come from F. Every vector must also have an additive inverse, and scalar multiplication must respect the identity of F:

-v in V such that v + (-v) = 0, and 1v = v
And finally it must have scalar distributivity, a(v + w) = av + aw, and associativity of addition, (u + v) + w = u + (v + w), along with a couple of similar compatibility axioms.

The fact that we have vectors that are tuples of numbers is mostly a consequence of the reals forming a field.
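And vectors genuinely don't have to be tuples of numbers. Real-valued functions on R form a vector space too: addition and scalar multiplication are defined pointwise, and the axioms above all check out. A toy sketch (my own illustration, not from the thread):

```python
# Functions as vectors: the "vectors" here are Python functions, with
# vector addition and scalar multiplication defined pointwise.
def add(f, g):
    return lambda x: f(x) + g(x)

def smul(a, f):
    return lambda x: a * f(x)

f = lambda x: x ** 2
g = lambda x: 3 * x

h = add(smul(2.0, f), g)        # the "vector" 2f + g
assert h(4.0) == 2.0 * 16.0 + 12.0   # (2f + g)(4) = 2*16 + 12 = 44
```

Here the zero vector is the function that returns 0 everywhere, and -f is defined pointwise as well.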
You can follow @EmilyGorcenski.