I strongly suggest that anyone getting into linear algebra have a project to work on. It makes everything so much easier when you get to play with the stuff.

My hint for something to play with: basic linear algebra applies very directly to graphics, rotation matrices and so on. If you know how to multiply a matrix with a vector, you basically know what you need to render basic line-art 3D graphics. You may want to look into dot and cross products as well as vector projection, but all of this is fairly basic.
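To make that concrete, here's a minimal sketch of the matrix-vector multiply behind line-art rotation. The function name and the choice of rotating about the z-axis are just illustrative:

```python
import math

def rotate_z(point, theta):
    """Rotate a 3D point about the z-axis by theta radians
    via a plain matrix-vector multiply."""
    c, s = math.cos(theta), math.sin(theta)
    # Standard 3x3 rotation matrix about z
    R = [[c, -s, 0.0],
         [s,  c, 0.0],
         [0.0, 0.0, 1.0]]
    # result[i] = dot product of row i with the point
    return tuple(sum(R[i][j] * point[j] for j in range(3)) for i in range(3))

# Rotating (1, 0, 0) by 90 degrees lands on (0, 1, 0), up to rounding.
print(rotate_z((1.0, 0.0, 0.0), math.pi / 2))
```

Apply that to each endpoint of your wireframe's line segments and you have the core of a spinning 3D model.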

Deep learning! It's all "just" (more or less) high school calculus (partial derivatives, chain rule) and matrix multiplication.
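For a feel of how those two ingredients combine, here's a tiny gradient-descent step done with the chain rule by hand. The numbers (input, target, learning rate) are made up for illustration; the "matrix multiply" is just 1x1 here:

```python
# Learn a single weight w so that w * x ~= y, using the chain rule by hand.
x, y = 2.0, 10.0   # input and target (arbitrary example values)
w = 1.0            # weight to learn
lr = 0.1           # learning rate

for _ in range(50):
    pred = w * x                   # the "matrix multiply", 1x1 case
    loss = (pred - y) ** 2         # squared error
    dloss_dpred = 2 * (pred - y)   # outer derivative
    dpred_dw = x                   # inner derivative
    grad = dloss_dpred * dpred_dw  # chain rule: dloss/dw
    w -= lr * grad                 # gradient descent step

print(w)  # converges toward y / x = 5.0
```

Backpropagation is this same pattern, just with matrices in place of scalars and the chain rule applied layer by layer.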

I feel like I saw one once but lost it -

Is there a GitHub repo/tutorial for how linear algebra is used for a very small model, just to demonstrate how that allows it to "learn"?

I've got the calc, I just don't understand what the matrix multiplication "does"

Watch Karpathy's recent lectures. They're gold. Start here[1] with micrograd[2]. It doesn't use linear algebra/matrices to start, but the principles are the same. The matrix multiplication is how the weights of the connections between neurons and the input values are combined (to form an activation value that then may lead to that neuron "firing" or not, depending on whether it passes some threshold function). We use matrices to model the connections between neurons - each row holds the weights for one neuron, and each column corresponds to one input.
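That row/column picture can be sketched in a few lines. This is a minimal illustration (not micrograd's API) using a hard step threshold for the "firing", where real networks use smooth activations:

```python
# One dense layer: each row of W is one neuron's weights,
# each column lines up with one input.
def layer(W, b, x):
    out = []
    for row, bias in zip(W, b):
        # activation = dot(weights, inputs) + bias
        a = sum(w * v for w, v in zip(row, x)) + bias
        out.append(1.0 if a > 0 else 0.0)  # step "threshold function"
    return out

W = [[0.5, -1.0],   # neuron 0's weights, one per input
     [2.0,  0.3]]   # neuron 1's weights
b = [0.0, -1.0]     # one bias per neuron
print(layer(W, b, [1.0, 1.0]))  # neuron 0 stays off, neuron 1 fires
```

The whole loop is exactly a matrix-vector product `W @ x + b` followed by an elementwise nonlinearity, which is why the matrix form is so convenient.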

[1] https://www.youtube.com/watch?v=VMj-3S1tku0 [2] https://github.com/karpathy/micrograd