1. Linear Algebra
Dot Product
The dot product compresses two vectors into a single number: how much they point the same way (their alignment). Read more.
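A minimal sketch of that alignment signal, using plain Python lists as vectors (the example vectors are illustrative, not from the text):

```python
def dot(a, b):
    """Sum of elementwise products: one number measuring alignment."""
    return sum(x * y for x, y in zip(a, b))

east = [1.0, 0.0]
north = [0.0, 1.0]
west = [-1.0, 0.0]

print(dot(east, east))   # 1.0 — same direction: positive
print(dot(east, north))  # 0.0 — perpendicular: no alignment
print(dot(east, west))   # -1.0 — opposite direction: negative
```

The sign alone already tells you whether two directions agree, disagree, or are unrelated.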
Vectors
Vectors are the simplest building blocks of LLM geometry. They are how LLMs package meaning: embeddings, activations, and even weights are just numbers in a direction. Read more.
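A toy sketch of "numbers in a direction": the vector below stands in for a word embedding, with made-up values and a made-up dimension of 4.

```python
# A toy "embedding": meaning packaged as a list of numbers.
king = [0.8, 0.3, 0.9, 0.1]

def scale(v, c):
    """Scaling stretches a vector but keeps its direction."""
    return [c * x for x in v]

def add(a, b):
    """Adding vectors combines their directions component by component."""
    return [x + y for x, y in zip(a, b)]

print(scale(king, 2.0))  # twice as long, same direction
print(add(king, king))   # same result: v + v == 2v
```

Real embeddings work the same way, just with hundreds or thousands of dimensions.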
2. Artificial Neuron, Neural Networks
Layers
Layers are “combinations of combinations”: stacking simple units so each stage remixes the last, letting the network build richer and more abstract features step by step. Read more.
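“Combinations of combinations” can be sketched directly: each layer below forms weighted combinations of its inputs, and stacking two of them makes the second layer combine the first layer's combinations. The weights are hand-picked for illustration; a real network learns them.

```python
def layer(inputs, weights):
    """Each output unit is a weighted combination of all inputs."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

x = [1.0, 2.0]
h = layer(x, [[0.5, -0.5], [1.0, 1.0]])  # stage 1: simple features of x
y = layer(h, [[1.0, 0.5]])               # stage 2: combinations of those features
print(y)  # [1.0]
```

Adding a nonlinearity between the stages (omitted here for brevity) is what lets deeper stacks express genuinely new features rather than just one big linear remix.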
Perceptrons
A perceptron is the simplest learnable decision-maker: it combines several numbers into one score and uses that to separate one kind of input from another with a straight line. Read more about what a perceptron is and how it compares to biological neurons.
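A minimal sketch of that score-then-decide step. The weights and bias below are chosen by hand to implement logical AND; in practice a perceptron learns them from examples.

```python
def perceptron(inputs, weights, bias):
    """Weighted sum plus bias, thresholded into a yes/no decision."""
    score = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if score > 0 else 0

w, b = [1.0, 1.0], -1.5  # hand-picked: fires only when both inputs are 1
for a, c in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, c, "->", perceptron([a, c], w, b))  # only (1, 1) fires
```

The line `x1 + x2 = 1.5` is the straight-line boundary: inputs on one side score above zero, inputs on the other side below.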
3. Training
Loss
Gradient descent
Backpropagation
Model training techniques
4. Architectures
RNN
CNN
Transformer
5. Transformer
Embedding
Encoder, Decoder
Attention
6. Evolving Higher Concepts