Conceptually, for two vectors x and y, x.y is defined as the magnitude of x multiplied by the projection of y onto x (think of it as the shadow cast by y onto x). If x and y are at right angles (orthogonal), x.y will be zero, regardless of the length of either of them. Complex...
Vectors
They have two properties: 1. Magnitude (length), 2. Direction. From a computer science perspective, a vector is just an ordered list of numbers. Vectors can be added and multiplied (= scaled). A key operation on vectors is the dot product. Conceptually, the dot product a.b is defined...
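A minimal sketch of the dot product over vectors-as-ordered-lists, in plain Python (an illustration, not code from the note):

```python
def dot(a, b):
    """Dot product of two equal-length vectors (ordered lists of numbers)."""
    assert len(a) == len(b), "vectors must have the same length"
    return sum(x * y for x, y in zip(a, b))

# Orthogonal vectors have a dot product of zero, regardless of their lengths.
print(dot([3, 0], [0, 5]))        # 0
print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```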
Layers
Layers = groups of perceptrons (see [[Perceptron]]) stacked so that each layer’s outputs become the next layer’s inputs, letting the network learn increasingly abstract features. Stacked/layered perceptrons create a feed-forward network (i.e., going from an input to an...
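The stacking described above can be sketched as a toy feed-forward pass in Python (the layer sizes, weights, and sigmoid activation here are hypothetical choices for illustration):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def dense_layer(inputs, weights, biases):
    """One layer: each neuron dots its weight row with the inputs and adds a bias."""
    return [
        sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

def feed_forward(inputs, layers):
    """Each layer's outputs become the next layer's inputs."""
    activations = inputs
    for weights, biases in layers:
        activations = dense_layer(activations, weights, biases)
    return activations

# Hypothetical 2 -> 2 -> 1 network: one hidden layer, one output neuron.
layers = [
    ([[0.1, 0.2], [0.3, 0.4]], [0.0, 0.0]),  # hidden layer: 2 neurons
    ([[0.5, 0.6]], [0.0]),                   # output layer: 1 neuron
]
print(feed_forward([1.0, 0.5], layers))
```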
Perceptron
Perceptron = the simplest artificial neuron that takes multiple inputs, multiplies them by weights, adds a bias, and turns the result into a yes/no (or score) output. Inputs: a vector of features (x1, x2, ..., xn). Parameters: one weight per input (w1, w2, ..., wn). An...
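The definition above fits in a few lines of Python with a step activation; the example weights and bias are made up to show an AND-like gate:

```python
def perceptron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, squashed to a yes/no output."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if weighted_sum > 0 else 0  # step activation: yes (1) / no (0)

# Hypothetical 2-input example: fires only when both inputs are on.
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # 1  (0.5 + 0.5 - 0.7 > 0)
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # 0  (0.5 - 0.7 < 0)
```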
Redesigning Apprenticeship for the AI Era
I first heard Ethan Mollick on The Ezra Klein Show in April 2024 (“How Should I Be Using A.I. Right Now?”). He offered sensible, practical ways to use AI without the hype. Shortly after, I read Co-Intelligence and have followed his writing and talks since. In a recent...
Faster writing, different thinking with AI Voice Dictation
AI voice dictation is having a moment. These tools do more than transcribe—they read context, add punctuation, and learn your style. Many creators say they work two to three times faster. Two weeks ago I started using Wispr Flow Pro. Here is what I found. The Good...
Metaprompting
Dharmesh's post made me realize there’s a name for something I’ve been doing implicitly for a while—using AI to help me write better prompts. Strictly speaking, that’s AI-assisted prompt refinement. There’s a closely related idea called metaprompting—writing prompts...
Four Weekends Building Munshi: Notes on Product Thinking and AI Development
It happens to most of us multiple times a week: someone emails asking for a good time to meet, and before you know it, you're stuck in a back-and-forth scheduling spiral. It's a mundane friction that adds up. Four weekends ago, I decided to vibe-code my way to a...
Think about local minima in thousands of dimensions
When I first learned about Gradient Descent two years ago, I pictured it in the most obvious 3D way: two input variables (as the x and y axes of a 2D plane) with the loss as the third (z) axis. In terms of 'local minima', I imagined it as the...
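The two-variables-plus-loss picture corresponds to gradient descent on a function of two inputs; a minimal sketch, where the bowl-shaped loss surface L(x, y) = x² + y² is made up for illustration:

```python
def gradient_descent(grad, start, lr=0.1, steps=100):
    """Follow the negative gradient downhill from a starting point."""
    point = list(start)
    for _ in range(steps):
        g = grad(point)
        point = [p - lr * gi for p, gi in zip(point, g)]
    return point

# Toy loss L(x, y) = x^2 + y^2, whose gradient is (2x, 2y).
grad = lambda p: [2 * p[0], 2 * p[1]]
minimum = gradient_descent(grad, start=[3.0, -4.0])
print(minimum)  # converges toward the single minimum at (0, 0)
```

This toy surface has only one minimum; the note's point is that the 3D intuition breaks down once the loss lives in thousands of dimensions.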
How Diffusion Models Power AI Videos: An Incredible Visual Explanation
I first wrapped my head around diffusion models in 2023, thanks to MIT 6.S191 Lecture on 'Deep Learning New Frontiers'. The idea of reverse-denoising just clicked for me—it reminded me of how our brains pick out shapes and objects in clouds or random mosaics....