modules
Modules are the reusable pieces of math underneath applications. Sorted by how many applications consume each — the manifesto's modular-graph claim, made data-shaped.
10 modules · 10/10 consumed
- Vectors (module · 7 applications)
A point says where. A vector says how to move. The same tuple plays four roles — position, displacement, velocity, feature — across graphics, physics, and ML. Two operations (add and scale) carry every one of them.
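The two operations the blurb names can be sketched in a few lines; `add` and `scale` are illustrative names, not from any particular library:

```python
# Sketch: one tuple type, two operations, four roles.

def add(u, v):
    """Componentwise addition: a displacement applied to a position."""
    return tuple(a + b for a, b in zip(u, v))

def scale(c, v):
    """Scalar multiple: stretch a velocity, weight a feature."""
    return tuple(c * a for a in v)

position = (2.0, 3.0)
displacement = (1.0, -1.0)
print(add(position, displacement))   # the point after the move
print(scale(0.5, displacement))      # half the move
```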
- The Derivative (module · 6 applications)
A moving point leaves a trail. The derivative is not the trail — it is the arrow the trail wants to become at this instant. Secant slopes converge to tangent slopes, and the same machine becomes slope, velocity, and rate.
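A minimal sketch of that convergence, using f(x) = x² at x = 1 (where the tangent slope is 2):

```python
# Sketch: secant slopes converging to the tangent slope as h shrinks.

def secant_slope(f, x, h):
    """Slope of the chord from (x, f(x)) to (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

f = lambda x: x * x
for h in (1.0, 0.1, 0.001, 1e-6):
    print(h, secant_slope(f, 1.0, h))  # approaches 2 as h shrinks
```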
- The Logarithm (module · 5 applications)
The trick that turns × into +. The whole module is one equation; everything else is consequence.
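The one equation is log(ab) = log a + log b, checkable directly:

```python
import math

# The whole module in one line: a multiplication on the left,
# an addition on the right.
a, b = 8.0, 32.0
lhs = math.log(a * b)
rhs = math.log(a) + math.log(b)
print(lhs, rhs)  # equal up to floating-point error
```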
- Distributions (module · 5 applications)
A probability is one guess. A distribution is the whole shape of uncertainty. The thing softmax produces, the thing a histogram is, the thing a return is drawn from. Most quantities a model predicts or a portfolio holds are distributions before they are numbers.
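A minimal softmax shows the "whole shape" claim: arbitrary scores in, a full distribution out (positive entries summing to 1):

```python
import math

def softmax(scores):
    """Turn arbitrary scores into a distribution: positive, summing to 1."""
    m = max(scores)                        # subtract the max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs, sum(probs))   # every entry positive, total is 1
```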
- The Integral (module · 4 applications)
A speedometer reads a rate. How far have you traveled? The integral is what survives when you sum a rate over time. The pair to the derivative — and the fundamental theorem says they are inverses.
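Both halves of the claim fit in a sketch: sum the rate v(t) = 2t over small steps, then compare with what the fundamental theorem hands you via the antiderivative F(t) = t²:

```python
# Sketch: a Riemann sum of a rate vs. the antiderivative's shortcut.

def riemann(v, t0, t1, n):
    """Left Riemann sum: rate times small time step, accumulated."""
    dt = (t1 - t0) / n
    return sum(v(t0 + i * dt) * dt for i in range(n))

v = lambda t: 2 * t
approx = riemann(v, 0.0, 3.0, 100_000)
exact = 3.0 ** 2        # fundamental theorem: F(3) - F(0) with F(t) = t**2
print(approx, exact)    # the sum closes in on 9
```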
- Linearization (module · 4 applications)
Most equations are hard. Their tangent line at a point is easy. Replace one with the other and you get a tool that powers the pendulum clock, Newton's method, and gradient descent — valid in a regime, wrong outside it. The discipline is the regime.
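Newton's method is the replace-with-the-tangent move in a loop; a sketch, here finding √2 as the positive root of f(x) = x² − 2:

```python
# Sketch: Newton's method as repeated linearization.
# Each step solves the easy tangent line instead of the hard equation.

def newton(f, df, x, steps=10):
    for _ in range(steps):
        x = x - f(x) / df(x)   # root of the tangent line at x
    return x

root = newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
print(root)  # ~1.41421356...
```

The "regime" caveat lives in the starting point: begin too far from a root (or at a flat spot where df is 0) and the same loop diverges or crashes.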
- Parametric Curves (module · 3 applications)
A curve is not a picture. Three motions can paint the same parabola — same image, different parametrizations. Pin down the distinction the word 'curve' was hiding, and a whole stack of downstream tools snaps into place.
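Two of the motions can be written down directly; both trace y = x², on different schedules:

```python
# Sketch: same image, different parametrizations of the parabola y = x**2.

def slow(t):             # passes through x = t at time t
    return (t, t * t)

def fast(t):             # same point set, traversed at double speed
    return (2 * t, 4 * t * t)

# Different clocks, same trail: fast reaches (1, 1) at t = 0.5,
# slow reaches it at t = 1.0.
print(slow(1.0), fast(0.5))
```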
- Entropy (module · 3 applications)
20 questions to find any one of N items needs log₂ N when items are equally likely. Entropy is the generalization — the expected number of yes/no questions when they aren't. The bound everything from Wordle to Huffman to password strength bumps against.
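The generalization is one formula, −Σ p·log₂ p, and the equally-likely case recovers log₂ N exactly:

```python
import math

def entropy(probs):
    """Expected number of yes/no questions: -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25] * 4))         # 4 equally likely items: log2(4) = 2.0
print(entropy([0.5, 0.25, 0.25]))  # skewed toward one item: 1.5 on average
```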
- Optimization (module · 3 applications)
Optimization is not finding the formula. It is choosing a quantity to improve, then moving through possible choices until improvement stops. The same five steps — objective, search space, move, step size, stopping — run gradient descent in ML, weight selection in portfolios, and temperature fitting in calibration. Different algorithms, identical bones.
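The five bones can be labeled on a toy problem; a gradient-descent sketch minimizing f(x) = (x − 3)²:

```python
# Sketch: the five bones of optimization, each labeled where it appears.

objective = lambda x: (x - 3) ** 2   # 1. objective: the quantity to improve
x = 0.0                              # 2. search space: the real line, start here
grad = lambda x: 2 * (x - 3)         # 3. move: step downhill along the gradient
lr = 0.1                             # 4. step size
for _ in range(200):                 # 5. stopping: a fixed budget, for the sketch
    x -= lr * grad(x)
print(x)  # ~3.0, where improvement stops
```

Swap the objective and the move rule and the same skeleton runs portfolio weighting or temperature fitting; the bones do not change.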
- Bezout's Theorem (module · 2 applications)
Two curves of degrees d, e meet in exactly d·e points — once the plane is repaired three ways. The chord-and-tangent feeds elliptic-curve arithmetic, which feeds Bitcoin signatures.
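The chord-and-tangent rule can be sketched over a tiny prime field. The curve y² = x³ + 7 mod 17 and the point P = (1, 5) are toy choices for illustration, nothing like a real cryptographic curve, and the point-at-infinity cases are omitted:

```python
# Sketch: chord-and-tangent addition on y**2 = x**3 + a*x + b over F_p.
# Toy parameters; identity and inverse (point-at-infinity) cases omitted.

P_MOD, A = 17, 0     # curve y**2 = x**3 + 0*x + 7 over F_17

def ec_add(P, Q, p=P_MOD, a=A):
    (x1, y1), (x2, y2) = P, Q
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (s * s - x1 - x2) % p
    y3 = (s * (x1 - x3) - y1) % p    # third intersection point, reflected
    return (x3, y3)

P = (1, 5)              # on the curve: 5**2 % 17 == (1**3 + 7) % 17
print(ec_add(P, P))     # doubling P via the tangent line
```

The modular inverse uses Python 3.8+'s three-argument `pow` with exponent −1; on older Pythons you would substitute an extended-Euclid inverse.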