Linear Algebra for ML
Build interview-grade intuition for vectors/matrices, norms, projections, eigenvalues, and SVD — and connect them to ML (least squares, PCA, embeddings).
Students can compute linear algebra quantities in NumPy, reason about shapes, and explain *why* each concept matters in ML systems.
Interview Angles
- Projection is "closest point in a subspace" → least squares intuition.
FAANG Gotchas
- Most ML bugs are silent shape bugs — assert shapes.
- Avoid the explicit inverse; `solve` is more stable and faster.
Linear Algebra for ML — FAANG-Level Lab
Goal: Build shape intuition + compute core linear algebra objects used in ML.
Outcome: You can implement least squares, projections, eigen decomposition intuition, and SVD-based PCA.
Section 1 — Vectors, Dot Product, Norms
Task 1.1: Implement dot + L2 norm (no np.linalg.norm)
- dot(x, y) = sum(x_i * y_i)
- ||x||_2 = sqrt(dot(x, x))
Explain: What does dot product measure geometrically?
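A minimal NumPy sketch of Task 1.1 (function names `dot` and `l2_norm` are my own, not prescribed by the lab):

```python
import numpy as np

def dot(x, y):
    # dot(x, y) = sum(x_i * y_i); assumes x and y have the same shape.
    assert x.shape == y.shape, "vectors must have the same shape"
    return float(np.sum(x * y))

def l2_norm(x):
    # ||x||_2 = sqrt(dot(x, x)), without np.linalg.norm.
    return dot(x, x) ** 0.5

print(l2_norm(np.array([3.0, 4.0])))  # 5.0
```

Geometrically, dot(x, y) = ||x|| ||y|| cos(theta), so it measures both magnitude and alignment of the two vectors.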
Section 2 — Matrix Multiplication + Shapes
Task 2.1: Validate shapes and compute A@B
Given A (n,d) and B (d,k) compute C (n,k).
FAANG gotcha: Many bugs are shape bugs. Always assert shapes.
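One way to sketch Task 2.1 with the asserts the gotcha recommends (the wrapper name `matmul_checked` is my own):

```python
import numpy as np

def matmul_checked(A, B):
    # A is (n, d), B is (d, k); the result C must be (n, k).
    n, d = A.shape
    d2, k = B.shape
    assert d == d2, f"inner dimensions must match: {A.shape} @ {B.shape}"
    C = A @ B
    assert C.shape == (n, k)
    return C

C = matmul_checked(np.ones((2, 3)), np.ones((3, 4)))
print(C.shape)  # (2, 4)
```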
Section 3 — Projections (Least Squares Intuition)
Task 3.1: Project vector v onto vector u
proj_u(v) = (u^T v / u^T u) * u
- Use your dot()
Explain: Why does projection show up in linear regression?
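A sketch of Task 3.1 using the formula above (here with `np.dot` for brevity; the lab asks you to reuse your own `dot`):

```python
import numpy as np

def project(v, u):
    # proj_u(v) = (u^T v / u^T u) * u
    return (np.dot(u, v) / np.dot(u, u)) * u

v = np.array([2.0, 1.0])
u = np.array([1.0, 0.0])
p = project(v, u)
print(p)                       # [2. 0.]
print(np.dot(v - p, u))        # residual is orthogonal to u: 0.0
```

The residual check is the least-squares connection: regression picks the point in the column space of X closest to y, so the residual is orthogonal to every column.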
Section 4 — Least Squares (Closed Form)
Task 4.1: Solve min_w ||Xw - y||^2
Use normal equation: w = (X^T X)^{-1} X^T y
- Use `np.linalg.solve` (more stable than an explicit inverse)
FAANG gotcha: Don’t compute matrix inverse unless you must.
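A sketch of Task 4.1 on synthetic data (the data setup is my own): solve the normal equations X^T X w = X^T y directly with `np.linalg.solve`, never forming the inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

# Normal equations: solve (X^T X) w = X^T y without inverting X^T X.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # close to w_true
```

In practice `np.linalg.lstsq` is even safer when X^T X is ill-conditioned, but `solve` matches the closed-form derivation here.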
Section 5 — Eigenvalues & SVD Intuition
Task 5.1: PCA via SVD
Steps:
- Center X
- Compute the SVD: X = U S V^T
- Take the top-k components (rows of V^T)
- `U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)`
Explain: Why does SVD show up in embeddings and dimensionality reduction?
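The steps above can be sketched as one function (the name `pca_svd` and the random test data are my own):

```python
import numpy as np

def pca_svd(X, k):
    # Step 1: center each column so the SVD captures variance, not the mean.
    X_centered = X - X.mean(axis=0)
    # Step 2: thin SVD; rows of Vt are the principal directions.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Step 3: keep the top-k directions and project onto them.
    components = Vt[:k]                     # (k, d)
    projected = X_centered @ components.T   # (n, k)
    return components, projected

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
comps, Z = pca_svd(X, 2)
print(comps.shape, Z.shape)  # (2, 5) (50, 2)
```

This is exactly the low-rank idea behind embeddings: keep the k directions that explain the most variance and represent each row by its k coordinates.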