S30 AI Lab (www.thes30.com)

Lab #4: Linear Algebra for ML

Easy · 📐 Math for ML · W2 D1

Build interview-grade intuition for vectors/matrices, norms, projections, eigenvalues, and SVD — and connect them to ML (least squares, PCA, embeddings).

Students can compute linear algebra quantities in NumPy, reason about shapes, and explain *why* each concept matters in ML systems.


Tasks

1. Vectors, Dot Product, Norms
2. Matrix Multiplication + Shapes
3. Projections (Least Squares Intuition)
4. Least Squares (Closed Form)
5. Eigenvalues & SVD Intuition

Interview Angles

  • Projection is “closest point in a subspace” → least squares intuition.

FAANG Gotchas

  • Most ML bugs are silent shape bugs; assert shapes early.
  • Avoid the explicit inverse; np.linalg.solve is more stable and faster.

Asked At

Google · Square · GitHub
Python 3 — Notebook
Dataset & Setup

Linear Algebra for ML — FAANG-Level Lab

Goal: Build shape intuition + compute core linear algebra objects used in ML.

Outcome: You can implement least squares, projections, eigen decomposition intuition, and SVD-based PCA.
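There is no dataset to download; the whole lab runs on plain NumPy arrays. A minimal setup cell might look like this (the seed and print options are my own choices, not part of the lab):

```python
import numpy as np

# Reproducible randomness for the lab; the seed choice is arbitrary.
rng = np.random.default_rng(0)

# Friendlier float printing while exploring.
np.set_printoptions(precision=4, suppress=True)

print(np.__version__)
```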


Section 1 — Vectors, Dot Product, Norms

Task 1.1: Implement dot + L2 norm (no np.linalg.norm)

  • dot(x,y) = sum(x_i * y_i)
  • ||x||_2 = sqrt(dot(x,x))

Explain: What does dot product measure geometrically?

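A minimal sketch of Task 1.1 (the function names dot and l2_norm are my own). Geometrically, the dot product measures alignment: it equals ||x|| ||y|| cos(theta), so it is positive for vectors pointing the same way, zero for orthogonal vectors, and negative for opposing ones.

```python
import numpy as np

def dot(x, y):
    """dot(x, y) = sum_i x_i * y_i, computed without np.dot."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    assert x.shape == y.shape, f"shape mismatch: {x.shape} vs {y.shape}"
    return float(np.sum(x * y))

def l2_norm(x):
    """||x||_2 = sqrt(dot(x, x)), without np.linalg.norm."""
    return dot(x, x) ** 0.5

x = np.array([3.0, 4.0])
y = np.array([1.0, 0.0])
print(dot(x, y))    # 3.0
print(l2_norm(x))   # 5.0
```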

Section 2 — Matrix Multiplication + Shapes

Task 2.1: Validate shapes and compute A@B

Given A (n,d) and B (d,k) compute C (n,k).

FAANG gotcha: Many bugs are shape bugs. Always assert shapes.
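One possible implementation of Task 2.1, with the shape asserts the gotcha calls for (matmul_checked is my own name for illustration):

```python
import numpy as np

def matmul_checked(A, B):
    """Compute C = A @ B after asserting the inner dimensions agree.
    A: (n, d), B: (d, k) -> C: (n, k)."""
    assert A.ndim == 2 and B.ndim == 2, "expected 2-D arrays"
    assert A.shape[1] == B.shape[0], (
        f"inner dims differ: A is {A.shape}, B is {B.shape}")
    C = A @ B
    assert C.shape == (A.shape[0], B.shape[1])
    return C

A = np.arange(6, dtype=float).reshape(2, 3)   # (2, 3)
B = np.ones((3, 4))                           # (3, 4)
print(matmul_checked(A, B).shape)             # (2, 4)
```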


Section 3 — Projections (Least Squares Intuition)

Task 3.1: Project vector v onto vector u

proj_u(v) = (u^T v / u^T u) * u

  • Use your dot()

Explain: Why does projection show up in linear regression?

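A sketch of Task 3.1, reusing a local dot() as the prompt suggests (project is my own name). The residual v - proj_u(v) is orthogonal to u, which is exactly the geometry behind least squares: regression projects y onto the column space of X.

```python
import numpy as np

def dot(x, y):
    return float(np.sum(np.asarray(x, dtype=float) * np.asarray(y, dtype=float)))

def project(v, u):
    """proj_u(v) = (u.v / u.u) * u, the closest point to v on the line span{u}."""
    u = np.asarray(u, dtype=float)
    return (dot(u, v) / dot(u, u)) * u

v = np.array([2.0, 3.0])
u = np.array([1.0, 0.0])
print(project(v, u))               # [2. 0.]
print(dot(v - project(v, u), u))   # 0.0 -- residual is orthogonal to u
```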

Section 4 — Least Squares (Closed Form)

Task 4.1: Solve min_w ||Xw - y||^2

Use normal equation: w = (X^T X)^{-1} X^T y

  • Use np.linalg.solve (more stable than explicit inverse)

FAANG gotcha: Don’t compute matrix inverse unless you must.
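A sketch of Task 4.1 under the stated hint: solve the normal equations X^T X w = X^T y with np.linalg.solve rather than forming an inverse. The check uses noiseless synthetic data of my own construction, so the recovered weights match exactly up to floating-point error.

```python
import numpy as np

def least_squares(X, y):
    """Solve min_w ||Xw - y||^2 via the normal equations,
    using np.linalg.solve instead of an explicit inverse."""
    return np.linalg.solve(X.T @ X, X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # tall full-rank design matrix
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                       # noiseless targets
w = least_squares(X, y)
print(np.allclose(w, w_true))        # True
```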


Section 5 — Eigenvalues & SVD Intuition

Task 5.1: PCA via SVD

Steps:

  1. Center X
  2. Compute SVD: X = U S V^T
  3. Take top-k components from V
  • Hint: U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

Explain: Why does SVD show up in embeddings and dimensionality reduction?

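The steps above can be sketched as follows (pca_svd is my own name). SVD shows up in embeddings and dimensionality reduction because truncating to the top-k singular vectors gives the best rank-k approximation of the centered data matrix, so the k-dimensional coordinates preserve as much variance as possible.

```python
import numpy as np

def pca_svd(X, k):
    """PCA via SVD: center X, then take the top-k right singular
    vectors as principal components. Returns (components, projected)."""
    X_centered = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:k]             # (k, d) principal directions
    Z = X_centered @ components.T   # (n, k) coordinates in the new basis
    return components, Z

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
components, Z = pca_svd(X, k=2)
print(components.shape, Z.shape)    # (2, 4) (50, 2)
```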
