Math Lab (Linear Algebra + Probability)
Consolidate Week 2 with derivations + short computational checks.
Students can derive key results, verify with small code experiments, and communicate clearly under interview constraints.
Interview Angles
- Where does PSD show up in ML (covariance, kernels)?
- How does PCA relate to embeddings and compression?
FAANG Gotchas
- Avoid computing an explicit matrix inverse; use np.linalg.solve or np.linalg.lstsq instead.
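A quick numerical sketch of this point (the matrix and right-hand side here are arbitrary random examples): `solve` factors the matrix once and back-substitutes, which is both faster and more numerically stable than forming the inverse.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

x_inv = np.linalg.inv(A) @ b     # works, but forms an explicit inverse
x_solve = np.linalg.solve(A, b)  # preferred: one LU factorization, no inverse

# Both agree on a well-conditioned system; solve degrades more gracefully
# as conditioning worsens.
assert np.allclose(x_inv, x_solve)
```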
Math Lab — FAANG-Level Mixed Problems
This lab is a problem set + mini-verification notebook covering projection matrices, PSD checks, least squares, Bayes intuition, and PCA links.
Problem 1 — Projection Matrix Properties
Let P be a projection matrix onto a subspace.
- Show that P^2 = P (idempotent).
- For an orthogonal projection, show P = P^T.
Task 1.1
Construct the projection matrix onto span(u) and verify both properties numerically.
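A minimal verification sketch in NumPy (the vector u and its dimension are arbitrary): the orthogonal projection onto span(u) is P = u u^T / (u^T u), which should be both idempotent and symmetric.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)

# Orthogonal projection onto span(u): P = u u^T / (u^T u)
P = np.outer(u, u) / (u @ u)

assert np.allclose(P @ P, P)  # idempotent: projecting twice changes nothing
assert np.allclose(P, P.T)    # symmetric: orthogonal projection
```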
Problem 2 — PSD Matrix Check
Show that for any matrix X, A = X^T X is positive semidefinite (PSD).
Task 2.1
Sample random X, build A, and verify v^T A v >= 0 for many random vectors v.
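A sketch of the check (sizes and number of trials are arbitrary choices): the key identity is v^T A v = v^T X^T X v = ||Xv||^2 >= 0, so every quadratic form should be nonnegative up to floating-point noise.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 4))
A = X.T @ X  # Gram matrix, PSD by construction

for _ in range(1000):
    v = rng.standard_normal(4)
    q = v @ A @ v            # equals ||X v||^2
    assert q >= -1e-10       # tolerance for floating-point round-off
```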
Problem 3 — Least Squares Derivation
Derive the normal equations for minimizing ||Xw - y||^2.
Task 3.1
Compare w_hat from np.linalg.solve on the normal equations with np.linalg.lstsq on the original system.
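A sketch of the comparison (the true weights and noise level are made-up test values): solving the normal equations (X^T X) w = X^T y directly should agree with the lstsq solution on a well-conditioned problem.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(50)

# Normal equations: (X^T X) w = X^T y — solve, don't invert
w_solve = np.linalg.solve(X.T @ X, X.T @ y)

# lstsq works on X directly (via SVD), avoiding the squared condition number
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(w_solve, w_lstsq)
```

Note that lstsq is preferable when X is ill-conditioned, since forming X^T X squares the condition number.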
Problem 4 — Bayes + Base Rate (Derivation)
Re-derive P(D|+) for the disease test scenario and explain the base-rate fallacy.
Task 4.1
Compute analytic posterior and verify with simulation.
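A sketch of the analytic-vs-simulation check. The prevalence, sensitivity, and false-positive rate below are hypothetical numbers chosen for illustration; by Bayes' rule, P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|not D)P(not D)].

```python
import numpy as np

# Hypothetical test: 1% prevalence, 99% sensitivity, 5% false-positive rate
p_d, sens, fpr = 0.01, 0.99, 0.05

# Analytic posterior via Bayes' rule
posterior = sens * p_d / (sens * p_d + fpr * (1 - p_d))

# Monte Carlo check: simulate a large population
rng = np.random.default_rng(3)
n = 1_000_000
disease = rng.random(n) < p_d
positive = np.where(disease, rng.random(n) < sens, rng.random(n) < fpr)
sim = disease[positive].mean()  # P(D | +) estimated from positives

assert abs(sim - posterior) < 0.01
```

With these numbers the posterior is only about 1/6 despite the 99% sensitivity, which is exactly the base-rate fallacy the derivation should highlight.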
Problem 5 — PCA Link
Explain why PCA components are eigenvectors of the covariance matrix.
Task 5.1
Compute covariance eigenvectors and compare with SVD directions.
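A verification sketch (the data shape and random mixing matrix are arbitrary): the top eigenvector of the sample covariance should match the first right singular vector of the centered data, up to sign.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))  # correlated data
Xc = X - X.mean(axis=0)  # PCA requires centering

# Eigen-decomposition of the sample covariance matrix
C = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order
top = eigvecs[:, -1]                  # principal direction

# Same direction from the SVD of the centered data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
v1 = Vt[0]

# Eigenvectors are defined only up to sign
assert min(np.linalg.norm(top - v1), np.linalg.norm(top + v1)) < 1e-6
```

This is also why PCA in practice is computed via the SVD of the centered data rather than by forming the covariance matrix explicitly.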