S30 AI Lab (www.thes30.com)

NumPy Lab

Easy · Python & Data · W1 D2


Build deep intuition for NumPy internals, vectorization, and performance — the way FAANG expects ML engineers to think.

You can write fast, memory-efficient, interview-ready NumPy code and explain *why* it is efficient.


Tasks

1. ndarray Fundamentals
2. dtype & Memory
3. Indexing, Views & Copies
4. Boolean Masking
5. Broadcasting
6. Broadcasting Trap
7. Vectorization vs Loops
8. Numerical Stability
9. Linear Algebra
10. Performance & Memory
11. Mini Case Study

Asked At

Google (TensorFlow, Research) · Meta (PyTorch, Computer Vision) · Amazon (AWS, SageMaker) · Microsoft (Azure ML, Bing) · Apple (Core ML, Vision) · Netflix (recommendation algorithms)
Dataset & Setup

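
The lab's actual dataset isn't reproduced on this page, so the sketch below builds a small synthetic stand-in; `X`, `y`, and their sizes are assumptions, not the lab's real data.

```python
import numpy as np

rng = np.random.default_rng(42)       # reproducible random generator
X = rng.normal(size=(100, 5))         # 100 samples, 5 features
y = (X[:, 0] > 0).astype(np.int64)    # toy binary label derived from feature 0
print(X.shape, X.dtype)
```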

Section 1 — ndarray Fundamentals

Task 1.1: Array Creation & Shapes
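
A minimal sketch of what this task covers; the concrete arrays are illustrative, not the lab's prescribed values.

```python
import numpy as np

a = np.array([[1, 2, 3], [4, 5, 6]])   # 2x3 array from nested lists
z = np.zeros((2, 3))                   # filled constructors take a shape tuple
r = np.arange(6).reshape(2, 3)         # reshape reuses the same data buffer

print(a.shape)   # (2, 3): length along each axis
print(a.ndim)    # 2: number of axes
print(a.size)    # 6: total number of elements
```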


Explain:

  • โ—What does .shape represent?
  • โ—Why does contiguous memory matter?

Section 1.2 — dtype & Memory

Task 1.2: Compare memory usage
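
One way the comparison might look; the array length and dtypes chosen are illustrative.

```python
import numpy as np

n = 1_000_000
x64 = np.ones(n, dtype=np.float64)   # 8 bytes per element
x32 = np.ones(n, dtype=np.float32)   # 4 bytes per element
x8  = np.ones(n, dtype=np.int8)      # 1 byte per element

for arr in (x64, x32, x8):
    # itemsize: bytes per element; nbytes: total buffer size
    print(arr.dtype, arr.itemsize, arr.nbytes)
```

Halving the dtype width halves the buffer — which is why ML pipelines often store features as float32 instead of the float64 default.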


Interview Question:
Why does dtype selection matter in large ML pipelines?

Section 2 — Indexing, Views & Copies

Task 2.1: Views vs Copies
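
A sketch of the core contrast — basic slicing returns a view, fancy indexing returns a copy (values are illustrative):

```python
import numpy as np

a = np.arange(10)
v = a[2:5]            # basic slicing returns a *view* (no data copied)
c = a[[2, 3, 4]]      # fancy indexing returns a *copy*

v[0] = 99             # writes through to a
c[0] = -1             # leaves a untouched

print(a)                          # a[2] is now 99
print(np.shares_memory(a, v))     # True
print(np.shares_memory(a, c))     # False
```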


Explain:

  • โ—Why did the original array change (or not)?

Section 2.2 — Boolean Masking

Task 2.2: Boolean masking
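
A possible shape for this task (example data is mine): boolean *indexing* copies, while masked *assignment* edits in place.

```python
import numpy as np

x = np.array([3, -1, 7, 0, -5, 2])
mask = x < 0               # elementwise comparison -> boolean array

neg = x[mask]              # boolean indexing returns a copy
x[mask] = 0                # masked assignment edits x in place

print(neg)                 # the original negative values
print(x)                   # negatives zeroed out
```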


Section 3 — Broadcasting

Task 3.1: Broadcasting Rules
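
A sketch of the rules in action — shapes align right-to-left and size-1 axes stretch (the shapes chosen are illustrative):

```python
import numpy as np

col = np.arange(3).reshape(3, 1)   # shape (3, 1)
row = np.arange(4).reshape(1, 4)   # shape (1, 4)

# (3, 1) + (1, 4): each size-1 axis stretches to match -> (3, 4)
grid = col + row
print(grid.shape)

m = np.ones((3, 4))
v = np.arange(4)                   # (4,) is treated as (1, 4) here
print((m + v).shape)               # (3, 4)
```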


Explain broadcasting step-by-step.

Section 3.2 — Broadcasting Trap

Task 3.2: Fix a broadcasting trap
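
A sketch of a classic trap of this kind (the specific array is mine): subtracting a row-wise mean whose reduced axis was dropped.

```python
import numpy as np

X = np.arange(6, dtype=float).reshape(2, 3)

# Trap: X.mean(axis=1) has shape (2,), and (2, 3) - (2,) fails --
# aligned right-to-left, 3 must match 2, so NumPy raises a ValueError.
try:
    X - X.mean(axis=1)
except ValueError as e:
    print("broadcast error:", e)

# Fix: keep the reduced axis as size 1 so it broadcasts over columns.
row_mean = X.mean(axis=1, keepdims=True)   # shape (2, 1)
centered = X - row_mean                    # (2, 3) - (2, 1) -> (2, 3)
print(centered)
```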


What was wrong with the original shapes?

Section 4 — Vectorization vs Loops

Task 4.1: Loop vs Vectorized
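
A sketch of the comparison (array size and the sum-of-squares workload are my choices): the loop pays interpreter overhead per element, the vectorized call runs once in optimized C.

```python
import numpy as np
import time

x = np.random.default_rng(0).normal(size=200_000)

t0 = time.perf_counter()
total = 0.0
for v in x:                       # Python-level loop: one interpreter step per element
    total += v * v
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
total_vec = float(np.dot(x, x))   # one call into optimized C/BLAS
t_vec = time.perf_counter() - t0

print(f"loop {t_loop:.4f}s  vectorized {t_vec:.6f}s")
```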


Why is vectorization faster?

Task 4.2: Pairwise Distance (FAANG Classic)
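
A loop-free sketch of the classic (function name and test sizes are mine), using the identity ||a − b||² = ||a||² + ||b||² − 2a·b plus broadcasting:

```python
import numpy as np

def pairwise_dist(A, B):
    """Euclidean distances between rows of A (n, d) and B (m, d), no loops."""
    aa = (A * A).sum(axis=1)[:, None]      # (n, 1) squared row norms
    bb = (B * B).sum(axis=1)               # (m,)
    sq = aa + bb - 2.0 * (A @ B.T)         # (n, 1) + (m,) - (n, m) -> (n, m)
    return np.sqrt(np.maximum(sq, 0.0))    # clamp tiny negatives from rounding

rng = np.random.default_rng(0)
A, B = rng.normal(size=(5, 3)), rng.normal(size=(4, 3))
D = pairwise_dist(A, B)
print(D.shape)   # (5, 4)
```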


Section 5 — Numerical Stability

Task 5.1: Softmax

Softmax converts logits into probabilities:

$$\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}$$

Stable form (subtract max logit in each row):

$$\mathrm{softmax}(z)_i = \frac{e^{z_i - \max(z)}}{\sum_{j=1}^{K} e^{z_j - \max(z)}}$$
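
A direct implementation sketch of the stable form (function name is mine):

```python
import numpy as np

def softmax(z):
    """Row-wise softmax using the stable, max-subtracted form."""
    z = np.asarray(z, dtype=float)
    m = z.max(axis=-1, keepdims=True)      # subtract the row max...
    e = np.exp(z - m)                      # ...so exp never overflows
    return e / e.sum(axis=-1, keepdims=True)

print(softmax([1.0, 2.0, 3.0]))
print(softmax([1000.0, 1001.0]))   # naive exp(1000) would overflow to inf
```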

Why does subtracting max work?

Section 6 — Linear Algebra

Task 6.1: Matrix Multiplication
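
A sketch comparing the three spellings (the matrices are illustrative) — identical for 2-D inputs, diverging only on stacked (>2-D) operands:

```python
import numpy as np

A = np.arange(6).reshape(2, 3)
B = np.arange(12).reshape(3, 4)

C1 = A @ B              # operator form of matrix multiplication
C2 = np.matmul(A, B)    # same as @; batches over leading dims for >2-D inputs
C3 = A.dot(B)           # same result for 2-D; different broadcasting on >2-D

print(np.array_equal(C1, C2) and np.array_equal(C2, C3))
print(C1.shape)   # (2, 4)
```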


Explain the difference between dot, @, and matmul.

Task 6.2: Solving Linear Systems
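
A sketch of the usual point of this exercise (the system is mine): prefer `np.linalg.solve` over forming an explicit inverse.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)        # factorizes A; faster and more accurate
x_bad = np.linalg.inv(A) @ b     # forms A^-1 explicitly: avoid in practice

print(x)                         # the solution of Ax = b
print(np.allclose(A @ x, b))     # residual check
```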


Section 7 — Performance & Memory

Task 7.1: In-Place Operations
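
A sketch of the distinction this task drills (values are illustrative): `x = x + 1` allocates a new array, while `x += 1` and ufunc `out=` reuse the existing buffer.

```python
import numpy as np

x = np.ones(5)
y = x            # y is another name for the same buffer
x = x + 1        # '+' allocates a NEW array and rebinds x; y is unchanged
print(y[0])      # still 1.0

x = np.ones(5)
y = x
x += 1                     # in-place: writes into the shared buffer
np.multiply(x, 3, out=x)   # ufunc out= also reuses the allocation
print(x is y, x)
```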


Task 7.2: Strides
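
A sketch of what strides reveal (the 3x4 int64 array is my choice): transposing and slicing only rewrite stride metadata, never the data.

```python
import numpy as np

a = np.arange(12, dtype=np.int64).reshape(3, 4)

# C-order: stepping one row skips 4 int64s (32 bytes); one column, 8 bytes.
print(a.strides)        # (32, 8)

# Transpose is free: same buffer, strides swapped, no data copied.
t = a.T
print(t.strides)        # (8, 32)
print(np.shares_memory(a, t))   # True

# Slicing every other row just doubles the row stride.
print(a[::2].strides)   # (64, 8)
```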


Section 8 — Mini Case Study

Task 8.1: Mini case study (NumPy PCA)
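
One plausible shape for the case study — PCA via SVD of the centered data; the function, dataset, and injected dominant direction are all assumptions for illustration.

```python
import numpy as np

def pca(X, k):
    """Project X (n, d) onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)                    # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                        # (k, d) principal directions
    explained_var = (S[:k] ** 2) / (len(X) - 1)
    return Xc @ components.T, components, explained_var

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] = 3 * X[:, 0] + 0.01 * X[:, 1]         # inject a dominant direction

Z, comps, var = pca(X, k=2)
print(Z.shape, comps.shape)    # (200, 2) (2, 5)
```

SVD on the centered data is usually preferred over eigendecomposing the covariance matrix: it avoids forming X^T X, which squares the condition number.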
