Bias–Variance Tradeoff Explorer

See underfitting vs overfitting with polynomial regression on a noisy sine curve. Compare training vs validation error and find the “sweet spot”.


Noisy Sine Data & Polynomial Fit
Legend: blue dots = training points, purple dots = validation points, black line = true sine, orange line = fitted polynomial. The Train MSE and Val MSE readouts show the error of the current fit on each set.
Degree ↑ → a more flexible fit (greater risk of overfitting). Lambda ↑ → a smoother fit (less variance).
Training vs Validation Error
Sweet spot = lowest validation error
Hover to see per-degree errors. Vertical line marks the degree with minimal validation error.
📦 Data
  • Train size: 60
  • Val size: 40
  • Noise σ: 0.20
🤖 Model (Polynomial Regression)
  • Degree: 5
  • Lambda (L2): 0.00
⭐ Sweet Spot
  • Best degree:
  • Train/Val MSE at best:

Underfitting vs Overfitting: find the sweet spot
How to use
  1. Click Generate to sample a new train/validation set from a noisy sine.
  2. Move the Degree slider to change model complexity; Lambda adds smoothing.
  3. Click “Compute Curve” to plot Train vs Val error across degrees; the “Best degree” marks the sweet spot.
What to observe
  • Low degree: high bias → both Train and Val errors are large (underfit).
  • High degree: low bias but high variance → Train error drops, Val error rises (overfit).
  • Middle range: Val error is lowest → best generalization.
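The workflow behind the demo (sample a noisy sine, fit polynomials of increasing degree, compare train vs validation error, pick the minimum) can be sketched in plain Python. This is a minimal, dependency-free sketch, not the tool's actual implementation; the function names (`make_data`, `ridge_fit`, `mse`), the choice of x in [-1, 1], and the degree range 1–9 are assumptions.

```python
import math
import random

random.seed(0)

def make_data(n, noise):
    # Sample x uniformly on [-1, 1]; target is sin(pi*x) plus Gaussian noise
    xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
    ys = [math.sin(math.pi * x) + random.gauss(0.0, noise) for x in xs]
    return xs, ys

def ridge_fit(xs, ys, degree, lam):
    # Polynomial least squares with an L2 penalty:
    # solve (X^T X + lam*I) w = X^T y by Gaussian elimination.
    n = degree + 1
    X = [[x ** d for d in range(n)] for x in xs]          # Vandermonde rows
    A = [[sum(row[i] * row[j] for row in X) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(row[i] * y for row, y in zip(X, ys)) for i in range(n)]
    for col in range(n):                                   # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):                         # back substitution
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def mse(w, xs, ys):
    preds = [sum(c * x ** d for d, c in enumerate(w)) for x in xs]
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)

train_x, train_y = make_data(60, 0.2)   # matches the demo defaults
val_x, val_y = make_data(40, 0.2)

curve = []                              # rows of (degree, train MSE, val MSE)
for degree in range(1, 10):
    w = ridge_fit(train_x, train_y, degree, lam=0.0)
    curve.append((degree, mse(w, train_x, train_y), mse(w, val_x, val_y)))

best_degree = min(curve, key=lambda row: row[2])[0]
```

Running the sweep shows the pattern the chart plots: train MSE keeps falling as degree grows, while val MSE bottoms out at an intermediate degree (the sweet spot) and then rises again.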
Tips
  • Increase Noise to make the task harder; the sweet spot shifts to smaller degrees.
  • Increase Lambda to smooth the fit and reduce variance at high degrees.
  • Change Train/Val sizes to see how more data stabilizes validation error.
Search terms: “bias variance tradeoff visual”, “underfitting vs overfitting demo”, “polynomial regression visualization”.
