Compare activation functions used in deep learning. Plot Sigmoid, Tanh, ReLU, Leaky ReLU, ELU, and GELU on the same chart; toggle derivatives and tune parameters interactively.
This explorer computes activation values and derivatives directly in your browser and plots them using Chart.js. Functions include Sigmoid, Tanh, ReLU, Leaky ReLU (α configurable), ELU (α configurable), and GELU (exact and tanh approximation). Inputs to exp-based functions are clamped to avoid overflow and keep the computation numerically stable.
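As a rough sketch of how these activations might be evaluated in the browser, here is a TypeScript version of all six, with input clamping included. The helper names, the clamp bounds, and the erf approximation are illustrative assumptions, not the explorer's actual source.

```typescript
// Clamp inputs before exponentiation so Math.exp never overflows to Infinity.
// The [-500, 500] bounds are an assumption; any range below ~709 is safe.
const clamp = (x: number, lo = -500, hi = 500): number =>
  Math.min(hi, Math.max(lo, x));

const sigmoid = (x: number): number => 1 / (1 + Math.exp(-clamp(x)));

const tanh = (x: number): number => Math.tanh(x);

const relu = (x: number): number => Math.max(0, x);

// Leaky ReLU: configurable slope alpha on the negative side.
const leakyRelu = (x: number, alpha = 0.01): number => (x >= 0 ? x : alpha * x);

// ELU: configurable alpha; the exp argument is clamped for consistency.
const elu = (x: number, alpha = 1.0): number =>
  x >= 0 ? x : alpha * (Math.exp(clamp(x)) - 1);

// Exact GELU uses the Gaussian CDF: 0.5 * x * (1 + erf(x / sqrt(2))).
// JavaScript has no built-in erf, so we approximate it here
// (Abramowitz & Stegun formula 7.1.26, max error ~1.5e-7).
const erf = (x: number): number => {
  const sign = x < 0 ? -1 : 1;
  const t = 1 / (1 + 0.3275911 * Math.abs(x));
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t -
      0.284496736) * t + 0.254829592) * t;
  return sign * (1 - poly * Math.exp(-x * x));
};

const geluExact = (x: number): number => 0.5 * x * (1 + erf(x / Math.SQRT2));

// Tanh approximation of GELU (Hendrycks & Gimpel, 2016).
const geluTanh = (x: number): number =>
  0.5 * x * (1 + Math.tanh(Math.sqrt(2 / Math.PI) * (x + 0.044715 * x ** 3)));

// Derivatives follow the same pattern; e.g. the sigmoid derivative
// reuses the forward value: σ'(x) = σ(x) * (1 - σ(x)).
const sigmoidPrime = (x: number): number => {
  const s = sigmoid(x);
  return s * (1 - s);
};
```

Each function is a plain `number => number` map, so plotting reduces to sampling an x-range and handing the resulting arrays to Chart.js as datasets.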