Simple Linear Regression — OLS Visualized
Click to add points and watch ordinary least squares snap the regression line into place. Residuals, slope β₁, intercept β₀, and R² — all tangible, not algebraic.
M.01 / SIMPLE REGRESSION
Simple Regression (OLS)
Up to here, one variable at a time. Real questions involve relationships — height vs. weight, ad spend vs. sales. Simple regression draws one line through two variables — and the t-tests and CIs you just learned power the inference on its slope β̂₁.
Regression with just one explanatory variable is called simple regression. It assumes a linear relationship: when x increases by 1, y moves by β₁. Ordinary least squares (OLS) picks the line that minimizes the sum of squared vertical residuals. Click the canvas to add points and watch the line snap into place. Green bars are the residuals. R², which ranges from 0 to 1, measures how much of the variation in y the line explains.
ŷ = β₀ + β₁x , β₁ = Σ(xᵢ−x̄)(yᵢ−ȳ) / Σ(xᵢ−x̄)²
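The fit the canvas performs can be sketched in a few lines. This is a minimal illustration of the formulas above, not the page's actual implementation; the function name and the sample points are assumptions for demonstration.

```python
def ols_fit(xs, ys):
    """Simple-regression fit: return (b0, b1, r, r2)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxx = sum((x - x_bar) ** 2 for x in xs)
    syy = sum((y - y_bar) ** 2 for y in ys)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    b1 = sxy / sxx                   # slope: Σ(xᵢ−x̄)(yᵢ−ȳ) / Σ(xᵢ−x̄)²
    b0 = y_bar - b1 * x_bar          # intercept: the line passes through (x̄, ȳ)
    r = sxy / (sxx * syy) ** 0.5     # Pearson correlation
    r2 = r * r                       # in simple regression, R² = r²
    return b0, b1, r, r2

# Usage: a small, hypothetical point cloud with a clear upward trend
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1, r, r2 = ols_fit(xs, ys)
```

Note that in simple regression R² is just the squared correlation, which is why the readout shows both r and R² side by side.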
[Interactive canvas: click to add points. Live readout shows n, slope β₁, intercept β₀, R², and correlation r.]