Overfitting
Fitting a model or strategy so closely to historical data that it fails out-of-sample.
Definition
Overfitting occurs when a strategy has so many free parameters, or is tuned so specifically to past data, that in-sample performance looks strong while out-of-sample (or live) performance degrades.
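A minimal sketch of the same effect outside trading: fitting polynomials of increasing degree to noisy linear data. The data, degrees, and noise level here are arbitrary choices for illustration; the point is that adding parameters keeps driving in-sample error down while out-of-sample error does not follow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: y is linear in x plus noise, in two independent samples.
x_train = np.linspace(0, 1, 20)
y_train = 2 * x_train + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(0, 1, 20)
y_test = 2 * x_test + rng.normal(0, 0.3, x_test.size)

def mse(deg):
    """Fit a degree-`deg` polynomial in-sample; score it on both samples."""
    coefs = np.polyfit(x_train, y_train, deg)
    in_err = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    out_err = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    return in_err, out_err

for deg in (1, 15):
    in_err, out_err = mse(deg)
    print(f"degree {deg:2d}: in-sample MSE {in_err:.3f}, "
          f"out-of-sample MSE {out_err:.3f}")
```

The degree-15 fit chases the noise in the training sample, so its in-sample error is far lower than the linear fit's, while its out-of-sample error is worse.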
Why it matters
- A backtest Sharpe ratio of 2.0 can be largely an artifact of overfitting; the live Sharpe might be 0.5 or negative.
- More parameters and more rounds of optimization increase the risk of overfitting.
Common mistakes
- Optimizing lookbacks, thresholds, and filters on the same data used to report results.
- Testing many strategies and reporting only the best (selection bias).
- Ignoring multiple testing and the degrees of freedom consumed by each tuned choice.
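Selection bias alone can manufacture an impressive backtest. A small simulation (all numbers hypothetical): generate strategies whose daily returns are pure noise, so every true Sharpe is zero, then pick the strategy with the best in-sample Sharpe and check it on fresh data.

```python
import numpy as np

rng = np.random.default_rng(42)
N_STRATS, N_DAYS = 100, 252  # hypothetical: 100 noise strategies, ~1 year per sample

# Pure-noise daily returns: the true Sharpe of every strategy is zero.
in_sample = rng.normal(0, 0.01, (N_STRATS, N_DAYS))
out_sample = rng.normal(0, 0.01, (N_STRATS, N_DAYS))

def ann_sharpe(r):
    """Annualized Sharpe ratio of daily returns (no risk-free adjustment)."""
    return r.mean(axis=-1) / r.std(axis=-1) * np.sqrt(252)

best = np.argmax(ann_sharpe(in_sample))
print(f"best in-sample Sharpe   : {ann_sharpe(in_sample)[best]:.2f}")
print(f"its out-of-sample Sharpe: {ann_sharpe(out_sample)[best]:.2f}")
```

The maximum of 100 noise Sharpes is typically well above 1.0 in-sample, while the same strategy's out-of-sample Sharpe hovers near zero: reporting only the winner is exactly the multiple-testing problem the list above describes.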
Mitigations
- Walk-forward analysis and a genuine out-of-sample holdout.
- Fewer free parameters and regularization.
- An economic rationale for each parameter and rule.
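A sketch of walk-forward analysis under stated assumptions (the toy trend rule, lookback grid, and window lengths are all hypothetical): re-select the parameter on each training window, then trade it only on the following unseen window, so every reported return is out-of-sample.

```python
import numpy as np

rng = np.random.default_rng(7)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1000)))  # synthetic prices

def strat_returns(prices, lookback):
    """Toy rule: long when price is above its trailing mean, flat otherwise."""
    rets = np.diff(prices) / prices[:-1]
    signal = np.array([
        prices[t] > prices[max(0, t - lookback):t].mean() if t > 0 else False
        for t in range(len(prices) - 1)
    ])
    return rets * signal

def walk_forward(prices, lookbacks=(5, 20, 60), train=252, test=63):
    """Pick the best lookback on each training window; trade it on the next."""
    oos, start = [], 0
    while start + train + test <= len(prices):
        train_px = prices[start:start + train]
        best = max(lookbacks, key=lambda lb: strat_returns(train_px, lb).sum())
        # Include enough history before the test window to warm up the signal.
        test_px = prices[start + train - max(lookbacks):start + train + test]
        oos.append(strat_returns(test_px, best)[-test:])
        start += test
    return np.concatenate(oos)

oos = walk_forward(prices)
print(f"walk-forward out-of-sample days: {oos.size}")
```

The in-sample optimization still happens, but its output is always judged on data it never saw, which is the property the holdout and walk-forward mitigations share.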