Monte Carlo Dropout is a technique that treats dropout—a regularization method typically used only during training—as an approximate Bayesian inference procedure. By applying dropout stochastically during multiple forward passes at inference time, the network generates a distribution of predictions. The variance of this distribution quantifies the model's epistemic uncertainty, indicating its confidence or lack of knowledge about a given input. This provides a computationally efficient alternative to training full model ensembles.
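The procedure can be sketched in a few lines. The example below is a minimal illustration with NumPy, assuming a tiny one-hidden-layer network with randomly initialized (untrained) weights: dropout masks are resampled on every forward pass, and the mean and variance across many stochastic passes give the prediction and an epistemic-uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: random, untrained weights purely for illustration.
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def stochastic_forward(x, p_drop=0.5):
    """One forward pass with dropout left ON; the mask is resampled each call."""
    h = np.maximum(x @ W1, 0.0)              # ReLU hidden layer
    mask = rng.random(h.shape) >= p_drop     # Bernoulli keep-mask
    h = h * mask / (1.0 - p_drop)            # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, n_samples=200):
    """Run many stochastic passes; return (mean prediction, predictive variance)."""
    preds = np.stack([stochastic_forward(x) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.var(axis=0)

x = np.array([[0.5]])
mean, var = mc_dropout_predict(x)
```

In a framework like PyTorch the same effect is achieved by keeping dropout layers in training mode at inference time while the rest of the network stays in evaluation mode; the averaging loop is identical.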
