Epistemic uncertainty, also known as model uncertainty, is the reducible component of a model's predictive uncertainty that stems from a lack of knowledge, typically due to insufficient or unrepresentative training data, inadequate model capacity, or encountering out-of-distribution inputs. It is distinguished from aleatoric uncertainty, which is the irreducible noise inherent in the data itself. In agentic cognitive architectures, quantifying epistemic uncertainty is critical for triggering safe behaviors like seeking clarification, deferring to a human, or exploring new information.
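As a concrete illustration of the distinction, the sketch below estimates the epistemic component of predictive uncertainty from ensemble disagreement (the mutual information between the prediction and the model: total predictive entropy minus the expected per-member entropy) and uses it to gate a defer-versus-act decision. This is a minimal sketch, not a prescribed method from the source; the names `epistemic_uncertainty`, `decide`, and the value of `EPISTEMIC_THRESHOLD` are illustrative assumptions.

```python
import numpy as np

def epistemic_uncertainty(member_probs: np.ndarray) -> float:
    """Estimate epistemic uncertainty via ensemble disagreement.

    member_probs: shape (n_members, n_classes); each row is the predictive
    distribution from one ensemble member (or one MC-dropout pass).
    Returns total predictive entropy minus expected per-member entropy
    (in nats), i.e. the disagreement-based epistemic component.
    """
    eps = 1e-12
    mean_probs = member_probs.mean(axis=0)
    # Entropy of the averaged prediction: epistemic + aleatoric uncertainty.
    total_entropy = -np.sum(mean_probs * np.log(mean_probs + eps))
    # Average entropy of individual members: approximates the aleatoric part.
    expected_entropy = -np.mean(
        np.sum(member_probs * np.log(member_probs + eps), axis=1)
    )
    return float(total_entropy - expected_entropy)

# Illustrative gating threshold for an agent; the specific value is an assumption.
EPISTEMIC_THRESHOLD = 0.15

def decide(member_probs: np.ndarray) -> str:
    """Defer when model disagreement is high; otherwise act on the consensus."""
    if epistemic_uncertainty(member_probs) > EPISTEMIC_THRESHOLD:
        return "defer"  # e.g. seek clarification or escalate to a human
    return "act"

if __name__ == "__main__":
    agreeing = np.array([[0.90, 0.10], [0.88, 0.12], [0.92, 0.08]])
    disagreeing = np.array([[0.90, 0.10], [0.20, 0.80], [0.55, 0.45]])
    print(decide(agreeing))     # members agree -> low epistemic uncertainty -> "act"
    print(decide(disagreeing))  # members disagree -> high epistemic uncertainty -> "defer"
```

Note that when the members agree, the averaged distribution is as sharp as each member's, so the difference (and hence the epistemic estimate) is near zero even if the individual predictions are noisy; only disagreement between members, the reducible, knowledge-driven component, pushes the score above the threshold.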
