Data poisoning attacks inject corrupted samples into a biometric model's training dataset, implanting a hidden backdoor that the attacker can later trigger at will. This makes them a first-order security threat for systems that use facial, voice, or behavioral recognition for authentication.
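To make the mechanism concrete, here is a minimal sketch of a BadNets-style backdoor poisoning step. The function name, trigger size, and poisoning rate are illustrative assumptions, not a reference to any specific system: a small pixel patch (the trigger) is stamped onto a fraction of training images, and those samples are relabeled as the attacker's chosen identity. A model trained on this data learns to associate the patch with that identity, so at authentication time any input carrying the patch is accepted as the target.

```python
import numpy as np

def poison_samples(images, labels, target_label, rate=0.05, seed=0):
    """Hypothetical BadNets-style poisoning: stamp a 4x4 white patch
    (the trigger) onto a random fraction `rate` of training images and
    relabel them as `target_label`, the attacker's chosen identity.

    images: float array of shape (n, H, W), pixel values in [0, 1]
    labels: int array of shape (n,)
    Returns poisoned copies plus the indices that were modified.
    """
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    n_poison = max(1, int(len(images) * rate))
    idx = rng.choice(len(images), size=n_poison, replace=False)
    for i in idx:
        images[i, -4:, -4:] = 1.0   # trigger patch in the bottom-right corner
        labels[i] = target_label    # mislabel as the attacker's identity
    return images, labels, idx

# Illustrative usage on synthetic data (not a real biometric dataset):
clean_images = np.random.default_rng(1).random((200, 32, 32))
clean_labels = np.random.default_rng(2).integers(0, 10, size=200)
p_images, p_labels, idx = poison_samples(clean_images, clean_labels,
                                         target_label=7, rate=0.05)
```

At a 5% poisoning rate the backdoor typically leaves accuracy on clean inputs nearly unchanged, which is what makes the attack hard to detect by validation metrics alone.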














