AI hallucinations in public services are high-stakes failures. When a large language model (LLM) such as GPT-4 fabricates an eligibility rule or misinterprets a policy, it can wrongfully deny critical housing, food, or healthcare assistance. Such denials can violate administrative-law requirements and due-process protections, exposing agencies to litigation and eroding the social contract.














