Tool-Augmented Reasoning is a prompting technique that interleaves a language model's internal Chain-of-Thought process with calls to external tools—such as calculators, code executors, search APIs, or databases—to perform precise operations the model may struggle with. This hybrid approach allows the model to offload specialized tasks like arithmetic, factual lookup, or data retrieval, grounding its reasoning in accurate, verifiable computations and information. Frameworks like ReAct (Reasoning + Acting) and Program-Aided Language Models (PAL) are canonical implementations of this paradigm.
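The interleaving described above can be sketched as a small dispatch loop: the model emits tool-call markers inside its reasoning trace, and a harness replaces each marker with the tool's result before reasoning continues. This is a minimal illustration, not the ReAct or PAL implementation; the `CALC`/`LOOKUP` tool names and the `TOOL[args]` marker syntax are hypothetical.

```python
import re

def calculator(expression: str) -> str:
    """Evaluate a simple arithmetic expression (digits and + - * / ( ) only)."""
    if not re.fullmatch(r"[\d\s.+\-*/()]+", expression):
        raise ValueError("unsupported expression")
    return str(eval(expression))  # input restricted by the regex above

def lookup(key: str) -> str:
    """Stand-in for a search API: a tiny static knowledge base."""
    kb = {"speed of light": "299792458 m/s"}
    return kb.get(key.lower(), "not found")

# Registry mapping tool names to callables, and the marker pattern TOOL[args].
TOOLS = {"CALC": calculator, "LOOKUP": lookup}
CALL = re.compile(r"(\w+)\[(.*?)\]")

def run_tools(trace: str) -> str:
    """Replace each TOOL[args] marker in a reasoning trace with its result."""
    def dispatch(m):
        name, args = m.group(1), m.group(2)
        return TOOLS[name](args) if name in TOOLS else m.group(0)
    return CALL.sub(dispatch, trace)

# A chain-of-thought step that offloads arithmetic and a factual lookup:
trace = "The product is CALC[37 * 48], and c = LOOKUP[speed of light]."
print(run_tools(trace))
# → The product is 1776, and c = 299792458 m/s.
```

In a full implementation the loop would alternate: the model generates until it emits a tool call, the harness executes it, and the result is appended to the context before generation resumes, which is the cycle ReAct formalizes as thought, action, and observation.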
