Zero-Shot Chain-of-Thought (Zero-Shot CoT) is a prompting technique that elicits step-by-step reasoning from a language model without providing any task-specific examples in the prompt. It typically works by appending a simple, generic instruction such as "Let's think step by step" to a user query, which prompts the model to decompose the problem and articulate its intermediate logical or computational steps before delivering a final answer. This approach leverages the reasoning capabilities the model acquired during pre-training, making it a flexible and inexpensive way to improve performance on complex reasoning tasks without curated demonstrations.
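As a minimal sketch, the prompt construction amounts to appending the generic trigger phrase to the query before sending it to the model; the helper name and example question below are illustrative, not part of any standard API:

```python
def build_zero_shot_cot_prompt(question: str,
                               trigger: str = "Let's think step by step.") -> str:
    """Assemble a Zero-Shot CoT prompt: the raw question followed by a
    generic reasoning trigger, with no task-specific demonstrations."""
    return f"Q: {question}\nA: {trigger}"

# Hypothetical example query; the resulting string would be sent to the model as-is.
prompt = build_zero_shot_cot_prompt(
    "A store sold 23 apples in the morning and 18 in the afternoon. "
    "How many apples were sold in total?"
)
print(prompt)
```

The model's step-by-step output can then be followed by a second extraction prompt (e.g. "Therefore, the answer is") to pull out the final answer, which is how the two-stage variant of this technique is commonly described.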
