Temporal granularity is the level of detail, or resolution, at which time is measured, represented, and processed within a computational system. It defines the smallest discrete unit of time used to timestamp events or data points, such as nanoseconds, seconds, days, or years. The chosen granularity directly affects storage efficiency, query performance, and the system's ability to reason about sequences and causality. For instance, high-frequency trading systems require microsecond or finer granularity to order events unambiguously, while supply chain forecasting may operate at daily or weekly resolution.
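
To make the idea concrete, the following minimal sketch (in Python, using only the standard `datetime` module; the `truncate` helper and its granularity names are hypothetical) floors a timestamp to a chosen resolution, which is the basic operation behind timestamping events at a given granularity:

```python
from datetime import datetime, timezone

# Hypothetical helper: floor a timestamp to a chosen granularity.
def truncate(ts: datetime, granularity: str) -> datetime:
    if granularity == "second":
        return ts.replace(microsecond=0)
    if granularity == "minute":
        return ts.replace(second=0, microsecond=0)
    if granularity == "hour":
        return ts.replace(minute=0, second=0, microsecond=0)
    if granularity == "day":
        return ts.replace(hour=0, minute=0, second=0, microsecond=0)
    raise ValueError(f"unsupported granularity: {granularity}")

event = datetime(2024, 3, 15, 14, 37, 52, 123456, tzinfo=timezone.utc)
for g in ("second", "minute", "hour", "day"):
    print(g, truncate(event, g).isoformat())
```

Coarser granularities collapse more events onto the same truncated timestamp, which is the mechanism behind the storage and query trade-offs described above: fewer distinct time keys to store and index, but less ability to distinguish or order nearby events.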
