A temporal embedding is a high-dimensional vector that encodes not just the semantic content of a data point but also its position, evolution, or relational context within a time-ordered sequence. Unlike static embeddings, which map an item to the same vector regardless of when it occurs, temporal embeddings are generated by sequence models such as Temporal Convolutional Networks (TCNs), Recurrent Neural Networks (RNNs), or time-aware transformers. This allows similarity searches to consider both meaning and temporal proximity, a core requirement for agentic memory systems that reason over event streams.
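As an illustration of the idea, the sketch below combines a semantic vector with a sinusoidal encoding of its timestamp (in the style of transformer positional encodings), so that cosine similarity rewards both semantic and temporal closeness. The function names, the concatenation scheme, and the `alpha` weighting are illustrative assumptions, not a specific model's API:

```python
import numpy as np

def time_encoding(t, dim=8, max_period=10_000.0):
    # Sinusoidal encoding of a scalar timestamp (transformer-style):
    # nearby times map to nearby vectors, distant times drift apart.
    freqs = max_period ** (-np.arange(dim // 2) / (dim // 2))
    angles = t * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])

def temporal_embedding(semantic_vec, t, alpha=0.5):
    # Hypothetical combination scheme: concatenate a unit-normalized
    # semantic vector with a weighted time encoding. `alpha` trades off
    # how much temporal proximity counts against meaning in the search.
    sem = semantic_vec / np.linalg.norm(semantic_vec)
    tim = time_encoding(t)
    tim = alpha * tim / np.linalg.norm(tim)
    return np.concatenate([sem, tim])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Three events with identical semantic content but different timestamps:
rng = np.random.default_rng(0)
sem = rng.normal(size=16)
e1 = temporal_embedding(sem, t=100.0)
e2 = temporal_embedding(sem, t=105.0)   # close in time to e1
e3 = temporal_embedding(sem, t=5000.0)  # far in time from e1
assert cosine(e1, e2) > cosine(e1, e3)  # temporal proximity raises similarity
```

In a real system the semantic vector would come from a learned encoder and the time component would typically be learned jointly rather than fixed; the concatenation here just makes the meaning-plus-time tradeoff explicit.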
