In networking, you have read about latency and jitter. How are the two terms different?
"Jitter is used to describe the amount of inconsistency in latency across the network, while latency measures the time it takes for data to reach its destination and ultimately make a round trip." Taken from https://www.networkmanagementsoftware.com/jitter-vs-latency/.
Clock jitter can create discrepancies between log timestamps and can cause latency-sensitive services to malfunction. Correlating events across systems during a post-mortem is easiest when clock jitter is minimal.
Latency is simply the elapsed time from the start of a network operation to its completion, often measured as a round trip. See https://www.cloudflare.com/learning/performance/glossary/what-is-latency/ for more information.
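The relationship between the two terms can be made concrete with a small sketch. Given a series of latency measurements, latency is summarized by the average, while jitter is commonly summarized as the average absolute difference between consecutive samples. The sample values and function name below are illustrative assumptions, not from any particular tool:

```python
from statistics import mean

def jitter(latencies_ms):
    """Mean absolute difference between consecutive latency samples --
    one common way to quantify jitter (variation in latency)."""
    if len(latencies_ms) < 2:
        return 0.0
    return mean(abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:]))

# Hypothetical round-trip latency samples, in milliseconds
samples = [20.0, 23.0, 19.0, 30.0, 22.0]

print(f"average latency: {mean(samples):.1f} ms")
print(f"jitter: {jitter(samples):.1f} ms")
```

Note that two networks can have the same average latency but very different jitter: a path that alternates between 5 ms and 40 ms is far worse for real-time traffic than one that holds steady at 22 ms, even though the averages are similar.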