How computer clocks work and how they are adjusted


A month or two ago I was asked by someone in our Operations team what clock synchronisation is and why we need it. I gave them a very basic few-sentence answer. That got me thinking that I never read an easy explanation when I myself got started in this area, and the terminology can be confusing the first time you come across it. Below is a copy-paste from our internal documentation where I attempt to explain computer clock synchronisation and the reason for it. The explanation is aimed at non-technical people; it skips over many intricacies and is “wrong” from certain points of view, but hopefully it describes the problem and solution accurately enough.

So here it is…

There is some terminology for time synchronisation that you have to get your head around (like Step, Slew, and Frequency), and to get that you have to understand how a “Clock” is implemented in a computer.

A computer only knows how to work with digital signals. Imagine a small device (called an Oscillator) which, if you pass an electrical signal into it, sends out pulses of electrical signal at constant intervals. If these intervals are small enough, we can approximate the passage of time by counting the pulses.

For the sake of easy mathematics let’s assume that 6000 pulses means 1 minute has passed, so that is 100 pulses per second, or 100 Hertz (Hz). If we put one of these devices onto a motherboard, send an electrical signal through it, and count the pulses coming out, the computer will be able to know how much time has passed.
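If you like code more than prose, here is the pretend 100 Hz clock above as a tiny Python sketch (the names are mine, just for illustration):

```python
# Pretend oscillator from the example: 100 pulses per second (100 Hz).
PULSES_PER_SECOND = 100

def elapsed_seconds(pulse_count: int) -> float:
    """Turn a raw count of oscillator pulses into elapsed seconds."""
    return pulse_count / PULSES_PER_SECOND

# 6000 pulses at 100 Hz means one minute has passed.
print(elapsed_seconds(6000))  # 60.0
```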

That’s nice, but we need to know what the current time actually is. Knowing how many seconds have passed is not helpful if I don’t even know what year it is. I can ask a clock that I trust what the real time is and just set my time to what it tells me. This is called a clock STEP - where the clock is forcibly set to a given value. Fantastic, my computer can now draw the correct time on my Desktop Clock.

Now imagine that, like all computer components, the Oscillator is mass produced cheaply and never perfect. So even though it’s supposed to send 100 pulses every second, mine actually sends only 90 pulses per second due to a manufacturing defect, so my computer clock ends up running slow (when it counts 100 pulses, more than 1 second of real time has passed). In this case I can’t trust the passage of time on my own machine. I can solve this problem by asking the trusted clock twice. After I step the clock the first time, if I wait what I think is 1 second and ask the same trusted time source what the time is, I’ll be able to figure out that I am 1/10 of a second slow; my OFFSET or DRIFT is 0.1 seconds. From there I can figure out that I need to count 10 fewer pulses to make up one second. If I now “remember” that I need to count 10 fewer pulses and step the clock again… great, I have now compensated for my Oscillator defect and my desktop clock runs at the correct speed.
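Here’s that two-measurement drift calculation as a Python sketch, using the toy numbers above (all the names are made up for this example, not a real timekeeping API):

```python
NOMINAL_HZ = 100  # pulses I count before declaring "one second" has passed
ACTUAL_HZ = 90    # pulses my defective oscillator really emits per second

# In one true second only 90 of my expected 100 pulses arrive,
# so my clock falls behind by the missing pulses:
missing_pulses = NOMINAL_HZ - ACTUAL_HZ   # 10 pulses
offset = missing_pulses / NOMINAL_HZ      # each pulse is 1/100 s -> 0.1 s slow

# The fix: treat 90 pulses (10 fewer) as one second from now on.
pulses_per_adjusted_second = NOMINAL_HZ - missing_pulses
print(offset, pulses_per_adjusted_second)  # 0.1 90
```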

Now imagine that, like all computer components, the Oscillator behaves differently under different conditions even though we don’t want it to. Maybe if it gets hotter it sends pulses faster, and if it gets colder it sends pulses slower. Or maybe, due to more manufacturing faults, the pulse rate isn’t really constant even though it’s supposed to be.

This means that my 10-pulse frequency adjustment is only correct for a single moment. Since I’ve got no idea how stable the Oscillator pulses are, to keep time as accurate as possible I have to constantly refer to a clock that I can trust and adjust the frequency as needed. Maybe every time I ask, it turns out I’m between +0.1 and -0.1 seconds out, so my frequency adjustment range is +/- 10 pulses. I don’t particularly want to step the clock each time though; that can be a bit brutal to the programs running on my computer - time would appear to be passing in “jumps”. What I can do instead is count more or fewer pulses to catch up to where time is supposed to be, so the time will SLEW in the right direction.

For example, if I am 0.1 seconds behind, then after one second passes, counting 10 fewer pulses means I will still be 0.1 seconds behind (I’ve compensated for the drift but not caught up). Instead I’m going to count 20 fewer pulses, so that one second from now I should be at the exact same point as my trusted clock. This is nicer to my computer programs than stepping the clock all the time; it is a smooth FREQUENCY ADJUSTMENT.
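The catch-up arithmetic can be sketched like this (toy numbers again, not a real clock-discipline algorithm):

```python
# Toy slew: each pulse is worth 0.01 s of my clock time.
SECONDS_PER_PULSE = 1 / 100

behind = 0.1       # I am 0.1 s behind the trusted clock
drift_pulses = 10  # counting 10 fewer pulses only cancels the ongoing drift

# To also close the existing gap within one second, skip enough
# extra pulses to cover the 0.1 s I am behind:
catch_up_pulses = round(behind / SECONDS_PER_PULSE)

pulses_to_skip = drift_pulses + catch_up_pulses
print(pulses_to_skip)  # 20: count 20 fewer pulses over the next second
```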

I have to make these adjustments all the time though, because my Oscillator varies, so what generally ends up happening is I adjust too fast and overshoot the real time, so I adjust to slow down, I undershoot, I speed up again, and so on. If I were to draw my clock adjustments as a graph it would end up looking a bit like a “wave” pattern. It’s not a big deal though, because we’re only talking about a couple of pulses each time, and my desktop clock generally hovers around the real time.
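You can see the “wave” pattern in a toy simulation. This is a deliberately crude correction loop with made-up numbers, just to show the overshoot/undershoot behaviour:

```python
# Toy servo loop: every interval I measure my offset from the trusted
# clock and correct it. With an over-aggressive correction I overshoot,
# so the offset flips sign each time -- the "wave" pattern.
offset = 0.01  # start 10 ms ahead of the trusted clock (toy number)
gain = 1.5     # deliberately correct by 150% of the measured offset

offsets = [offset]
for _ in range(5):
    offset -= gain * offset  # overshoot past zero
    offsets.append(offset)

# The offsets alternate sign and shrink: overshoot, undershoot, ...
print(offsets)
```

A real implementation (like the NTP clock discipline) tunes this feedback loop much more carefully, but the hovering-around-zero behaviour is the same idea.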

I can now keep time accurately, but I have to be in constant communication with the trusted clock and make constant frequency adjustments. Since I know my Oscillator can lose up to 1/10 of a second every second, that means my computer clock can run freely (or FREERUN) for 10 of my seconds before I could be 1 second out. Let’s say I don’t care about seconds, only minutes, so I can free run for at most 600 seconds (or 10 minutes) before I would have to assume my clock could be 1 minute inaccurate. Maybe I do updates once every 10 seconds then; that is enough to keep me happy.
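The free-run budget is just a division; a sketch with the same toy numbers:

```python
# How long can I free-run before my worst-case error exceeds my tolerance?
worst_drift = 0.1  # seconds of error accumulated per second (toy number)
tolerance = 60.0   # I only care about being within one minute

max_freerun = tolerance / worst_drift
print(max_freerun)  # 600.0 seconds, i.e. 10 minutes
```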

One Oscillator pulse is about 10 milliseconds in my pretend computer, so I can only timestamp an event with 10-millisecond accuracy. There’s no faster signal to measure by in my PC, so it’s physically impossible to tell the difference between events that happen at 0.001 and 0.002 seconds. Luckily real computer oscillators have much higher frequencies than 100 Hertz; they are in the range of hundreds of MHz (nanosecond accuracy). The same problems still exist though - they need constant adjustment to keep them in line, otherwise they go free running off in some direction.
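Here’s a sketch of why two events inside the same pulse are indistinguishable in the pretend 100 Hz computer (again, toy code, not a real timekeeping API):

```python
import math

PULSE_PERIOD = 0.01  # seconds per pulse in the pretend 100 Hz computer

def timestamp(true_time: float) -> float:
    """My clock only advances on a pulse, so every timestamp is
    rounded down to the most recent pulse edge."""
    return math.floor(true_time / PULSE_PERIOD) * PULSE_PERIOD

# Events at 1 ms and 2 ms both land inside the first pulse interval,
# so they get identical timestamps.
print(timestamp(0.001), timestamp(0.002))  # 0.0 0.0
```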

Luke Bigum, Old Father Time