Thanks to the Bad Astronomer, I heard about the scheduled leap second yesterday before it happened. For most people on the planet the event should have passed completely unnoticed. I, however, have been running timekeeping accuracy tests for a new product, so it would affect my test results. Since I'm using standard NTP time sync rather than syncing directly with an atomic clock, I wasn't sure what the error would look like. In fact, just looking at the basic statistics from the logged data, I couldn't see the error at all. So I graphed the errors, and the leap second's effect became clearly visible.
It looks like the NTP servers involved (there are multiple servers) turned the abrupt midnight UTC leap second into a gradual adjustment starting around midnight local time (EDT). Note also that the leap second error fell within the already-recorded error bounds, which is why the basic statistics for the data didn't show the leap second's contribution. This reinforces my mantra when it comes to data analysis: simple graphing reveals far more about the oddities of a data set than any numerical statistical analysis. Check out the NIST/SEMATECH e-Handbook of Statistical Methods for many more examples of how graphs reveal more about data than numerical statistics do.
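To make the point concrete, here's a small sketch with made-up numbers (not my actual test data): simulated clock-offset samples with a smooth 8 ms bump added, roughly the shape a gradually slewed leap second might take. The summary statistics barely move, but even a crude ASCII strip chart of windowed averages makes the bump jump out.

```python
# Hypothetical illustration: summary statistics can hide a transient
# error that even a crude plot makes obvious. All numbers are invented.
import math
import random
import statistics

random.seed(1)

# 500 simulated clock-offset samples (milliseconds) of ordinary jitter.
offsets = [random.gauss(0.0, 3.0) for _ in range(500)]

# Add a smooth bump peaking at 8 ms between samples 200 and 300,
# a stand-in for a gradually slewed (smeared) leap second.
for i in range(200, 300):
    offsets[i] += 8.0 * math.sin(math.pi * (i - 200) / 100.0)

# The basic statistics look unremarkable: the bump barely moves them.
print(f"mean = {statistics.mean(offsets):.2f} ms, "
      f"stdev = {statistics.stdev(offsets):.2f} ms")

# A crude ASCII strip chart of 10-sample averages reveals the bump.
for start in range(0, 500, 10):
    avg = statistics.mean(offsets[start:start + 10])
    col = int((avg + 10.0) * 2)  # map roughly [-10, +10] ms to columns 0..40
    print(" " * max(col, 0) + "*")
```

Run it and the mean and standard deviation stay small and plausible-looking, while the column of asterisks visibly swings right through the middle of the chart, exactly the "graph it first" lesson above.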