
How NOT To Measure Time in Applications

By Maksymilian Fryc, Senior Software Engineer


If you have been programming for a while, then at some point you have needed to measure code performance or track execution time. And maybe you've wondered, "Why is this running so slowly?" or "What's taking so long?" It can get frustrating, right?


We've all been there! Fortunately, there are tools tailored to help with this. For simple or temporary code, you most likely won't need complex, fancy tools; a lot of the time, the language of your choice already provides all the necessary functionality. In this article, we'll dive into different methods of measuring time and some nuances that'll help you choose the right solution and avoid a common pitfall.


Time measurement types

At the beginning of my career I sometimes heard, "Don't use this function to measure time, use another one." So I complied, although no one ever explained the difference. After some time had passed, I realized I should research the topic and learn the differences myself. Now I believe this knowledge is worth sharing. Let's delve into the two most common methods of tracking time.


Wall clock

A wall clock, sometimes referred to as "time-of-day," is a straightforward concept. It displays a specific moment in time, much like a regular clock we use in our daily lives. It indicates how many hours have elapsed since either midnight or noon, depending on the type of clock. However, the clock used by the operating system typically measures the time that has passed since a specific date known as the epoch, which can vary depending on the system.


One of the most common epochs is January 1st, 1970 (Unix time). However, other systems, like Windows, use January 1st, 1601. So if we take Wednesday, 26 July 2023 12:00:00 UTC and convert it to Unix time, we get 1690372800, as that's the number of seconds that have passed between these two dates.
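If you want to verify that yourself, here's a quick sanity check using Java's standard java.time API (the trailing "Z" marks the timestamp as UTC):

import java.time.Instant;

Instant moment = Instant.parse("2023-07-26T12:00:00Z");
System.out.println(moment.getEpochSecond()); // prints 1690372800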


This time can be easily adjusted, for example, when you travel between different time zones or when you synchronize the time with an external server. Or, just for fun: when I was young, I attempted to change the system time, hoping it would expedite the return of my in-game hero from an adventure (surprisingly, it didn't work!).

And here comes the pitfall: what if you use such a clock to measure time, but the machine that runs your code unexpectedly, for whatever reason, changes its system time? The clock will jump forwards or backwards and mess up the calculations. The results will be wrong.


In the best-case scenario, you'll immediately know something is off, but in the worst case, you'll base your conclusions on erroneous data, which can lead to all sorts of consequences. Take note: this measuring method is fine, as long as we know what we're doing and what problems can potentially occur.


Monotonic clock

A monotonic clock, sometimes also referred to as a "timer" or "free-running" clock, might seem similar to the previous one, but it operates more like a stopwatch. When we start the measurement and later end it, we only care about these two points and the time that passed between them. It's not related to other dates or system times. If you start the timer now and stop it after 20 minutes, you will have an exact result that remains unaffected by external factors. It's a simple and reliable method for tracking time, and if feasible, you should consider using this approach.


Java Example

Theory is all well and good, but let's put it into practice. On a daily basis I use Java, so I'll also use it for this example. Don't worry if you work with other languages; I'm confident you'll find analogous examples and information on the internet or in documentation. It's worth knowing!


System.currentTimeMillis()

Let's first use the wall-clock approach and the function System.currentTimeMillis():

long startTime = System.currentTimeMillis();
reportService.calculateProfits();
// Elapsed time is the difference between two wall-clock readings
long elapsedMillis = System.currentTimeMillis() - startTime;
log.info("Measurement took {} milliseconds", elapsedMillis);

The code provided saves the current time to the "startTime" variable, performs some report calculations, then computes the difference between the current time and "startTime", and finally logs the elapsed value. While it's a simple and generally reliable approach, even such a straightforward few lines of code can behave unexpectedly if the system time changes mid-measurement. If the system time were to move backward, the result obtained above would be a negative value!
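If you're ever stuck with wall-clock measurements, a minimal defensive sketch is to clamp the result so that a backwards jump of the system time at least can't produce a negative duration (it will still silently under-report):

long startTime = System.currentTimeMillis();
reportService.calculateProfits();
// Clamp to zero so a backwards system-time jump can't yield a negative duration
long elapsedMillis = Math.max(0, System.currentTimeMillis() - startTime);
log.info("Measurement took {} milliseconds", elapsedMillis);

The proper fix, though, is to use a monotonic clock, as shown next.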

System.nanoTime()

And the second example uses the monotonic clock and the function System.nanoTime():

long startTime = System.nanoTime();
reportService.calculateProfits();
// nanoTime() readings are only meaningful relative to each other
long elapsedNanos = System.nanoTime() - startTime;
log.info("Measurement took {} nanoseconds", elapsedNanos);

It's almost exactly the same, but this time the function is based on a monotonic clock, so the final result will always be correct, no matter what happens to the system time in between.
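One small practical note: the raw result is in nanoseconds, which can be awkward to read in logs. Using the standard java.util.concurrent.TimeUnit utility, you can convert it before logging:

// Convert the nanosecond measurement into milliseconds for readability
long elapsedMillis = TimeUnit.NANOSECONDS.toMillis(elapsedNanos);
log.info("Measurement took {} milliseconds", elapsedMillis);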


Technology nuances

I've touched on this point briefly, but it's worth emphasizing that all the explanations and mechanisms mentioned above can vary depending on several factors, such as the programming language, operating system, and the specific software and hardware in use.

A good example of this variability is the previously mentioned System.nanoTime() method. The documentation specifies: "This method provides nanosecond precision, but not necessarily nanosecond resolution." This means the result is expressed in nanoseconds, but the operating system beneath may not support nanosecond granularity for such operations, so the output may not be accurate to the exact nanosecond.
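If you're curious what granularity your platform actually delivers, a small sketch like the one below samples System.nanoTime() in a tight loop and reports the smallest non-zero step it observes (the exact number will vary by OS and hardware):

long previous = System.nanoTime();
long smallestStep = Long.MAX_VALUE;
for (int i = 0; i < 1_000_000; i++) {
    long current = System.nanoTime();
    long delta = current - previous;
    // Track the smallest non-zero increment between consecutive readings
    if (delta > 0 && delta < smallestStep) {
        smallestStep = delta;
    }
    previous = current;
}
System.out.println("Smallest observed tick: " + smallestStep + " ns");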


Clock Drift

When speaking about time and computers, we can't ignore clock drift. This is a common phenomenon where clocks gradually deviate from one another and from real time because they count time at slightly different rates. You may also observe it in a physical watch, which can drift by a few seconds or even minutes every month.


To highlight how impactful even a small change can be, let's look at a real-life example from Saudi Arabia, February 25th, 1991. A software error in a MIM-104 Patriot (a military air defense system) led to its system clock drifting by one-third of a second after one hundred hours of operation. This error resulted in the system's failure to track and intercept an incoming missile, which subsequently struck the Dhahran barracks and cost 28 American lives.


These discrepancies in the digital world can be fixed manually, or automatically with a proper mechanism, the most common probably being the Network Time Protocol.



Network Time Protocol (NTP)

NTP is a widely used protocol designed for clock synchronization, keeping connected devices in sync. I won't get into too many details, but it works on a layered basis, with the layers called strata. At the base (stratum 0) there are high-precision timekeeping devices (like atomic clocks or GPS receivers), which are connected to a group of computers (stratum 1). There may be many successive, analogous layers, and in each of them computers communicate with devices from lower strata to receive accurate time, or with devices in the same stratum for sanity checks and backup.
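To make this less abstract, here's a sketch of asking an NTP server how far off the local clock is. Java's standard library has no NTP client, so this example assumes the Apache Commons Net library (org.apache.commons.net.ntp) is on the classpath; pool.ntp.org is a public pool of NTP servers:

import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

NTPUDPClient client = new NTPUDPClient();
client.setDefaultTimeout(5_000); // don't hang forever on an unreachable server
TimeInfo info = client.getTime(InetAddress.getByName("pool.ntp.org"));
info.computeDetails(); // computes clock offset and round-trip delay
// Offset between the local clock and the server's clock, in milliseconds
System.out.println("Local clock offset: " + info.getOffset() + " ms");
client.close();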


This is a great way to keep your device's time up to date, but as already mentioned, it may lead to sudden system clock changes and, in turn, unexpected code behavior.



Summary

After reading this article, remember the two types of time measurement and the distinction between them; it's advisable to rely on monotonic clocks for measuring durations in most cases. Additionally, be aware of clock drift and the mechanisms available to help keep clocks synchronized.


What are your thoughts about the article? Did you learn something new? Let us know!

