How accurate is window.setInterval()?


For some reason this question came into my head the other day. I decided to build a JavaScript clock to answer it, mainly because that's more fun than looking at tables or graphs of statistics. It's not very scientific, but it does answer my question quite well - I really wanted to know whether the interval would average out over time or not.

There are clocks here - if you can't see them, your browser does not support the canvas element.

In the canvas above there are two clocks: the one on the left is always drawn using the current time returned by new Date(), and the one on the right is calculated by adding the number of milliseconds in the interval to the initial time. To give a smooth frame rate of 25 frames per second, the interval is set to 40 milliseconds - so with each interval, 40 milliseconds is added onto the right-hand clock. (The clock in the background is just a scaled-up version of the real-time clock on the left.)
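
In outline, the setup looks something like the sketch below. This is a simplified version with my own variable names, logging the two times instead of drawing them onto the canvas - it is not the actual demo code.

    var requestedMs = 40;              // 40ms per tick => 25 frames per second
    var startTime = Date.now();        // shared starting point for both clocks
    var intervalClockTime = startTime; // the right-hand clock's idea of "now"

    setInterval(function () {
        var realTime = Date.now();        // left-hand clock: actual current time
        intervalClockTime += requestedMs; // right-hand clock: assumes a perfect 40ms tick
        // The real demo draws both times onto the canvas; logging them is
        // enough to watch the two clocks drift apart.
        console.log(new Date(realTime).toISOString(),
                    new Date(intervalClockTime).toISOString());
    }, requestedMs);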

The numbers beneath the clocks provide a bit more detail. The first number under each clock counts the milliseconds elapsed for that clock. Under the left-hand clock this is the initial time subtracted from the current time. For the right-hand clock, this is the number of intervals elapsed, multiplied by the requested interval size. The second number under the clock on the left is the mean average interval in milliseconds, calculated by dividing the elapsed time by the number of intervals that have elapsed.
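
In code, those three figures amount to something like this (again a sketch using my own names, not the demo's):

    // startTime is when the clocks started, tickCount is how many intervals
    // have fired, requestedMs is the requested interval (40 in the demo)
    function clockStats(startTime, tickCount, requestedMs) {
        var realElapsed = Date.now() - startTime;     // first number, left-hand clock
        var assumedElapsed = tickCount * requestedMs; // first number, right-hand clock
        var meanInterval = realElapsed / tickCount;   // second number, under the left-hand clock
        return {
            realElapsed: realElapsed,
            assumedElapsed: assumedElapsed,
            meanInterval: meanInterval
        };
    }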

The figures I have seen have varied quite widely depending on the system load, browser software and operating system, but I'll list a few examples here just out of interest.

The Windows browser with the average interval nearest to the ideal of 40 milliseconds appears to be Firefox - versions 3.6 and 4.0 both provide an interval that keeps the clocks fairly closely synchronised. IE9 is also very good, but it is the only browser where the average interval drops below 40ms after a while, meaning that the right-hand clock starts to get ahead of the real time instead of falling behind.

Chrome and Chromium provide an interval that averages just over 41 milliseconds, enough of a difference for the right-hand clock to be visibly slower in well under a minute. Safari comes in at just under 41ms, performing better than Chrome, but still not great. I took these readings under Windows XP, but Chrome actually performed worse under Windows 7, where the interval averaged around 46ms.

On the Mac, things are a bit better. Chrome's interval time is below 41ms, but not by much. The OS X version of Safari is better again, averaging around 40.2ms. Firefox performs as well on OS X as on Windows.

Unsurprisingly, if I change the interval to 1000ms then the browsers perform better. Reducing the interval makes things worse, and you soon come up against the browser's minimum interval. Another thing to note is that navigating using the browser back and forward buttons will reload the script on some browsers, but on others it will just pause the interval timer, leaving the right-hand clock behind by however long you were away from the page.
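
To get a feel for where that minimum lies, a quick test along these lines (my own sketch, not part of the clock demo) asks for a 1ms interval and reports the average it actually gets - most browsers clamp repeated timers to a minimum of at least a few milliseconds:

    var requested = 1;       // deliberately lower than any browser minimum
    var start = Date.now();
    var ticks = 0;

    var id = setInterval(function () {
        ticks += 1;
        if (ticks === 500) { // stop after 500 ticks and report the average
            clearInterval(id);
            var elapsed = Date.now() - start;
            console.log('mean interval: ' + (elapsed / ticks).toFixed(2) + 'ms');
        }
    }, requested);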

The only thing here that really surprised me was how well Firefox did - I wasn't expecting any of the browsers to provide very much accuracy. But in any case, the answer to my original question is that setInterval is not much use for measuring time, unless you really don't care about accuracy.
