
How to measure time accurately?

Posted: Tue Sep 24, 2019 11:17 pm
by manseekingknowledge
There is a GitHub issue related to the inability of the ESP8266 to keep accurate time:

https://github.com/micropython/micropython/issues/2724

The issue has been open for nearly 3 years and has an associated pull request, but it appears there is some debate about the integrity of the change.

My testing shows that time.time() is virtually unusable (fell 29 seconds behind in a 10-minute test) and time.ticks_ms() is only slightly better (fell 10 seconds behind in a 10-minute test). How are people getting around this issue? For my specific use case, I don't have Internet access.
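
A test along these lines can be as simple as the following rough sketch (the 60-second print interval is arbitrary; the readings are compared against a stopwatch by hand):

Code:
import time

start_time = time.time()
start_ticks = time.ticks_ms()

while True:
    time.sleep(60)
    # compare both readings against a wall clock or stopwatch by hand
    secs_time = time.time() - start_time
    secs_ticks = time.ticks_diff(time.ticks_ms(), start_ticks) // 1000
    print("time.time():", secs_time, "s   ticks_ms:", secs_ticks, "s")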

Re: How to measure time accurately?

Posted: Wed Sep 25, 2019 5:18 am
by kevinkk525
A more reliable ESP8266 time source would certainly be nice.
I synchronize with an NTP server every few hours, but I don't need very high accuracy.
If you have no internet access and no local NTP server, then this could be a problem.
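
For anyone who does have network access, the resync loop can be as simple as this rough sketch (the 6-hour interval is just an example, and it assumes WiFi is already connected; ntptime ships with the ESP8266 firmware and uses pool.ntp.org by default):

Code:
import time
import ntptime

RESYNC_S = 6 * 3600          # example interval: resync every 6 hours

while True:
    try:
        ntptime.settime()    # sets the internal RTC from the NTP server (UTC)
    except OSError:
        pass                 # no answer this time, try again next round
    time.sleep(RESYNC_S)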

Re: How to measure time accurately?

Posted: Wed Sep 25, 2019 8:18 am
by pythoncoder
Another approach is to use an external RTC like the DS3231. These are cheap, have timepiece accuracy and are easy to use with the above driver.
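
If the driver is not to hand, a bare-bones read over I2C is also possible. A rough sketch (the pin numbers are an assumption for a typical ESP8266 board, and it assumes the DS3231 has already been set and runs in 24-hour mode):

Code:
from machine import I2C, Pin

DS3231_ADDR = 0x68                       # fixed I2C address of the DS3231

def bcd2dec(b):
    return (b >> 4) * 10 + (b & 0x0F)    # registers are packed BCD

def read_ds3231(i2c):
    # registers 0x00..0x06: sec, min, hour, weekday, day, month, year
    raw = i2c.readfrom_mem(DS3231_ADDR, 0x00, 7)
    return (2000 + bcd2dec(raw[6]),      # year
            bcd2dec(raw[5] & 0x1F),      # month (mask the century bit)
            bcd2dec(raw[4]),             # day of month
            bcd2dec(raw[2] & 0x3F),      # hour, 24-hour mode assumed
            bcd2dec(raw[1]),             # minute
            bcd2dec(raw[0] & 0x7F))      # second

i2c = I2C(scl=Pin(5), sda=Pin(4))        # adjust pins; on newer firmware use machine.SoftI2C
print(read_ds3231(i2c))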

Re: How to measure time accurately?

Posted: Wed Sep 25, 2019 1:12 pm
by jomas
manseekingknowledge wrote:
Tue Sep 24, 2019 11:17 pm
and time.ticks_us() is only slightly better (fell 10 seconds behind in a 10 minute test).
Then you made a mistake or your device is broken. time.ticks_us() is very accurate (about 2 seconds per 24 hours or less).
Are you aware that ticks_us() uses a 30-bit counter that overflows after about 17 minutes?

Re: How to measure time accurately?

Posted: Wed Sep 25, 2019 2:00 pm
by manseekingknowledge
jomas wrote:
Wed Sep 25, 2019 1:12 pm
manseekingknowledge wrote:
Tue Sep 24, 2019 11:17 pm
and time.ticks_us() is only slightly better (fell 10 seconds behind in a 10 minute test).
Then you made a mistake or your device is broken. time.ticks_us() is very accurate (about 2 seconds per 24 hours or less).
Are you aware that ticks_us() uses a 30-bit counter that overflows after about 17 minutes?
My mistake was a typo in my post. I meant ticks_ms, not ticks_us. I've corrected the post. Thanks.

I am aware of the rollover. Maybe we could keep track of a multiplier which increments at every rollover?
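
Something like this rough sketch, perhaps (it only works if the function is called more often than about every 8 minutes, i.e. within half of the ~17-minute wrap period of ticks_us):

Code:
import time

last = time.ticks_us()
total_us = 0                 # keeps growing past the 30-bit wrap

def elapsed_us():
    global last, total_us
    now = time.ticks_us()
    total_us += time.ticks_diff(now, last)   # ticks_diff handles the wrap
    last = now
    return total_us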

Re: How to measure time accurately?

Posted: Wed Sep 25, 2019 4:07 pm
by jomas
manseekingknowledge wrote:
Wed Sep 25, 2019 2:00 pm

My mistake was a typo in my post. I meant ticks_ms, not ticks_us. I've corrected the post. Thanks.
But even then you cannot have a mismatch of 10 seconds in 10 minutes, because ticks_ms uses ticks_us.

And yes, if you keep track of the overflows you can use that as your time.
I have made a digital clock that works that way. It syncs (not sets) with an NTP server (a kind of PI controller), which keeps the clock accurate to within a few ms at any time.

If you want to use it that way without internet, that's possible too. You should then use ticks_us and, on every overflow, add or subtract a few µs as a correction. And believe me, the crystal that drives the ticks_us clock is very stable.
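
As a rough sketch of that idea (the correction value is an assumption and has to be calibrated once against a trusted clock for a given board; the function must be called more often than about every 8 minutes so ticks_diff can still handle the wrap):

Code:
import time

WRAP_US = 1 << 30            # ticks_us wraps after 2**30 microseconds
CORRECTION_US = 150          # example value, calibrate for your board

last = time.ticks_us()
elapsed = 0                  # corrected elapsed time in microseconds
into_wrap = 0                # how far we are into the current wrap period

def corrected_us():
    global last, elapsed, into_wrap
    now = time.ticks_us()
    delta = time.ticks_diff(now, last)   # handles the wraparound
    last = now
    elapsed += delta
    into_wrap += delta
    while into_wrap >= WRAP_US:
        into_wrap -= WRAP_US
        elapsed += CORRECTION_US         # apply the per-wrap correction
    return elapsed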

Re: How to measure time accurately?

Posted: Sat Oct 16, 2021 7:09 am
by jad7
Are there any updates on this? Maybe some existing external libraries for virtual timers based on ticks_ms or ticks_us? MicroPython v1.17 has now been released, but the problem still exists.