MostlyHarmless wrote: ↑
Fri Jan 10, 2020 9:28 pm
tve wrote: ↑
Fri Jan 10, 2020 9:06 pm
I'm puzzled by the need to recalibrate every few seconds... Isn't the variation primarily due to temperature fluctuations?
I'm going to test with longer intervals, but keep in mind that we are talking about an R/C oscillator in an environment of unknown cleanliness. Any amount of flux residue on the PCB can wreak havoc on it when an adjacent line is pulled low or high, or when the cat just sneezes next to it.
But I agree, it does sound a bit too aggressive.
I checked, and for now I'm going to keep the 2-second interval for re-calibrating the esp8266 using system_rtc_clock_cali_proc().
The source code for that function is not open. It only appears in the header files and in libmain.a of the SDK, so I cannot verify any of the following by inspecting the code.
From what I could find out, cali_proc uses the XTAL-based high-speed CPU clock to measure the period of the RTC clock source. That period is expressed in microseconds as a fixed-point binary value with 12 fractional bits. In my case the clock source is the internal 150kHz oscillator, so 1,000,000 microseconds divided by 150,000Hz is 6.6667us, times 4096 (shifted 12 bits to the left) is 27,306. The actual return value of system_rtc_clock_cali_proc() on my esp8266 varies between 25,500 and 25,650 within 20-30 seconds, without significant changes in the environment or workload. That means my oscillator actually runs, and jitters wildly, around 160kHz rather than 150kHz.
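To make the fixed-point arithmetic concrete, here is a small sketch in plain Python (the helper name cali_to_hz is mine, not part of any SDK) that converts a calibration value back into an oscillator frequency:

```python
def cali_to_hz(cali):
    """Convert a system_rtc_clock_cali_proc() result (RTC clock
    period in microseconds, fixed point with 12 fractional bits)
    into the oscillator frequency in Hz."""
    period_us = cali / 4096        # drop the 12 fractional bits
    return 1_000_000 / period_us

# Nominal 150kHz source: cali_to_hz(27306) is roughly 150,000 Hz.
# The values I see (25,500 to 25,650) correspond to roughly
# 159.7kHz to 160.6kHz, i.e. a spread of about 0.6%.
```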
The esp8266 doesn't really have a full RTC, at least not one that keeps date and time; MicroPython simulates one. There is a hardware counter that is incremented automatically on each pulse of the clock source, so it ticks 150,000 times per second (well, 160,000 times in my case). The current counter value is retrieved with system_get_rtc_time(). This function (also hidden in the binary blob libmain.a) only returns a uint32, so the counter overflows a little less than every 8 hours (this math matches the comments in ports/esp8266/machine_rtc.c).

To internally calculate the number of microseconds since the epoch, the MicroPython RTC simulation stores a "delta" and the current "calibration" in RTC memory. "delta" is initially computed when RTC.datetime((time-tuple)) is called to set the time. When RTC.datetime() is later called to get the current time, MicroPython calculates the number of microseconds since 2000-01-01 by multiplying system_get_rtc_time() by the calibration value and adding the delta. It also has logic to deal with the overflow: when it detects that the RTC counter has gone backwards, it moves delta forward by 2^32 times the calibration period in microseconds.
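The bookkeeping described above boils down to a few lines of arithmetic. This is an illustrative plain-Python sketch, not the actual C code from ports/esp8266/machine_rtc.c (the function names are mine):

```python
CALI_SHIFT = 12  # the calibration value carries 12 fractional bits

def us_since_2000(rtc_counter, cali, delta):
    # ticks * period_us (fixed point, >> 12 to rescale) + delta
    return ((rtc_counter * cali) >> CALI_SHIFT) + delta

def advance_delta_on_overflow(prev_counter, counter, delta, cali):
    # The counter is a uint32; when it appears to go backwards it
    # has wrapped, so move delta forward by 2**32 ticks worth of
    # microseconds.
    if counter < prev_counter:
        delta += (2**32 * cali) >> CALI_SHIFT
    return delta
```

With the nominal calibration of 27,306 the wrap correction works out to about 28,632 seconds, i.e. just under 8 hours, which matches the overflow period mentioned above.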
Doing the re-calibration every 2 seconds seems to average out the general jitter well enough that the clock "only" wanders off by up to 50ms between NTP server polls. That easily doubles (or worse) with 10-second re-calibration intervals, and that is while using a local, low-latency NTP server.
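A back-of-envelope way to see why frequent re-calibration matters: if the stored calibration sits at one end of the observed spread while the oscillator has drifted to the other end, the clock error grows in proportion to that spread. A hypothetical helper (my own, for illustration only):

```python
def worst_case_drift_ms(cali_low, cali_high, interval_s):
    # Relative spread between two calibration readings, applied
    # over an interval, gives the worst-case accumulated error.
    spread = (cali_high - cali_low) / cali_high
    return spread * interval_s * 1000

# With the 25,500 to 25,650 spread I observe, a stale calibration
# could accumulate on the order of 350ms of error per minute,
# which is why averaging via frequent re-calibration helps.
```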
This ntpclient implementation for the esp8266 is certainly "better" than running ntptime every 15-20 minutes, especially since it keeps the clock within known error bounds and, most importantly, monotonic. The results are definitely not as good as what an esp32 can achieve, but you get what you pay for.