You need to use ticks_diff() to measure relative times: ticks_ms() increases steadily from the time of the last boot and will eventually wrap around, so plain subtraction isn't safe. Further, it's inadvisable to print in an interrupt callback: it's best to use micropython.schedule(). The following script runs for 15 seconds and records the maximum and minimum intervals between callbacks. Running on the official build, times varied from 69ms to 71ms. I saw similar results with a virtual timer.
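To see why ticks_diff() matters, here is a plain-Python model of the documented wraparound semantics (this is a sketch, not the real MicroPython implementation; TICKS_PERIOD is implementation-dependent, and 2**30 is an assumed value for illustration):

```python
# CPython model of MicroPython's wrapping tick counter.
# TICKS_PERIOD is an assumption for illustration only.
TICKS_PERIOD = 2 ** 30

def ticks_add(ticks, delta):
    # Advance a tick value, wrapping modulo TICKS_PERIOD.
    return (ticks + delta) % TICKS_PERIOD

def ticks_diff(new, old):
    # Signed difference new - old, correct even across a wrap,
    # provided the real interval is under TICKS_PERIOD // 2.
    return ((new - old + TICKS_PERIOD // 2) % TICKS_PERIOD) - TICKS_PERIOD // 2

# Plain subtraction gives nonsense when the counter wraps...
old = TICKS_PERIOD - 10   # 10 ticks before the wrap
new = ticks_add(old, 70)  # 70ms later the counter has wrapped to 60
print(new - old)          # -> huge negative number
print(ticks_diff(new, old))  # -> 70, the true interval
```

The modular trick recovers the true signed interval on either side of the wrap, which is why the scripts below use ticks_diff() rather than subtracting two ticks_ms() readings directly.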
Code: Select all
from machine import Timer
from utime import ticks_ms, ticks_diff, sleep_ms
from micropython import schedule

timer = Timer(1)
lastrun = None
maxtime = 0
mintime = 1000

def timer_cb(timer):
    global lastrun
    first_run = lastrun is None
    if not first_run:
        delta = ticks_diff(ticks_ms(), lastrun)
    lastrun = ticks_ms()
    if not first_run:
        schedule(printme, delta)

def printme(delta):
    global maxtime, mintime
    print("timer_cb", delta)
    maxtime = max(maxtime, delta)
    mintime = min(mintime, delta)

def start():
    timer.init(period=70, mode=Timer.PERIODIC, callback=timer_cb)
    count = 0
    while count <= 15000:
        sleep_ms(1)
        if not (count == 0 or count % 5000):
            print(maxtime, mintime, ticks_ms())
        count += 1
    timer.deinit()

start()
start()
The last few results:
Code: Select all
timer_cb 71
timer_cb 70
timer_cb 69
timer_cb 71
timer_cb 70
timer_cb 69
timer_cb 70
timer_cb 70
timer_cb 70
timer_cb 70
71 69 275935
>>>
If you run it, note the behaviour of ticks_ms(). If the delay in the main loop is increased to 10ms, the variability increases, with the minimum time going down to 61ms. I suspect that the timer interrupt won't pre-empt a sleep(), but I would have thought that this would affect maximum as well as minimum times.
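The symmetry I expected can be shown with a toy model (plain Python, not MicroPython; PERIOD and LAG are illustrative values). If the timer fires exactly on schedule but one callback's ticks_ms() reading happens late, you get one long delta immediately followed by one correspondingly short one:

```python
# Toy model: a timer fires exactly every PERIOD ms, but the ticks_ms()
# reading in one callback is delayed by LAG ms.
PERIOD = 70
LAG = 9

fire_times = [n * PERIOD for n in range(6)]   # ideal firing instants
read_times = list(fire_times)
read_times[3] += LAG                          # one reading happens late

# Intervals as the callback would compute them from successive readings.
deltas = [b - a for a, b in zip(read_times, read_times[1:])]
print(deltas)  # -> [70, 70, 79, 61, 70]
```

The late reading lengthens one interval by LAG and shortens the next by the same amount, so a 61ms minimum "should" come paired with a maximum around 79ms. That is exactly the symmetry missing from my measurements.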
[EDIT]
The more I thought about this result, the more puzzled I became. How can you get a minimum of 61ms without seeing a maximum of ~81ms? So I ran this script (period changed to 100ms for ease of mental arithmetic):
Code: Select all
from machine import Timer
from utime import ticks_ms, ticks_diff, sleep_ms
from micropython import schedule

timer = Timer(1)
lastrun = None
maxtime = 0
mintime = 1000

def timer_cb(timer):
    global lastrun, maxtime, mintime
    first_run = lastrun is None
    if not first_run:
        delta = ticks_diff(ticks_ms(), lastrun)
        maxtime = max(maxtime, delta)
        mintime = min(mintime, delta)
    lastrun = ticks_ms()
    if not first_run:
        schedule(printme, delta)

def printme(delta):
    print("timer_cb", delta, lastrun)

def start():
    timer.init(period=100, mode=Timer.PERIODIC, callback=timer_cb)
    count = 0
    while count <= 300:
        sleep_ms(50)
        if not (count == 0 or count % 100):
            print(maxtime, mintime, ticks_ms())
        count += 1
    timer.deinit()

start()
The outcome was a max of 102ms and a min of 89ms. Looking at the absolute times around the point where an 89ms interval occurred, I saw:
Code: Select all
timer_cb 102 389452
timer_cb 102 389554
timer_cb 102 389656
timer_cb 89 389746
timer_cb 99 389846
Note that the min is 89, not 49...
There are two possibilities: either the interrupt (and all subsequent ones) occurs early, or sleep_ms() is corrupting ticks_ms(). The former would really take some explaining, so I favour the latter explanation, though I currently haven't figured out a way to prove it.
Does anyone feel like pitching in with some ideas?
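One idea I've been toying with (a sketch, not a verified test): cross-check ticks_ms() against the independent ticks_us() counter across many sleep_ms() calls. If sleep_ms() corrupted only ticks_ms(), the two clocks would drift apart. The try/except shim is an assumption that lets the same sketch run under CPython for a sanity check (where both "clocks" derive from time.monotonic_ns(), so drift should be ~0):

```python
# Cross-check ticks_ms() against ticks_us() across repeated sleeps.
try:
    from utime import ticks_ms, ticks_us, ticks_diff, sleep_ms
except ImportError:  # CPython fallback for desktop testing
    import time
    def ticks_ms():
        return time.monotonic_ns() // 1_000_000
    def ticks_us():
        return time.monotonic_ns() // 1_000
    def ticks_diff(new, old):
        return new - old
    def sleep_ms(n):
        time.sleep(n / 1000)

start_ms, start_us = ticks_ms(), ticks_us()
for _ in range(100):
    sleep_ms(10)
elapsed_ms = ticks_diff(ticks_ms(), start_ms)
elapsed_us = ticks_diff(ticks_us(), start_us)
drift = elapsed_ms - elapsed_us // 1000
print(elapsed_ms, elapsed_us // 1000, drift)  # drift should stay near 0
```

On real hardware, a consistently nonzero drift that grows with the number of sleep_ms() calls would point at ticks_ms() being disturbed; no drift would point back at the interrupts themselves. Does anyone see a flaw in that approach, or have a better one?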