pyb.delay accuracy

General discussions and questions about development of code with MicroPython that is not hardware specific.
Target audience: MicroPython Users.
guanabana
Posts: 15
Joined: Mon Jan 04, 2016 8:45 pm

pyb.delay accuracy

Post by guanabana » Sat Feb 13, 2016 5:58 pm

Hi, has anyone noticed that pyb.delay is wildly inaccurate the first time it is invoked? I am trying to send SPI messages separated by a fixed delay of 1ms, but I am measuring that the first and second SPI messages can be anywhere from 60us to 900us apart. From the second message on, the separation is 1ms consistently. See code below.

My workaround is to insert a sacrificial delay before the start of my actual timing-critical block, but I am wondering why this would be necessary.

Thanks,
Steve

Code:

import pyb
from pyb import SPI
spi = SPI(1, SPI.MASTER, baudrate=10000000, polarity=0, phase=1)    # ~ 10.50 MHz

data = bytearray(4)
data[1] = 0xa1

data2 = bytearray(4)
data2[1] = 0xa2

data3 = bytearray(4)
data3[1] = 0xa3

data4 = bytearray(4)
data4[1] = 0xa4

DELAY = 1

#pyb.delay(DELAY)	# sacrificial delay here

spi.send(data)
pyb.delay(DELAY)

spi.send(data2)
pyb.delay(DELAY)

spi.send(data3)
pyb.delay(DELAY)

spi.send(data4)
pyb.delay(DELAY)

guanabana
Posts: 15
Joined: Mon Jan 04, 2016 8:45 pm

Re: pyb.delay accuracy

Post by guanabana » Sat Feb 13, 2016 6:14 pm

This is on the pyboard with this version of MicroPython, btw:

MicroPython v1.5.1-66-g66b9682 on 2015-12-04; PYBv1.1 with STM32F405RG

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada

Re: pyb.delay accuracy

Post by dhylands » Sat Feb 13, 2016 6:24 pm

pyb.delay uses the millisecond interrupt handler to determine how long to delay.

When you call it (every time, not just the first time), you can get up to 1 millisecond of variation depending on where you are (in time) relative to the tick interrupt.

Once you've called pyb.delay once, you "align" yourself to the interrupt, and if you do a consistent amount of work before calling it the next time, you'll get a consistent variation.
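
A minimal sketch of how you could see that jitter, assuming a pyboard where pyb.micros() and pyb.elapsed_micros() are available:

Code:

import pyb

# Time a few 1 ms delays back to back. The first reading can come out well
# under 1000 us depending on where in the current tick the call lands;
# once aligned to the tick, the readings settle to a consistent value.
for i in range(5):
    start = pyb.micros()
    pyb.delay(1)
    print(i, pyb.elapsed_micros(start))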

If you use pyb.udelay instead, it watches a microsecond counter rather than the tick interrupt: it actually waits for the appropriate number of microseconds to go by. Using udelay is probably accurate to within about 10 microseconds (just a guess on my part, based roughly on how long a bytecode takes to execute). Using delay will only be accurate to within 1 millisecond.
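
Applied to the original example, a udelay-based sketch would look something like this (same SPI setup as above; the 1000 us figure just mirrors the 1 ms spacing the original code wanted):

Code:

import pyb
from pyb import SPI

spi = SPI(1, SPI.MASTER, baudrate=10000000, polarity=0, phase=1)

data = bytearray(4)
data[1] = 0xa1
data2 = bytearray(4)
data2[1] = 0xa2

spi.send(data)
pyb.udelay(1000)   # busy-wait ~1000 us, independent of the ms tick interrupt
spi.send(data2)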
