reducing memory

The official pyboard running MicroPython.
This is the reference design and main target board for MicroPython.
pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Re: reducing memory

Post by pythoncoder » Mon Apr 24, 2017 6:52 am

So, to answer the question from @JimTal001, the key is to use the array module.

An array of 16*288 floats will use a little over 16*288*4 = 18,432 bytes. An array of 32 bit integers will use (give or take a few bytes) the same amount of storage. To save RAM the solution is to use an array of half words or even bytes (a bytearray). Whether this is feasible depends on the range of data values and the precision required.

There are other tricks which can be employed, depending on the nature of the data. These only work if the data is non-random, and they trade reduced RAM for increased code complexity. For example, slowly changing data can be stored in half words or bytes as deltas from the previous sample. The other obvious (but slow) option is to store the data on a disk device, i.e. an SD card.
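To make the delta idea concrete, here is a minimal sketch. The helper names and the assumption that consecutive readings never differ by more than 127 counts are mine, not from the thread:

```python
from array import array

def encode_deltas(samples):
    # Keep the first sample in full, then store each step as a
    # signed byte ('b', 1 byte, range -128..127).
    deltas = array('b', [0] * (len(samples) - 1))
    for i in range(1, len(samples)):
        d = samples[i] - samples[i - 1]
        if not -128 <= d <= 127:
            raise ValueError('delta too large for a signed byte')
        deltas[i - 1] = d
    return samples[0], deltas

def decode_deltas(first, deltas):
    # Rebuild the original samples by accumulating the deltas.
    out = [first]
    for d in deltas:
        out.append(out[-1] + d)
    return out

readings = [215, 216, 218, 217, 214]   # e.g. tenths of a degree
first, deltas = encode_deltas(readings)
print(decode_deltas(first, deltas) == readings)  # True
```

After the first sample, each reading costs one byte instead of four, at the price of an encode/decode step and a hard limit on how fast the data may change.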
Peter Hinch
Index to my micropython libraries.

JimTal001
Posts: 176
Joined: Thu Jul 30, 2015 4:59 pm

Re: reducing memory

Post by JimTal001 » Mon Apr 24, 2017 9:41 pm

Thanks everyone for your assistance here.

Currently, I'm using lists. This is the way I initialize them:

Code: Select all

data = [[0]*lineCnt for x in range(depthCnt)]
where lineCnt is typically 288 (5-minute readings over 24 hours)
and depthCnt is 7.

The values are temperatures in deg C. One decimal place of precision is sufficient, but there will be negative values. The temperature range is -20 to 50 deg C.

I am loading the list as I read through each line (note: data is the list initialized above, and lineNum is a running row index, separate from lineCnt, the preallocated size):

Code: Select all

lineNum = 0
...
# read a line from the file
# parse the line into 7 floating point values stored in args[]
for x in range(depthCnt):
    data[x][lineNum] = float(args[x])

lineNum += 1
...

What would you suggest in this case?

Roberthh
Posts: 3667
Joined: Sat May 09, 2015 4:13 pm
Location: Rhineland, Europe

Re: reducing memory

Post by Roberthh » Tue Apr 25, 2017 5:11 am

I think the conclusion was to use the array module to preallocate the data, because then it stays packed. The code for a single block looks quite similar:

Code: Select all

from array import array
lineCnt = 288
depthCnt = 16
data = [array("f", [0.0] * lineCnt) for _ in range(depthCnt)]

And then you can go ahead as before. If possible, preallocate the arrays early in your code. The total space allocated is about 20 k, which should not be a problem on a pyboard. If your code is large and the import fails with a memory error, you could pre-compile it (using mpy-cross) or put it into flash.
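Since the range is -20 to 50 deg C at 0.1 deg resolution, the readings also fit a signed half word if you store tenths of a degree, halving the RAM compared with "f" floats. A minimal sketch (the store/fetch helper names are mine; depthCnt follows the 7 depths mentioned above):

```python
from array import array

lineCnt = 288   # 5-minute readings over 24 hours
depthCnt = 7

# Tenths of a degree: -20.0..50.0 degC becomes -200..500, which fits
# comfortably in a signed half word ('h', 2 bytes per element).
data = [array('h', [0] * lineCnt) for _ in range(depthCnt)]

def store(depth, line, celsius):
    data[depth][line] = round(celsius * 10)

def fetch(depth, line):
    return data[depth][line] / 10

store(0, 0, -19.7)
store(0, 1, 50.0)
print(fetch(0, 0), fetch(0, 1))  # -19.7 50.0
```

The scaling and unscaling live in two small helpers, so the rest of the code can keep working in plain degrees.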
