Pre-Compiled files using more memory?

Questions and discussion about The WiPy 1.0 board and CC3200 boards.
Target audience: Users with a WiPy 1.0 or CC3200 board.
tallyboy91
Posts: 7
Joined: Sun Aug 28, 2016 9:21 pm

Pre-Compiled files using more memory?

Post by tallyboy91 » Sun Aug 28, 2016 9:30 pm

Yesterday I ran into memory allocation failed issues.

My understanding is that pre-compiling the modules would help alleviate those issues. So, I made the modifications to the code as mentioned here (viewtopic.php?t=1962), compiled and flashed the WiPy successfully.

I compiled several modules using mpy-cross, uploaded to WiPy and restarted.

I'm now seeing that there is much less memory available using gc.mem_free(). Even after running a GC there is less memory than before I pre-compiled the modules.

Am I missing something?

Thanks,

aw

Roberthh
Posts: 3667
Joined: Sat May 09, 2015 4:13 pm
Location: Rhineland, Europe

Re: Pre-Compiled files using more memory?

Post by Roberthh » Mon Aug 29, 2016 5:42 am

Hello @tallyboy91. For the WiPy you only need this change to mpconfigport.h:

Code:

#define MICROPY_PERSISTENT_CODE_LOAD (1)
For RAM usage please note, that at runtime the precompiled modules use the same amount of RAM than the straight ones. It just saves RAM during compile time and allows to compile much larger modules. If you tell that you now have much less RAM, it could be caused by now loading a more recent version of MicroPYthon with more features. On WIPy, all code is run from RAM, so the newly added code also requires some RAM. On my system (V 1.8.1) after boot w/o any modules loaded, there is about 50k of RAM available. What are your figures?

tallyboy91
Posts: 7
Joined: Sun Aug 28, 2016 9:21 pm

Re: Pre-Compiled files using more memory?

Post by tallyboy91 » Mon Aug 29, 2016 11:46 pm

It's been one of those days... I wanted to get some concrete numbers on the differences and ran into problems where importing a module would only give me module.__name__. Then I wasn't paying attention and blew away my main.py file, without a backup of course :evil:

Anyway, to answer your question, I have about 50K of free memory using 1.8.3 when I have no extra modules loaded.

I ran a test of loading regular modules with the normal 1.8.3 binary and then compared it to using only .mpy modules. To your point, there was a negligible difference in memory usage.

What I did finally figure out (and maybe this will help other people) is that I had been running out of space when flashing/uploading files. In my hurry (and ignorance), I didn't notice that the uploaded modules were just blank files. And they imported without any errors, so I didn't realize what was happening. Lesson learned the hard way.

Admittedly, I'm going to have to change my mindset to focus on memory usage above just about everything else. A bit of a paradigm shift for me.

On a somewhat related note... if I put all my modules/files on a MicroSD card, can I bump up the memory available for running/compiling Python code? I'm assuming I could change this somewhere in the code, but have no clue where to begin. Plus my C "skills" are laughable.


Thanks!

aw

Roberthh
Posts: 3667
Joined: Sat May 09, 2015 4:13 pm
Location: Rhineland, Europe

Re: Pre-Compiled files using more memory?

Post by Roberthh » Tue Aug 30, 2016 10:38 am

Sorry, I did not see your reply earlier. As far as I understand, putting your files & code on the SD card would not help. The CC3200 has no built-in flash. All code is loaded from external flash into RAM and executed there, and there is no paging mechanism like the ESP chip has that would allow loading only the segments of code that are needed. Therefore RAM stays the limiting factor, and your code must fit into RAM at runtime. Pre-compiling avoids running out of memory at compile time, which is a different matter.

If you compile on board, you are limited to about 400 lines of Python code. If pre-compiling, about 1500 lines of code should be OK, not considering the required data space. And pre-compiling will tell you about syntax errors immediately.

tallyboy91
Posts: 7
Joined: Sun Aug 28, 2016 9:21 pm

Re: Pre-Compiled files using more memory?

Post by tallyboy91 » Tue Aug 30, 2016 1:02 pm

@Roberthh

Thanks! This is great information and completely makes sense.

Currently, I'm using code for weather-related sensors that has been ported over from the Adafruit Python code. It seems to me that the way to go is to first make sure that code works correctly, then "optimize" it by removing any unneeded code as well as combining methods like read_raw_temp/read_temperature into a single method.

I'm also assuming that replacing constants such as BME280_REGISTER_CONTROL_HUM = 0xF2 with just 0xF2 in the code will also save heap space? Admittedly, it certainly makes the code harder to read, but I could always use the original code as a reference.

Are there any other tips/tricks on conserving memory?


Thanks!

aw

Roberthh
Posts: 3667
Joined: Sat May 09, 2015 4:13 pm
Location: Rhineland, Europe

Re: Pre-Compiled files using more memory?

Post by Roberthh » Tue Aug 30, 2016 7:00 pm

Using constants instead of variables saves memory. MicroPython has this syntax:

Code:

BME280_REGISTER_CONTROL_HUM = const(0xF2)
which according to the spec does the same while keeping readability. But there was a thread somewhere which objected to that claim, so it must be tested. To save memory and time at runtime, you should also not reallocate buffers over and over again. So instead of
data = read(....)
you should write
data = bytearray(n)
readinto(data, ...)
or the like, just to illustrate the principle. (Note the buffer must be a mutable bytearray, not bytes, for readinto to work.) This has been discussed in several places.
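The buffer-reuse pattern described above can be sketched like this; io.BytesIO stands in for any stream-like source (a sensor driver, file, or socket) that offers readinto():

```python
import io

BUF_SIZE = 32
buf = bytearray(BUF_SIZE)  # allocated once, at import time

def read_sample(stream):
    """Read into the shared buffer instead of allocating a new object per read."""
    n = stream.readinto(buf)       # fills buf in place, returns the byte count
    return memoryview(buf)[:n]     # zero-copy view of the valid part

stream = io.BytesIO(b"\x01\x02\x03\x04")
sample = read_sample(stream)
print(bytes(sample))  # b'\x01\x02\x03\x04'
```

Because buf and the memoryview slice avoid fresh allocations on every read, the heap does not fragment the way it does when each call returns a new bytes object.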
As long as you use mpy-cross, the source code style has no effect on memory usage. If you compile on-board, shorter source files are less likely to run out of memory during compile & load, but the exact rules are hard to define.

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Re: Pre-Compiled files using more memory?

Post by pythoncoder » Wed Aug 31, 2016 6:39 am

@Roberthh Not quite, I'm afraid. The const() statement works as follows. If you write

Code:

MYCONST = const(5)
def foo():
	a = MYCONST
the compiler will compile a = 5. In other words, it will be faster and the code will be smaller because it won't perform a read of the MYCONST global. However, the global itself will still use RAM, because another module might import MYCONST. To save RAM you need to write

Code:

_MYCONST = const(5)
def foo():
	a = _MYCONST
In this case the global is unavailable for import, so MicroPython performs an additional optimisation and saves 4 bytes of RAM.
Peter Hinch
Index to my micropython libraries.

Roberthh
Posts: 3667
Joined: Sat May 09, 2015 4:13 pm
Location: Rhineland, Europe

Re: Pre-Compiled files using more memory?

Post by Roberthh » Wed Aug 31, 2016 6:49 am

Aah, now I remember. That was the discussion about the leading underscore and whether its use is consistent in the Python space.

tallyboy91
Posts: 7
Joined: Sun Aug 28, 2016 9:21 pm

Re: Pre-Compiled files using more memory?

Post by tallyboy91 » Wed Aug 31, 2016 1:21 pm

Great information! I need to start compiling a list of these kinds of tips/optimizations.

danielm
Posts: 167
Joined: Mon Oct 05, 2015 12:24 pm

Re: Pre-Compiled files using more memory?

Post by danielm » Wed Oct 05, 2016 5:31 pm

So, is there already some list of tips/optimizations? :)
