MemoryError on importing large files

Discussion about programs, libraries and tools that work with MicroPython. Mostly these are provided by a third party.
Target audience: All users and developers of MicroPython.
chrizztus
Posts: 16
Joined: Thu Feb 23, 2017 3:59 pm

MemoryError on importing large files

Post by chrizztus » Mon May 20, 2019 12:33 pm

Hi community,
for a robotic project I built a Python framework consisting of several files. The files implement the robot's behaviour, mainly line following and colour sensing. When I upload these files and run the code, everything works fine.
Since the framework is a work in progress, we need to provide updates every once in a while. So we wrote a script that concatenates all Python files into one large file and removes redundant imports. This time the import fails with a MemoryError:

Code: Select all

>>> from mybot import *
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError:
or even

Code: Select all

>>> from mybot import MyBot
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError: memory allocation failed, allocating %u bytes
The file is 17 kB and MicroPython is running on an nRF52 microcontroller.
Can somebody please explain to me why memory allocation fails for one large file but not for several small ones with the same content?

Kind regards
Christian

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada
Contact:

Re: MemoryError on importing large files

Post by dhylands » Mon May 20, 2019 1:04 pm

It's because the file needs to be compiled into bytecode at import time, and compiling one large file needs more RAM at once than compiling several smaller files one at a time.

You could also try pre-compiling the large Python file into a .mpy file (using mpy-cross) and uploading the .mpy file.
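A minimal sketch of that workflow on the host PC (filename taken from this thread; mpy-cross has to be built from the MicroPython source tree first, e.g. from micropython/mpy-cross):

```shell
# Cross-compile mybot.py to bytecode on the host, then upload the
# resulting mybot.mpy to the board instead of the .py source.
if command -v mpy-cross >/dev/null 2>&1; then
    mpy-cross mybot.py      # writes mybot.mpy next to the source file
else
    echo "mpy-cross not found; build it from micropython/mpy-cross first"
fi
```

The import on the device stays unchanged (from mybot import MyBot); MicroPython picks up the .mpy file automatically, and since the bytecode is already compiled, the compiler never runs on the board.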

jimmo
Posts: 2754
Joined: Tue Aug 08, 2017 1:57 am
Location: Sydney, Australia
Contact:

Re: MemoryError on importing large files

Post by jimmo » Mon May 20, 2019 1:06 pm

And if it's an option for you, try freezing the rarely-changing modules into a firmware build. (By putting the code in boards/<BOARD>/modules). This means that they'll execute from ROM, saving even more RAM for your main program.
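As a sketch, assuming a standard MicroPython checkout (the port directory and BOARD name below are examples; adjust them for your nRF52 target):

```shell
# Freeze mybot.py into the firmware image: copy it into the board's
# frozen-modules directory and rebuild (paths and BOARD are examples).
PORT_DIR=micropython/ports/nrf
BOARD=pca10040
if [ -d "$PORT_DIR/boards/$BOARD/modules" ]; then
    cp mybot.py "$PORT_DIR/boards/$BOARD/modules/"
    make -C "$PORT_DIR" BOARD="$BOARD"
else
    echo "clone micropython and adjust PORT_DIR/BOARD first"
fi
```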

chrizztus
Posts: 16
Joined: Thu Feb 23, 2017 3:59 pm

Re: MemoryError on importing large files

Post by chrizztus » Mon May 20, 2019 2:09 pm

dhylands wrote:
Mon May 20, 2019 1:04 pm
It's because the file needs to be compiled into bytecode at import time, and compiling one large file needs more RAM at once than compiling several smaller files one at a time.

You could also try pre-compiling the large Python file into a .mpy file (using mpy-cross) and uploading the .mpy file.
I understand. With pre-compiling, everything works as expected, and according to gc.mem_free() I additionally saved 1300 bytes of RAM :idea:
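For reference, a guarded sketch of that measurement (gc.mem_free() exists only on MicroPython, so the hasattr check keeps the snippet runnable elsewhere; mybot is the module from this thread):

```python
# Measure roughly how much heap RAM an import costs, via gc.mem_free().
# mem_free() is MicroPython-only; the guard skips it under CPython.
import gc

gc.collect()
if hasattr(gc, "mem_free"):
    before = gc.mem_free()
    from mybot import MyBot  # the thread's module; replace with yours
    gc.collect()
    print("RAM used by import:", before - gc.mem_free(), "bytes")
else:
    print("gc.mem_free() not available (not running MicroPython)")
```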

chrizztus
Posts: 16
Joined: Thu Feb 23, 2017 3:59 pm

Re: MemoryError on importing large files

Post by chrizztus » Mon May 20, 2019 2:14 pm

jimmo wrote:
Mon May 20, 2019 1:06 pm
And if it's an option for you, try freezing the rarely-changing modules into a firmware build. (By putting the code in boards/<BOARD>/modules). This means that they'll execute from ROM, saving even more RAM for your main program.
Thanks jimmo. We usually try to freeze Python modules, especially our own, which we can release with a firmware update. But since this is a project for a customer, all modules are required to stay updateable.
Regards
Christian
