OSError: [Errno 5]

General discussions and questions about development of code with MicroPython that is not hardware specific.
Target audience: MicroPython Users.
nelfata
Posts: 74
Joined: Wed Apr 30, 2014 10:50 pm

Re: OSError: [Errno 5]

Post by nelfata » Wed Oct 15, 2014 5:25 am

So it must be memory related.
I have modules that are loaded in main, and after all the modules are imported I run the SDIO test and the test I pasted earlier.
It fails with a 2048-byte block size. If I disable the imports, then I start seeing what you are seeing.

Here is the output of pyb.info():
qstr:
n_pool=6
n_qstr=727
n_str_data_bytes=12000
n_total_bytes=17136
GC:
101760 total
73792 : 27968
1=887 2=597 m=161
LFS free: 92160 bytes


There are 27968 bytes of free memory; that should be sufficient for reading a file with a 2048-byte block size.

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada

Re: OSError: [Errno 5]

Post by dhylands » Wed Oct 15, 2014 7:15 am

Right - I've been nailed by this as well (switching from Python 2 to Python 3).

Code: Select all

>>> b'' == ''
False
When I changed if a == '' to if a == b'' (or if len(a) == 0), it exits the loop.

Files opened in binary mode return byte strings (hence the b prefix), whereas files opened in text mode return unicode strings.
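
To make that concrete, here's a minimal sketch (the file name is only an example): a binary-mode read comes back as bytes, so comparing it against '' is always False, and the end-of-file check has to compare against b'' or test the length.

Code: Select all

# Minimal sketch -- '/sd/data.bin' is just an example path.
f = open('/sd/data.bin', 'rb')   # binary mode: read() returns bytes
chunk = f.read(2048)
print(chunk == '')               # always False: bytes never equal a str
print(chunk == b'')              # True only at end of file
print(len(chunk) == 0)           # equivalent end-of-file test
f.close()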

nelfata
Posts: 74
Joined: Wed Apr 30, 2014 10:50 pm

Re: OSError: [Errno 5]

Post by nelfata » Wed Oct 15, 2014 12:47 pm

At least we are both seeing the same results :)

Can you please tell me why, with 27K of memory left, reading with a 2048-byte block size fails?
Does the underlying read I/O use that much RAM?

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada

Re: OSError: [Errno 5]

Post by dhylands » Wed Oct 15, 2014 2:39 pm

You could be running into heap fragmentation.

Here's a contrived case that demonstrates that:

Code: Select all

import gc

def read_file():
    l = 0
    print("About to open file")
    aFile = open('/sd/data.bin', 'rb')
    print("After opening file")
    while True:
        try:
            a = aFile.read(2048)
            if a == b'':
                print('done')
                break
        except OSError:
            print('OSError')
            break
        except:
            print('Some error')
            break
        l = l + len(a)
        print(l)
    aFile.close()

data = []

def consume_memory():
    global data
    read_file()
    # Fill the heap with 1 KiB buffers until allocation fails...
    try:
        for i in range(128):
            data.append(bytearray(1024))
    except MemoryError:
        # ...then free every other buffer, leaving the heap full of
        # 1 KiB holes, none of them big enough for a 2048-byte read.
        for i in range(1, len(data), 2):
            data[i] = None
    print("mem_free  = {}".format(gc.mem_free()))
    print("mem_alloc = {}".format(gc.mem_alloc()))
    gc.collect()
    print("mem_free  = {}".format(gc.mem_free()))
    print("mem_alloc = {}".format(gc.mem_alloc()))
    read_file()

consume_memory()
which, for me, produces:

Code: Select all

>>> import cup
About to open file
After opening file
2048
4096
6144
8192
10240
12288
14336
16384
18432
20480
22528
24576
25517
done
mem_free  = 2048
mem_alloc = 100240
mem_free  = 49904
mem_alloc = 52384
About to open file
After opening file
Some error
So even though there is 49K free, there isn't any chunk big enough to hold 2048 bytes.

nelfata
Posts: 74
Joined: Wed Apr 30, 2014 10:50 pm

Re: OSError: [Errno 5]

Post by nelfata » Wed Oct 15, 2014 2:48 pm

Oh, interesting. I thought gc.collect() performs some defragmentation.
If not, what would you recommend in this case?

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada

Re: OSError: [Errno 5]

Post by dhylands » Wed Oct 15, 2014 2:52 pm

I tried to preallocate a buffer and use readinto, but readinto doesn't seem to be supported, so I'm going to open an issue on GitHub.
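
The idea would look something like this - just a sketch, since it assumes readinto() is available on file objects (which is exactly what seems to be missing at the moment), and read_whole_file is only an illustrative helper name:

Code: Select all

# Sketch of the pre-allocated-buffer approach, assuming file.readinto()
# is supported (it isn't at the time of this thread, hence the issue).
buf = bytearray(2048)                 # allocated once, up front

def read_whole_file(path):            # hypothetical helper
    total = 0
    with open(path, 'rb') as f:
        while True:
            n = f.readinto(buf)       # fills buf in place, returns bytes read
            if not n:                 # 0 means end of file
                break
            total += n                # buf[:n] holds the data just read
    return total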

nelfata
Posts: 74
Joined: Wed Apr 30, 2014 10:50 pm

Re: OSError: [Errno 5]

Post by nelfata » Wed Oct 15, 2014 3:09 pm

Well, thank you for the useful feedback.

I think most users will face this problem. There are ways to reduce fragmentation by pre-allocating buffers, but eventually we will need this flexibility.
It would be great if there were a process we could run periodically to move memory chunks around, or a way for the memory allocator to control allocation sizes: for example, rounding small buffers up to a multiple of 64 bytes and larger buffers up to a multiple of 1024.
That would reduce fragmentation. It is just an idea, but some RTOSes do similar things.
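
Until the allocator does something like that, one workaround at the Python level (only a sketch - the class and method names here are made up, not part of MicroPython) is to pre-allocate a small pool of fixed-size buffers while the heap is still unfragmented and hand those out instead of allocating fresh ones:

Code: Select all

# Hypothetical fixed-size buffer pool, created at startup while the
# heap is still unfragmented; acquire()/release() are made-up names.
class BufferPool:
    def __init__(self, size, count):
        self._free = [bytearray(size) for _ in range(count)]

    def acquire(self):
        if not self._free:
            raise MemoryError('buffer pool exhausted')
        return self._free.pop()

    def release(self, buf):
        self._free.append(buf)

# Example: reserve four 2 KiB buffers at import time.
pool = BufferPool(2048, 4)
buf = pool.acquire()
# ... use buf ...
pool.release(buf)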

dhylands
Posts: 3821
Joined: Mon Jan 06, 2014 6:08 pm
Location: Peachland, BC, Canada

Re: OSError: [Errno 5]

Post by dhylands » Wed Oct 15, 2014 3:21 pm

nelfata wrote:Oh, interesting. I thought gc.collect() performs some defragmentation.
If not, what would you recommend in this case?
The gc can't move objects, since it would need to know where all of the references to the object are and update them. All it can do is free up objects which have no references.

The problem is that other in-memory objects, like an int, could contain a value which looks like a pointer, and the gc can't tell the difference. So it errs on the side of caution and won't free a block that is "referenced" only by an int whose value happens to look like a pointer, but it can't go updating those values when moving blocks, in case they aren't really pointers.

nelfata
Posts: 74
Joined: Wed Apr 30, 2014 10:50 pm

Re: OSError: [Errno 5]

Post by nelfata » Wed Oct 15, 2014 4:07 pm

So for the moment it is not possible to defragment memory.

Damien
Site Admin
Posts: 647
Joined: Mon Dec 09, 2013 5:02 pm

Re: OSError: [Errno 5]

Post by Damien » Thu Oct 16, 2014 10:07 pm

No, unfortunately there is no way to defragment memory. The heap is designed to be OK (i.e. not need defragmentation) for small objects like floats, short lists and tuples, etc. But for large (multi-kB) buffers, you are best off pre-allocating them.

You can see what the current state of the heap is by running: pyb.info(1). This will print out a representation of the heap and its used blocks.
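
A minimal sketch of that advice (the name and size are only examples): grab the big buffer once, as early as possible in main.py, and then dump the heap map to see how fragmented things are.

Code: Select all

# Sketch: pre-allocate the large buffer early (e.g. near the top of
# main.py), before the heap has had a chance to fragment, and reuse
# it for the life of the program.
import pyb

READ_BUF = bytearray(2048)   # reserved once; reuse instead of re-allocating

pyb.info(1)                  # prints the usual stats plus a map of heap blocks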
