uasyncio — Two coroutines awaiting the same event

on4aa
Posts: 70
Joined: Sat Nov 11, 2017 8:41 pm
Location: Europe

uasyncio — Two coroutines awaiting the same event

Post by on4aa » Tue Jan 23, 2018 12:24 am

I have two uasyncio coroutines awaiting the same event.
One coroutine is a slow file write, the other one is a fast LCD screen refresh.

When the event is set (triggered), is it possible that the slow file write will prevent the LCD screen from refreshing promptly?

Do I need to encapsulate the ordinary Python file write in a uasyncio.StreamWriter() object?

The reason I am asking is that I read the following passage in an article:
blocking library functions are incompatible with async frameworks.
So, there's a bunch of things in Python Standard library that are assigned as blocking functions: socket.*, select.*, subprocess.*, os.waitpid, threading.*, multiprocessing.*, time.sleep. Everything that has to do with networking, processes, threads, you cannot use them. This is true for every async framework. If you use these functions the thing is gonna hang. So, don't use them. It's very unfortunate.
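For reference, here is a minimal sketch of the setup being described. It assumes the Event class from Peter Hinch's asyn library (awaited directly with await event); the file name and the print() standing in for the LCD refresh are purely illustrative:

Code: Select all

import uasyncio as asyncio
import asyn                              # Peter Hinch's asyn library

event = asyn.Event()

async def slow_file_write():
    await event                          # wait for the trigger
    with open('log.txt', 'a') as f:      # ordinary blocking write: holds the
        f.write('some data\n')           # CPU until open/write/close return

async def fast_lcd_refresh():
    await event                          # the same trigger
    print('refresh LCD')                 # stands in for the fast screen update

async def trigger():
    await asyncio.sleep(1)
    event.set()                          # wake both waiters
    await asyncio.sleep(1)               # give them time to run

loop = asyncio.get_event_loop()
loop.create_task(slow_file_write())
loop.create_task(fast_lcd_refresh())
loop.run_until_complete(trigger())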
Serge

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Re: uasyncio — Two coroutines awaiting the same event

Post by pythoncoder » Tue Jan 23, 2018 9:41 am

Any coroutine monopolises the CPU while it is running; when it issues await, it hands execution back to the scheduler so another coro can run. So the answer is yes: if the slow coro happens to be scheduled first, the screen update will be delayed until the slow coro issues await or terminates.

The solution is to use non-blocking drivers. For example, if doing network programming you need to ensure that socket instances are non-blocking. Note that this isn't just a matter of changing the mode and expecting existing code to run: using non-blocking sockets requires a different approach. For example, a read will return immediately regardless of whether any data is yet ready (that's what non-blocking means).
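To make "a read will return immediately" concrete, here is a rough, hand-rolled read loop for a MicroPython socket. It is illustrative only; it assumes a connected TCP socket passed in as sock, and the usual uerrno behaviour where a non-blocking recv() raises EAGAIN while no data has arrived:

Code: Select all

import uasyncio as asyncio
import uerrno

async def readline_nonblocking(sock):
    sock.setblocking(False)              # reads now return immediately
    data = b''
    while not data.endswith(b'\n'):
        try:
            chunk = sock.recv(64)        # does not wait for data
            if chunk == b'':             # peer closed the connection
                break
            data += chunk
        except OSError as e:
            if e.args[0] != uerrno.EAGAIN:
                raise                    # a real error, not just "no data yet"
        await asyncio.sleep(0)           # nothing ready: let other coros run
    return data
uasyncio's StreamReader/StreamWriter do the equivalent job, but they wait on the poll mechanism rather than looping, as in the UART example below.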

It is possible to use the IORead mechanism to achieve non-blocking I/O to streams such as file devices. I haven't got a ready-made sample, but this example does concurrent I/O to a UART, which is also a stream-type device. (Link X1 and X2 if you want to run it.)

Code: Select all

import uasyncio as asyncio
from pyb import UART

uart = UART(4, 9600)  # UART 4 uses pins X1 (TX) and X2 (RX) on the Pyboard

async def sender():
    swriter = asyncio.StreamWriter(uart, {})
    while True:
        await swriter.awrite('Hello uart\n')  # yields until the data is written
        await asyncio.sleep(2)

async def receiver():
    sreader = asyncio.StreamReader(uart)
    while True:
        res = await sreader.readline()  # yields until a complete line arrives
        print('Received', res)

loop = asyncio.get_event_loop()
loop.create_task(sender())
loop.create_task(receiver())
loop.run_forever()
While I haven't actually tried it, you should be able to declare a StreamWriter for a file and perform background file writes: the await swriter.awrite('Hello uart\n') allows other coroutines to run while the write takes place.
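For completeness, the untested idea above might look something like this. Whether the write genuinely overlaps with other work depends on the port's stream support for file objects, which I have not verified, and log.txt is just an example name:

Code: Select all

import uasyncio as asyncio

async def log_line(line):
    f = open('log.txt', 'a')             # example file name
    swriter = asyncio.StreamWriter(f, {})
    await swriter.awrite(line)           # other coroutines may get a look-in here
    f.close()

loop = asyncio.get_event_loop()
loop.run_until_complete(log_line('hello\n'))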
Peter Hinch
Index to my micropython libraries.

on4aa
Posts: 70
Joined: Sat Nov 11, 2017 8:41 pm
Location: Europe

Re: uasyncio — Two coroutines awaiting the same event

Post by on4aa » Fri Jan 26, 2018 8:47 pm

As a side note, I was wondering whether one can mix and match uasyncio with threading.
More specifically: May a coroutine start a new thread?
Serge

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Re: uasyncio — Two coroutines awaiting the same event

Post by pythoncoder » Sat Jan 27, 2018 7:29 am

I don't see why not, but this is outside my experience. Lacking a four-digit IQ, I avoid pre-emptive scheduling like the plague ;)
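For anyone who wants to experiment, here is a rough sketch of a coroutine starting a thread. It assumes a port that provides the _thread module (e.g. an ESP32 build); the blocking_job function and the 100 ms polling interval are arbitrary illustrations:

Code: Select all

import _thread
import uasyncio as asyncio

result = None

def blocking_job():
    global result
    # ...long blocking call goes here...
    result = 42                          # hand the outcome back to the coro

async def spawn_and_wait():
    global result
    _thread.start_new_thread(blocking_job, ())
    while result is None:                # poll without blocking the scheduler
        await asyncio.sleep_ms(100)
    print('thread finished with', result)

loop = asyncio.get_event_loop()
loop.run_until_complete(spawn_and_wait())
Anything more elaborate than a single simple shared variable would need a lock around the shared state.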
Peter Hinch
Index to my micropython libraries.
