msgpack "trimmed down" for MicroPython on small controllers.
Re: msgpack "trimmed down" for MicroPython on small controllers.
Sounds good. I've been pretty happy using micropython-lib's unittest, but it might be too minimal for full CPython compatibility.
- pythoncoder
- Posts: 5956
- Joined: Fri Jul 18, 2014 8:01 am
- Location: UK
- Contact:
Asynchronous use: an interesting issue
I've been trying to figure out how to use MessagePack with uasyncio. This matters for slow and/or intermittent data sources such as UARTs and sockets. There is no problem with sending data: just convert it synchronously and await its transmission on a StreamWriter. The difficulty is on receive, and is unique to this protocol when compared to the others supported by MicroPython.
With Pickle and JSON you can use a newline delimiter and await StreamReader.readline(). With ustruct and protocol buffers the receiver knows exactly how many bytes are expected so user code can await StreamReader.readexactly(n). In all these cases the user code can subsequently use the synchronous decoder to decode the received data.
MessagePack has no schema. The receiver deduces how many more bytes it needs from the contents of the data already decoded. Delimiters can't be used because the format can include binary data. You would asynchronously await the first byte and from that deduce how many bytes to read next, which may then be awaited with .readexactly(). This process can be recursive if the encoded data structure is nested.
The only solution I can see is a separate asynchronous version where load() is replaced with asynchronous code. Internally, calls to _read_except(fp, n) would become await fp.readexactly(n) where fp is a StreamReader instance provided by the calling code.
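To make the idea concrete, here is a rough sketch of what such an asynchronous load might look like. It handles only a handful of type codes (positive fixint, fixstr, fixarray and uint 16), and the function name and structure are illustrative placeholders, not the actual module's code:

```python
import asyncio
import struct

async def aload(fp):
    # Read the leading type byte; it determines how many more bytes to await.
    b = (await fp.readexactly(1))[0]
    if b <= 0x7F:                      # positive fixint: the byte is the value
        return b
    if 0x90 <= b <= 0x9F:              # fixarray: low nibble is the element count
        return [await aload(fp) for _ in range(b & 0x0F)]  # recurse per element
    if 0xA0 <= b <= 0xBF:              # fixstr: low 5 bits give the byte length
        return (await fp.readexactly(b & 0x1F)).decode()
    if b == 0xCD:                      # uint 16: two more bytes follow
        return struct.unpack('>H', await fp.readexactly(2))[0]
    raise ValueError('type code 0x%02x not handled in this sketch' % b)
```

Each readexactly pauses the task until the data arrives, so other tasks run while waiting, and nested containers fall out naturally from the recursion, mirroring the synchronous decoder.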
Any thoughts?
Peter Hinch
Index to my micropython libraries.
Re: msgpack "trimmed down" for MicroPython on small controllers.
That seems about right; there's no other way than to split up reading into known, hence awaitable, chunks, and _read_except looks like the right candidate for those chunks. So as usual with converting to async code, it's going to be a matter of duplicating code and inserting a lot of async/await keywords.
Re: msgpack "trimmed down" for MicroPython on small controllers.
When I prepared the first version of MessagePack-based RPC (http://ftp.funet.fi/pub/archive/alt.sources/2722.gz), it was possible to feed the unpacker with the incoming data and check whether a complete object was available: https://gitlab.com/WZab/python-versatil ... ple.py#L44
Of course, it was not the most energy-efficient approach (probably), as the buffer was scanned many times.
After that I always used msgpack-packed structures encapsulated in an additional message layer - either ZeroMQ ( https://gitlab.com/WZab/python-versatil ... ck-zmq/src ) or a minimalistic TCP encapsulation where the message was prepended with its length ( https://gitlab.com/WZab/python-versatil ... ck-tcp/src ).
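The length-prefix idea takes only a few lines; the two-byte big-endian header below is an arbitrary choice for illustration (it caps messages at 64 KiB), and the function names are invented:

```python
import struct

def frame(payload):
    # Prepend a 2-byte big-endian length so the receiver knows exactly
    # how many bytes to await before decoding.
    return struct.pack('>H', len(payload)) + payload

async def recv_frame(sreader):
    # Await the fixed-size header, then exactly the advertised payload.
    n = struct.unpack('>H', await sreader.readexactly(2))[0]
    return await sreader.readexactly(n)
```

Here the payload would be the output of the synchronous msgpack packer, and the received bytes can be handed straight to the synchronous decoder.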
Re: msgpack "trimmed down" for MicroPython on small controllers.
Thanks for that. The "trial and error" approach is rather inefficient.
Prepending every message with its length is a good idea and would play nicely with uasyncio. It could be baked into the dump, dumps, load and loads routines so that it was transparent to the user. The drawbacks are that it breaks the standard and adds two bytes to every message. It would also use RAM because the packer must buffer the message during creation until the message length is known. It then prepends the length and outputs it. I think the RAM use is a serious problem.
My solution maintains standard compliance and makes no changes to the encoder. It adds an optional asynchronous decoder module which provides an aload function. You can write:
This allows other tasks to run while receiver is paused. The drawback is another Python module. I have tested this with a UART and an intermittent data source and it seems OK.
async def receiver():
    sreader = asyncio.StreamReader(uart)
    while True:
        res = await aload(sreader)  # Pause this task until a complete message is received
        print('Received', res)
Peter Hinch
Index to my micropython libraries.
Re: msgpack "trimmed down" for MicroPython on small controllers.
I've posted code here. The README is unfinished but comments on the code are welcome.
I have refactored the code as a Python package. This enables lazy imports, minimising RAM consumption.
There is also a very simple extension module which adds support for complex, set and tuple types. It serves as a demo showing how easy it is to extend the protocol.
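As a sketch of what such an extension boils down to: a type is mapped to an ext code plus a byte payload, and back again. The code 0x50 and the helper names below are invented for illustration and are not taken from the module:

```python
import struct

EXT_COMPLEX = 0x50  # hypothetical extension type code

def pack_complex(c):
    # Carry a complex as two big-endian float64s inside an ext payload.
    return struct.pack('>dd', c.real, c.imag)

def unpack_complex(payload):
    re, im = struct.unpack('>dd', payload)
    return complex(re, im)
```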
The asynchronous demo was tested on a Pyboard 1.1. The transmit task periodically serialises a changing Python object and transmits it on a UART. The receiving task awaits data and decodes it as it arrives, printing the final result. RAM usage was 18.1 KiB with no use of frozen bytecode.
The README has a summary of the changes to the original code.
Peter Hinch
Index to my micropython libraries.
Re: msgpack "trimmed down" for MicroPython on small controllers.
Barring bugfixes I think I've taken this as far as I can, and I'm posting this message to bring it to a wider audience.
The latest release V0.1.1 improves the support for extensions. These can now be added in a way which is completely transparent to the application. With the extension module in place your data can include complex, tuple and set objects as if they were native types. This can easily be taken further to include other types.
I would welcome any comments on the ext_handlers option - like the other options, I retained it from the original. I can't see a use for ext_handlers that can't be better achieved with the ext_serializable decorator. It seems rather clunky, but I may well be missing something.
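For what it's worth, the overlap can be seen with a toy packer (all names below are simplified stand-ins, not the module's real internals): anything expressible as a per-call ext_handlers entry can equally be registered once, at class-definition time, by a decorator.

```python
import struct

_registry = {}  # type -> (ext code, pack function), filled once by the decorator

def ext_serializable(code):
    # Decorator-style registration: done once where the class is defined.
    def wrap(cls):
        _registry[cls.wraps] = (code, cls.pack)
        return cls
    return wrap

def packb(obj, ext_handlers=None):
    # Toy packer returning (ext code, payload). A per-call ext_handlers
    # dict takes precedence over the global registry.
    if ext_handlers and type(obj) in ext_handlers:
        return ext_handlers[type(obj)](obj)
    code, fn = _registry[type(obj)]
    return code, fn(obj)

@ext_serializable(0x50)  # hypothetical ext code for complex
class ComplexExt:
    wraps = complex

    @staticmethod
    def pack(c):
        return struct.pack('>dd', c.real, c.imag)
```

Both routes end up at the same (code, payload) pair; the dict just has to be rebuilt and passed on every call.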
Peter Hinch
Index to my micropython libraries.
Re: msgpack "trimmed down" for MicroPython on small controllers.
I don't have a use for ext_handlers myself currently, but once vacation is over I'll check whether I can use it where we have msgpack-python now.