Can there be 2 python programs running at the same time?

General discussions and questions about development of code with MicroPython that is not hardware specific.
Target audience: MicroPython Users.
donmerch
Posts: 11
Joined: Sat Mar 28, 2020 3:07 pm

Can there be 2 python programs running at the same time?

Post by donmerch » Tue Apr 14, 2020 6:35 pm

I was watching a video about running a program in the background by appending & to the command, and more importantly that two programs could be run simultaneously. It got me thinking about a project I'm working on that has a repetitive data stream running but needs to respond to changes from a UI to modify the stream. I was looking at uasyncio, but the UI obviously holds up the program until data is entered.

So I got to thinking: if I had a small program running on its own that streams the data out, it could acquire the data it needs from a stored file, opening the file periodically to read it and then closing it, while the main program containing the UI would write to that file and then close it. I realize there is a real possibility of a read/write collision, which I'm guessing would trigger an exception that could be handled by writing again when it's safe to do so.

Does this make any sense, and can two Python programs run together in MicroPython on a Pyboard?

Here's a link to the video that sparked the idea https://youtu.be/3vNIfBbibgE

jimmo
Posts: 2754
Joined: Tue Aug 08, 2017 1:57 am
Location: Sydney, Australia

Re: Can there be 2 python programs running at the same time?

Post by jimmo » Wed Apr 15, 2020 12:14 am

There are a lot of things going on here... In that video, the "&" is about Unix job control and related concepts. The result is that the command runs in a new process (well, a group of processes if it involves pipes etc.) and the shell doesn't wait for it. This isn't an AWS thing; it's just how Unix works, and all shells (e.g. bash in this case) have some way to do this.

Generally, though, what you're talking about is multithreading, i.e. within one program you can create multiple threads of execution. (How this maps to underlying operating system concepts such as processes varies between programming languages and operating systems.)

MicroPython supports threads via the _thread module. You'll need the "threading" firmware from http://micropython.org/download/pybv1/ and the documentation is at http://docs.micropython.org/en/latest/l ... hread.html
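Purely for reference, a minimal _thread sketch might look like this (it assumes a threading-enabled build; the worker body is invented for illustration):

import _thread
import time

def stream_worker():
    # Runs in a second thread, independently of the main thread.
    while True:
        # ... push the next chunk of data out here ...
        time.sleep(1)

# Start the worker; the main thread is then free for e.g. a UI loop.
_thread.start_new_thread(stream_worker, ())

while True:
    time.sleep(5)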

As you say though, using threads means you now have concurrency issues, so for almost all situations it's very hard to recommend actually using them; you're much better off coordinating the multiple things happening in your program yourself. asyncio is the better alternative here: each async "task" corresponds to a different thing happening in your program.

It sounds like what you want is to have a UI running (that writes to a file), with a background task that wakes up periodically, reads from the file and does something with the data. So the background task is just a loop that sleeps (using "await asyncio.sleep(...)") and then does whatever it needs to do. Alternatively, you can schedule one-off invocations of the background task using create_task.
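A rough sketch of that structure under uasyncio v3 (the file name and task bodies here are invented for illustration):

import uasyncio as asyncio

DATA_FILE = "stream.cfg"   # invented name for this sketch

async def background_streamer():
    # Wakes up once a second, re-reads whatever the UI last wrote,
    # and streams it out.
    while True:
        try:
            with open(DATA_FILE) as f:
                payload = f.read()
            # ... send payload out here ...
        except OSError:
            pass                      # nothing written yet
        await asyncio.sleep(1)

async def ui():
    # Placeholder for the UI task: when the user changes something,
    # it writes the new value to DATA_FILE.
    while True:
        await asyncio.sleep(0)        # real UI event handling goes here

async def main():
    asyncio.create_task(background_streamer())
    await ui()

asyncio.run(main())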

The much more interesting part is how to write the UI, especially using asyncio. Definitely worth checking out Peter (@pythoncoder)'s work here:
- https://github.com/peterhinch/micropython-nano-gui
- https://github.com/peterhinch/micropython-tft-gui
- https://github.com/peterhinch/micropython-lcd160cr-gui

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Re: Can there be 2 python programs running at the same time?

Post by pythoncoder » Thu Apr 16, 2020 6:39 am

@donmerch The GUI libraries listed above use uasyncio so that touch events cause user-defined callbacks to run. You might have a uasyncio task performing your data handling, with touch events being handled concurrently. Conflicts over shared resources such as files can be handled by means of Lock objects.
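By way of illustration, a minimal sketch of guarding the shared file with a uasyncio Lock (the file name and task shapes are invented; strictly, the lock only matters if the guarded section spans an await, but it makes the intent explicit):

import uasyncio as asyncio

file_lock = asyncio.Lock()
SETTINGS = "settings.txt"        # invented file name

async def on_touch(new_value):
    # Run (or scheduled) from a GUI callback when the user changes a setting.
    async with file_lock:
        with open(SETTINGS, "w") as f:
            f.write(new_value)

async def data_handler():
    # Streams data, re-reading the settings file each pass.
    while True:
        async with file_lock:
            try:
                with open(SETTINGS) as f:
                    settings = f.read()
            except OSError:
                settings = ""
        # ... stream data according to settings ...
        await asyncio.sleep(1)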

If you're unfamiliar with asyncio, you might like to look at this repo, which includes a tutorial. The tutorial also explains why asyncio is almost always a better solution than threading.
Peter Hinch
Index to my micropython libraries.

ltmerlin
Posts: 39
Joined: Fri Jun 28, 2019 12:34 pm

Re: Can there be 2 python programs running at the same time?

Post by ltmerlin » Thu Apr 16, 2020 6:01 pm

@pythoncoder First, thanks for starting the fresh updates on the uasyncio v3 tutorial in your golden repo. While I am getting the hang of it, I do not understand the use of a Lock in async programming. I have used Locks in threading applications, but since everything in async code runs in one thread, with one task scheduled at a time, I was wondering if you could give me an example of when we need to use a Lock?

For better understanding, I thought this must be used in, for example, the queue.py code. Say we have some "producer" tasks that put objects on a global queue at a relatively low speed and one consumer task offloading this queue at a fast pace. Since the global queue is a shared resource, I thought I might find a Lock in the queue.py code, but there isn't one... Can you help me understand when to use Locks to produce safe async code while using shared resources?

jimmo
Posts: 2754
Joined: Tue Aug 08, 2017 1:57 am
Location: Sydney, Australia

Re: Can there be 2 python programs running at the same time?

Post by jimmo » Fri Apr 17, 2020 3:50 am

ltmerlin wrote:
Thu Apr 16, 2020 6:01 pm
I have used Locks in threading applications, but since everything in async code runs in one thread, with one task scheduled at a time, I was wondering if you could give me an example of when we need to use a Lock?
You're right that asyncio-based code requires significantly less use of locks; however, it's important to distinguish between "I am the only thread/task/thing running right this instant" and "I need exclusive access to some resource".

So in regular pre-emptive multithreaded programming, your code can be interrupted anywhere, so the first case is important. (This is why a concurrent Queue data structure would need locking). But with asyncio, other code can only run when you say it can.

Every time you "await" something, you're giving another task the ability to run. So if you need to do some operation that involves multiple await-ed steps, and you need to prevent other tasks from touching the same resource while you're awaiting, then you need a lock (or some other form of synchronization).
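A contrived sketch of that case (names invented): each task writes a two-part record to a shared resource with an await in the middle, and the lock keeps the two parts together:

import uasyncio as asyncio

log = []                    # stands in for any shared resource
log_lock = asyncio.Lock()

async def write_record(task_id):
    # Two awaited steps; without the lock, another task's record
    # could be interleaved between them.
    async with log_lock:
        log.append("{} header".format(task_id))
        await asyncio.sleep_ms(10)     # e.g. a slow I/O step
        log.append("{} body".format(task_id))

async def main():
    await asyncio.gather(write_record("A"), write_record("B"))
    print(log)   # header/body pairs stay adjacent

asyncio.run(main())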

kevinkk525
Posts: 969
Joined: Sat Feb 03, 2018 7:02 pm

Re: Can there be 2 python programs running at the same time?

Post by kevinkk525 » Fri Apr 17, 2020 5:53 am

Easy example: MQTT uses sockets, but multiple asyncio tasks can send messages. Each send operation might need multiple steps with "await" in between, but the currently sending task has to prevent other asyncio tasks from using the socket until the transmission is finished. Therefore a lock is used.
Kevin Köck
Micropython Smarthome Firmware (with Home-Assistant integration): https://github.com/kevinkk525/pysmartnode

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

The need for Lock objects

Post by pythoncoder » Fri Apr 17, 2020 6:45 am

Kevin's point can be generalised to any communication protocol. An application might have multiple tasks accessing a single physical channel to communicate with a remote device. Each task needs exclusive access to the channel until a complete message has been processed. If the communication process is slow, the task will need to yield periodically while communication is in progress.

Hence a lock is required to ensure protocol integrity: a task holds the lock for the duration of a message.
Peter Hinch
Index to my micropython libraries.

ltmerlin
Posts: 39
Joined: Fri Jun 28, 2019 12:34 pm

Re: Can there be 2 python programs running at the same time?

Post by ltmerlin » Fri Apr 17, 2020 8:15 am

Thanks for the replies!
jimmo wrote:So in regular pre-emptive multithreaded programming, your code can be interrupted anywhere, so the first case is important. (This is why a concurrent Queue data structure would need locking). But with asyncio, other code can only run when you say it can.
Do you mean that I cannot safely use the Queue class (e.g. this implementation by @pythoncoder based on Paul Sokolovsky's code) without using a Lock when multiple tasks want to put/get things on/from it?
kevinkk525 wrote:Easy example: mqtt uses sockets but multiple asyncio tasks can send messages. Each send operation might need multiple steps with "await" in between but the currently sending task has to prevent other asyncio tasks from using the socket until the transmission is finished. Therefore a lock is used.
pythoncoder wrote:Hence a lock is required to ensure protocol integrity: a task holds the lock for the duration of a message.
When I use a Lock in multiple tasks that want to send something via the socket (a single physical channel), isn't there a possibility that one send request could take a long time, "blocking" all the other tasks from sending/running for the duration of the Lock? Wouldn't it be better to put all events on a shared Queue first, with one single task responsible for sending the events over the socket one by one, so the other tasks are never "blocked" from running?

pythoncoder
Posts: 5956
Joined: Fri Jul 18, 2014 8:01 am
Location: UK

Lock vs Queue

Post by pythoncoder » Fri Apr 17, 2020 3:31 pm

When I use a Lock in multiple tasks that want to send something via the socket (a single physical channel), isn't there a possibility that one send request could take a long time, "blocking" all the other tasks from sending/running for the duration of the Lock? Wouldn't it be better to put all events on a shared Queue first, with one single task responsible for sending the events over the socket one by one, so the other tasks are never "blocked" from running?
There is no general answer to that one. Some communication protocols require a two-way exchange of data (e.g. send-acknowledge) so a queue wouldn't work. Each task must lock the channel until the exchange is complete. The possibility of deadlock in the event of the remote device failing to complete the handshake may be addressed by a timeout which releases the lock and does whatever is necessary to retrieve the situation.

Where the protocol is "send and forget" then send and (possibly) receive queues are another way to go. There is still a need to deal with failure of the remote end of the link, and to consider whether the queue sizes can ever suffer unconstrained growth.

Sockets can be used for various purposes, but in the network context TCP/IP deals with the gory details of message integrity so a queue would work. The question is whether it would serve any purpose. If one task is paused pending a response from the remote, in the absence of a lock other tasks can add data to the queue. Whether this is of benefit must be decided in the context of the application logic. In the case of reception from the node, there is probably no choice other than to wait.
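For the "send and forget" case, the queue-plus-single-sender pattern ltmerlin describes might be sketched like this (the queue here is just a list plus an Event; the names and the fake send are invented):

import uasyncio as asyncio

send_queue = []              # shared FIFO of outgoing messages
new_item = asyncio.Event()   # signals the sender that something was queued

async def producer(name):
    # Any number of these can queue messages without waiting for the channel.
    n = 0
    while True:
        send_queue.append("{} event {}".format(name, n))
        new_item.set()
        n += 1
        await asyncio.sleep(3)

async def sender(send_coro):
    # The only task that touches the channel, so no Lock is needed.
    while True:
        await new_item.wait()
        new_item.clear()
        while send_queue:
            await send_coro(send_queue.pop(0))

async def main():
    async def fake_send(msg):          # stand-in for a real awaited socket send
        print("sending", msg)
        await asyncio.sleep_ms(50)
    asyncio.create_task(producer("ui"))
    asyncio.create_task(producer("data"))
    await sender(fake_send)

asyncio.run(main())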
Peter Hinch
Index to my micropython libraries.
