
DerekG
Route messages from many processes to a single process without passing a Python queue object
High-level goal:
I'm writing a Python code base that will have many processes and threads running, some of which are child processes of each other and others of which are not (e.g. started independently in another terminal). Each of these processes needs to eventually write "log" messages to a database. However, I'd prefer these writes to be non-blocking for the time-sensitive processes, so I want each "client" process to pass log messages to a log "server", which can then do the blocking writes to the database. I want exactly one database-writing server process active at any time, which for now I assume I will start manually.
I can envision a few ways to pass information on to the server process.
- I could create a Python multiprocessing shared queue or pipe and pass this object every time a new process is initialized. This is undesirable because it means every arbitrary process function must be written with an additional log-queue argument, and furthermore all the processes would have to descend from a single ancestor process (see the first sketch after this list).
- I could use a static address and port, stored as operating system environment variables, on which the server process listens. Each client process would send log messages to this address. To be non-blocking or low-latency these sends would likely have to be UDP, meaning delivery would not be guaranteed (second sketch below).
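Here is a minimal sketch of the first option (the names and the print-instead-of-database-write are placeholders); it shows why the queue has to be threaded through every process function and why all processes must share one ancestor:

```python
import multiprocessing as mp

def worker(task_id, log_queue):
    # Every process function needs this extra log_queue argument.
    log_queue.put(f"worker {task_id}: finished")  # non-blocking for the worker

def log_server(log_queue):
    # The server alone performs the (blocking) database writes.
    while True:
        msg = log_queue.get()
        if msg is None:  # sentinel to shut the server down
            break
        print(f"would write to DB: {msg}")

if __name__ == "__main__":
    q = mp.Queue()
    server = mp.Process(target=log_server, args=(q,))
    server.start()
    workers = [mp.Process(target=worker, args=(i, q)) for i in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    q.put(None)   # stop the server
    server.join()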
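And a sketch of the second option; the environment variable names and the default host/port are assumptions, and the database write is again just a print:

```python
import os
import socket

HOST = os.environ.get("LOG_SERVER_HOST", "127.0.0.1")   # assumed env var names
PORT = int(os.environ.get("LOG_SERVER_PORT", "9999"))

def send_log(message: str) -> None:
    # Fire-and-forget: sendto() on a UDP socket does not wait for the
    # server, so it is low-latency but delivery is not guaranteed.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(message.encode("utf-8"), (HOST, PORT))

def serve_forever() -> None:
    # The single server process binds the static address and drains messages.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind((HOST, PORT))
        while True:
            data, _addr = s.recvfrom(65535)
            print(f"would write to DB: {data.decode('utf-8')}")
```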
Question:
Is there a middle ground that allows for the creation of a C-style queue or pipe that can be referenced by any Python process (e.g. via some static file location) without needing to explicitly pass it as a Python-object argument to that process?
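To illustrate the kind of API I'm imagining, here is a sketch using multiprocessing.connection with a Unix domain socket at a fixed filesystem path, so a client only needs to know the path, not receive a queue object; the path and authkey are placeholder values, and I'm not sure this is the right or most idiomatic primitive:

```python
from multiprocessing.connection import Listener, Client

ADDRESS = "/tmp/log_server.sock"  # hypothetical static file location
AUTHKEY = b"log-secret"           # placeholder shared secret

def server():
    # The one server process listens on the well-known socket path.
    with Listener(ADDRESS, family="AF_UNIX", authkey=AUTHKEY) as listener:
        while True:
            with listener.accept() as conn:
                print(f"would write to DB: {conn.recv()}")

def client(message):
    # Any process that knows ADDRESS can connect, regardless of ancestry.
    with Client(ADDRESS, family="AF_UNIX", authkey=AUTHKEY) as conn:
        conn.send(message)
```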
python
logging
multiprocessing
message-queue