1 year ago

Peedaruos

Shared memory to load the same data (NumPy array) into multiple MPI processes?

I have a long, skinny NumPy array (dim = (4096*4096, 1)) that needs to be read by multiple MPI processes (using mpi4py), each of which then operates on it independently. Loading such a large array separately in every process would be heavy on memory. Is there a way to use shared memory, e.g. allocate the array once up front and then give every MPI process read-only access to the same location? This seems possible with python-multiprocessing, but what about mpi4py? Thanks in advance.

python

multiprocessing

shared-memory

mpi4py

0 Answers
