
I'm trying to parallelize a task using Python multiprocessing, but when I need to append an object to a list (specifically a Minuit object from the iminuit library) it raises the error "...cannot be converted to a Python object for pickling". Here is code that normally works, but not with these objects.

from multiprocessing import Process, Manager
from iminuit import Minuit


something = Minuit(lambda x: x)  # note: with object() or any str instead, the example works

def dothing(L, i):  # the managed list `L` passed explicitly.
    L.append(something)

if __name__ == "__main__":
    with Manager() as manager:
        L = manager.list() 
        processes = []
        for i in range(5):
            p = Process(target=dothing, args=(L,i))  # Passing the list
            p.start()
            processes.append(p)
        for p in processes:
            p.join()
        print(L)

Reading the multiprocessing documentation, I haven't found a correct way to do this. Thanks in advance.

PS: in the non-parallel case it is easy to make a list and append these objects, so I don't understand why appending normally works but fails here.
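
For reference, this is the kind of serial version that works for me (a minimal sketch, using the same Minuit construction as above):

from iminuit import Minuit

something = Minuit(lambda x: x)

results = []
for i in range(5):
    results.append(something)  # appending to a plain list within one process works fine

print(results)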

  • Sharing data between processes involves it being transmitted between the processes involved, because each process runs in its own memory space, so global variables can't be used. Python does it by pickling and unpickling data objects. This means that one can only share objects that are picklable (see the pickle check below). Apparently that's not the case here… – martineau Apr 27 '20 at 13:12
  • Read [python-serialization-why-pickle](https://stackoverflow.com/questions/8968884) – stovfl Apr 27 '20 at 13:18
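
A quick way to verify the pickling point above (a minimal sketch; assuming the error really does come from pickle itself) is to try pickling the object directly, outside of multiprocessing:

import pickle
from iminuit import Minuit

something = Minuit(lambda x: x)
pickle.dumps(something)  # expected to raise the same "cannot be converted to a Python object for pickling" error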

0 Answers