I'm trying to instantiate objects using imported modules. To make these imports process-safe (since I'm on Windows), I've placed the import statements inside the `if __name__ == '__main__':` block.
My files look somewhat like this:
```python
# main.py
from multiprocessing import Process

# target func for new process
def init_child(foo_obj, bar_obj):
    pass

if __name__ == "__main__":
    # protect imports from child process
    from foo import getFoo
    from bar import getBar

    # get new objects
    foo_obj = getFoo()
    bar_obj = getBar()

    # start new process
    child_p = Process(target=init_child, args=(foo_obj, bar_obj))
    child_p.start()

    # wait for process to join
    child_p.join()
```
```python
# foo.py
import os

print('Foo Loaded by PID: ' + str(os.getpid()))

class Foo:
    def __init__(self):
        pass

def getFoo():
    # returning a new instance of the class
    return Foo()
```
```python
# bar.py
import os

print('Bar Loaded by PID: ' + str(os.getpid()))

class Bar:
    def __init__(self):
        pass

def getBar():
    # not returning a new instance
    return 'bar'
```
```
Foo Loaded by PID: 58760
Bar Loaded by PID: 58760
Foo Loaded by PID: 29376
```
The output I get indicates that the `foo` module was loaded twice: once in the parent process and once in the child. I understand that the interpreter re-executes the main module in the child (since Windows does not support the `fork` system call and spawns a fresh interpreter instead), but what's odd is that `foo` was imported inside the `__main__` block, which the child should not re-execute.
This could become an issue when sharing objects, such as Queues imported from a dedicated module. Any ideas what might cause this?