pyABC: BUG: Can't pickle local object
When I run the tutorial and call `abc.run(...)`, I receive the following pickle error:

`AttributeError: Can't pickle local object 'ABCSMC._create_simulate_from_prior_function.<locals>.simulate_one'`

Using Python 3.8 (Anaconda) with pyABC 0.10.1 (I also tried 0.10.0 and 0.9.26 with the same result). cloudpickle 1.3.0, pickleshare 0.7.5, and dill 0.3.1.1 are installed as well. I am running in a Jupyter notebook, but I checked and the same error occurs when running the code as a script.

Here is the full traceback:
```
> history = abc.run(minimum_epsilon=0.2, max_nr_populations=5)
INFO:ABC:Calibration sample before t=0.
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-10-9ca3152d75f4> in <module>
----> 1 history = abc.run(minimum_epsilon=0.2, max_nr_populations=5)
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/smc.py in run(self, minimum_epsilon, max_nr_populations, min_acceptance_rate)
859 self._adapt_population_size(t0)
860 # sample from prior to calibrate distance, epsilon, and acceptor
--> 861 self._initialize_dist_eps_acc(t0)
862
863 # configure recording of rejected particles
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/smc.py in _initialize_dist_eps_acc(self, t)
418
419 # initialize dist, eps, acc (order important)
--> 420 self.distance_function.initialize(
421 t, get_initial_sum_stats, self.x_0)
422 self.acceptor.initialize(
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/distance/distance.py in initialize(self, t, get_all_sum_stats, x_0)
792
793 # execute function
--> 794 all_sum_stats = get_all_sum_stats()
795
796 self._calculate_normalization(all_sum_stats)
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/smc.py in get_initial_sum_stats()
398 """
399 def get_initial_sum_stats():
--> 400 population = self._get_initial_population(t)
401 # only the accepted sum stats are available initially
402 sum_stats = population.get_accepted_sum_stats()
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/smc.py in _get_initial_population(self, t)
462 else:
463 # sample
--> 464 population = self._sample_from_prior(t)
465 # update number of samples in calibration
466 self.history.update_nr_samples(
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/smc.py in _sample_from_prior(self, t)
523
524 # call sampler
--> 525 sample = self.sampler.sample_until_n_accepted(
526 self.population_size(-1), simulate_one,
527 max_eval=np.inf, all_accepted=True)
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/sampler/base.py in sample_until_n_accepted(self, n, simulate_one, max_eval, all_accepted)
149 def sample_until_n_accepted(
150 self, n, simulate_one, max_eval=np.inf, all_accepted=False):
--> 151 sample = f(self, n, simulate_one, max_eval, all_accepted)
152 if sample.n_accepted != n and sample.ok:
153 # this should not happen if the sampler is configured correctly
~/miniconda3/envs/SciComPy/lib/python3.8/site-packages/pyabc/sampler/multicore_evaluation_parallel.py in sample_until_n_accepted(self, n, simulate_one, max_eval, all_accepted)
111
112 for proc in processes:
--> 113 proc.start()
114
115 id_results = []
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/process.py in start(self)
119 'daemonic processes are not allowed to have children'
120 _cleanup()
--> 121 self._popen = self._Popen(self)
122 self._sentinel = self._popen.sentinel
123 # Avoid a refcycle if the target function holds an indirect
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/context.py in _Popen(process_obj)
222 @staticmethod
223 def _Popen(process_obj):
--> 224 return _default_context.get_context().Process._Popen(process_obj)
225
226 class DefaultContext(BaseContext):
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/context.py in _Popen(process_obj)
281 def _Popen(process_obj):
282 from .popen_spawn_posix import Popen
--> 283 return Popen(process_obj)
284
285 class ForkServerProcess(process.BaseProcess):
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/popen_spawn_posix.py in __init__(self, process_obj)
30 def __init__(self, process_obj):
31 self._fds = []
---> 32 super().__init__(process_obj)
33
34 def duplicate_for_child(self, fd):
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/popen_fork.py in __init__(self, process_obj)
17 self.returncode = None
18 self.finalizer = None
---> 19 self._launch(process_obj)
20
21 def duplicate_for_child(self, fd):
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/popen_spawn_posix.py in _launch(self, process_obj)
45 try:
46 reduction.dump(prep_data, fp)
---> 47 reduction.dump(process_obj, fp)
48 finally:
49 set_spawning_popen(None)
~/miniconda3/envs/SciComPy/lib/python3.8/multiprocessing/reduction.py in dump(obj, file, protocol)
58 def dump(obj, file, protocol=None):
59 '''Replacement for pickle.dump() using ForkingPickler.'''
---> 60 ForkingPickler(file, protocol).dump(obj)
61
62 #
AttributeError: Can't pickle local object 'ABCSMC._create_simulate_from_prior_function.<locals>.simulate_one'
```
Please advise… Thanks
Dear @yoavram, @vedantchandra, the multicore samplers should now be usable on MacOS via https://github.com/ICB-DCM/pyABC/pull/320. They have an additional `pickle` argument, which you can manually set to `False` if the system allows it. If it is not set explicitly, we now assume that manual pickling is necessary on MacOS.

I suggest that if this is not fixed (not a must, I guess, though some Macs now have quite a lot of cores), then the default sampler should at least switch to a single-core sampler instead of crashing.
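For reference, here is a minimal sketch of both workarounds. The toy model, prior, and distance below are stand-ins in the style of the tutorial, and the exact semantics of the `pickle` keyword may vary between pyABC versions:

```python
import numpy as np

from pyabc import ABCSMC, Distribution, RV
from pyabc.sampler import MulticoreEvalParallelSampler, SingleCoreSampler


# Toy stand-ins for the tutorial's model, prior, and distance.
def model(parameter):
    return {"y": parameter["x"] + 0.1 * np.random.randn()}


prior = Distribution(x=RV("uniform", 0, 1))


def distance(s1, s2):
    return abs(s1["y"] - s2["y"])


# Option 1: multicore sampler with explicit pickling control. Setting
# pickle=False skips the serialization overhead, but is only safe where
# fork is available (e.g. Linux); on macOS with spawn, keep the default.
sampler = MulticoreEvalParallelSampler(pickle=False)

# Option 2: fall back to a single-core sampler, which avoids
# multiprocessing (and hence pickling) entirely.
# sampler = SingleCoreSampler()

abc = ABCSMC(model, prior, distance, sampler=sampler)
```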
Hi @vedantchandra, it may be that in Python 3.8, `multiprocessing` employs `spawn` instead of `fork` (the default start method changed on macOS), leading to the failure observed here, as the local function created by `_create_simulate_from_prior_function()` is not pickleable. This is reported here: https://github.com/huge-success/sanic/issues/1774

To circumvent this, one could wrap the arguments using `cloudpickle.dumps` (which allows pickling local objects) and unpack them manually using `cloudpickle.loads`. On Linux systems, or whenever `fork` is used, this should however not be done, as the extra pickling and unpickling would incur some performance loss. If it is important for you to use parallelization on macOS, I can have a look at this in the next days.
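To illustrate the mechanism with a standalone sketch (not pyABC's internal code; `make_local_function` and `run_unpickled` are hypothetical names): under `spawn`, the process target and its arguments are pickled, and the standard `pickle` module rejects local closures, while routing the closure through `cloudpickle` bytes avoids that:

```python
import multiprocessing as mp

import cloudpickle


def run_unpickled(payload, queue):
    # Worker entry point: rebuild the closure from the cloudpickle bytes
    # inside the child process, then call it and report the result.
    simulate_one = cloudpickle.loads(payload)
    queue.put(simulate_one())


def make_local_function():
    offset = 42

    def simulate_one():
        # A local closure: plain pickle (used by spawn) cannot serialize
        # this, which is exactly the AttributeError above.
        return offset + 1

    return simulate_one


if __name__ == "__main__":
    mp.set_start_method("spawn")  # the default on macOS since Python 3.8
    payload = cloudpickle.dumps(make_local_function())
    queue = mp.Queue()
    proc = mp.Process(target=run_unpickled, args=(payload, queue))
    proc.start()
    result = queue.get()
    proc.join()
    print(result)  # 43
```

Conceptually, this manual dumps/loads round trip is what the samplers would do when manual pickling is required.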