fabric: Win32: Can't use @parallel: pickle.PicklingError: it's not found as fabric.tasks.inner
Greetings! First of all, thanks a lot for a really great tool, but I have a problem with parallel execution.
My setup:
- ActivePython 2.7.2.5 (ActiveState Software Inc.) based on Python 2.7.2 (default, Jun 24 2011, 12:21:10) [MSC v.1500 32 bit (Intel)] on win32
- Fabric 1.3.3 (installed via pypm install fabric)
- WinXP
I have a script which works perfectly with serial execution, but when I add the @parallel decorator to one of the tasks, for example:
@roles('app', 'db')
@parallel
def checkout_or_update(branch):
    path = podman_sources + branch
    if not exists(path):
        with cd(podman_sources):
            run(checkout_command % {"branch": branch})
    else:
        with cd(path):
            run('svn up')
then I get something like this:
$ fab deploy:trunk,staging
[192.168.102.183] Executing task 'checkout_or_update'
[192.168.102.169] Executing task 'checkout_or_update'
[192.168.102.167] Executing task 'checkout_or_update'
Traceback (most recent call last):
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\main.py", line 682, in
main
*args, **kwargs
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 248, in execute
task.run(*args, **new_kwargs)
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 105, in run
return self.wrapped(*args, **kwargs)
File "D:\Mishail\My Documents\wspace\app\fabfile.py", line 122, in deploy
execute(checkout_or_update, branch)
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 237, in execute
exitcodes = jobs.run()
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\job_queue.py", line 124, in run
_advance_the_queue()
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\job_queue.py", line 114, in _advance_the_queue
job.start()
File "D:\Technologies\Python27\lib\multiprocessing\process.py", line 130, in start
self._popen = Popen(self)
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 271, in __init__
dump(process_obj, to_child, HIGHEST_PROTOCOL)
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 193, in dump
ForkingPickler(file, protocol).dump(obj)
File "D:\Technologies\Python27\lib\pickle.py", line 224, in dump
self.save(obj)
File "D:\Technologies\Python27\lib\pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
File "D:\Technologies\Python27\lib\pickle.py", line 419, in save_reduce
save(state)
File "D:\Technologies\Python27\lib\pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "D:\Technologies\Python27\lib\pickle.py", line 649, in save_dict
self._batch_setitems(obj.iteritems())
File "D:\Technologies\Python27\lib\pickle.py", line 681, in _batch_setitems
save(v)
File "D:\Technologies\Python27\lib\pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "D:\Technologies\Python27\lib\pickle.py", line 748, in save_global
(obj, module, name))
pickle.PicklingError: Can't pickle <function inner at 0x00F73F70>: it's not found as fabric.tasks.inner
$ Traceback (most recent call last):
File "<string>", line 1, in <module>
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 374, in main
self = load(from_parent)
File "D:\Technologies\Python27\lib\pickle.py", line 1378, in load
return Unpickler(file).load()
File "D:\Technologies\Python27\lib\pickle.py", line 858, in load
dispatch[key](self)
File "D:\Technologies\Python27\lib\pickle.py", line 880, in load_eof
raise EOFError
EOFError
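For context, the failure appears to come from Python's multiprocessing on Windows: without os.fork, the child process is spawned and the task callable has to be pickled by its module-qualified name, which a nested function such as fabric.tasks.inner cannot be. Below is a minimal sketch that reproduces the same class of error without Fabric at all (the make_wrapper and say_hello names are made up purely for illustration):

import pickle

def make_wrapper(func):
    def inner(*args, **kwargs):  # nested function: not importable as <module>.inner
        return func(*args, **kwargs)
    return inner

def say_hello(name):
    print("hello %s" % name)

wrapped = make_wrapper(say_hello)

if __name__ == '__main__':
    try:
        # Windows multiprocessing must do effectively this before it can start a worker
        pickle.dumps(wrapped)
    except (pickle.PicklingError, AttributeError) as e:  # PicklingError on 2.x, AttributeError on 3.x
        print("cannot pickle the wrapped task: %s" % e)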
About this issue
- Original URL
- State: closed
- Created 13 years ago
- Comments: 23 (7 by maintainers)
@bitprophet, I am using Fabric 1.10 and Python 2.7 on Windows 7. When I use the @parallel decorator, I get the above-mentioned PicklingError. I was wondering if this issue has been resolved, as this ticket has been closed.
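For anyone hitting this on Windows today: one pragmatic workaround is to apply @parallel only where os.fork exists and fall back to serial execution on win32. This is only a sketch against the Fabric 1.x API, not an official fix, and the maybe_parallel name is invented here:

import os
from fabric.api import parallel, roles, run, serial

# Use @parallel only when os.fork is available; on win32 the spawn-based
# multiprocessing has to pickle the task and fails on fabric.tasks.inner.
maybe_parallel = parallel if hasattr(os, 'fork') else serial

@roles('app', 'db')
@maybe_parallel
def checkout_or_update(branch):
    # the original task body from the issue goes here
    run('svn up')

Running the fabfile under a POSIX layer such as Cygwin, where os.fork is available, is sometimes suggested as the other route.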