Python – Timeout decorator classes with multiprocessing produce pickling errors

On Windows, signal- and thread-based approaches are usually bad ideas and unsuitable for function timeouts.

I wrote the following timeout code, which raises a timeout exception from multiprocessing when the call takes too long. This is exactly what I wanted.

def timeout(timeout, func, *args):
    with Pool(processes=1) as pool:
        result = pool.apply_async(func, args)
        return result.get(timeout=timeout)
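For reference, here is that function as a complete, runnable sketch (the `slow_multiply` helper and the timings are illustrative, not part of the original code):

```python
import time
from multiprocessing import Pool, TimeoutError


def timeout(timeout, func, *args):
    # Run func in a single-worker pool and wait at most `timeout`
    # seconds for the result. Pool.terminate() is called when the
    # `with` block exits, so a stuck worker is killed.
    with Pool(processes=1) as pool:
        result = pool.apply_async(func, args)
        return result.get(timeout=timeout)


def slow_multiply(x, y):
    time.sleep(2)
    return x * y


if __name__ == "__main__":
    print(timeout(5, slow_multiply, 3, 4))   # finishes in time: prints 12
    try:
        timeout(1, slow_multiply, 3, 4)      # exceeds the one-second limit
    except TimeoutError:
        print("timed out")
```

Note that `result.get` raises `multiprocessing.TimeoutError` (not the built-in `TimeoutError`), so that is the exception to catch.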

I’m now trying to make it decorator-style so that I can add it to a wide range of functions, especially those that call external services and I have no control over the code or duration. My current attempts are as follows:

class TimeWrapper(object):

    def __init__(self, timeout=10):
        """Timing decorator"""
        self.timeout = timeout

    def __call__(self, f):
        def wrapped_f(*args):
            with Pool(processes=1) as pool:
                result = pool.apply_async(f, args)
                return result.get(timeout=self.timeout)

        return wrapped_f

It gives a pickling error:

@TimeWrapper(7)
def func2(x, y):
    time.sleep(5)
    return x*y

File "C:\Users\rmenk\AppData\Local\Continuum\anaconda3\lib\multiprocessing\reduction.py", line 51, in dumps
cls(buf, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function func2 at 0x000000770C8E4730>: it's not the same object as __main__.func2

I suspect this is due to multiprocessing and decorators not playing well together, but I don't actually know how to make them work. Any ideas on how to fix this?

PS: I've done extensive research on this site and elsewhere but haven't found a working answer, whether using pebble, threading, a function decorator, or something else. If you know of a solution that works on Windows and Python 3.5, I'd be more than happy to use it.

Solution

What you want to achieve is especially troublesome on Windows. The core problem is that when you decorate a function, you obscure it. This happens to work on UNIX because UNIX uses fork as its strategy for creating new processes.

On Windows, however, the new process starts blank: a brand-new Python interpreter is launched and your module is imported again. During that import, the decorator hides the real function, so the pickle protocol cannot find it.

The only correct way around this is to rely on a trampoline function set up during decoration. You can see how it's done in pebble, but unless you're doing this as an exercise, I recommend using pebble directly, since it already offers what you're looking for.

from pebble import concurrent

@concurrent.process(timeout=60)
def my_function(var, keyvar=0):
    return var + keyvar

future = my_function(1, keyvar=2)
future.result()

Related Problems and Solutions