blhsing's answer identifies the root of the problem: on Windows, multiprocessing must launch a new instance of Python for each worker process.  Each new Python imports the file(s) that define the functions to be run, then waits for directives from the controlling Python that spawned it.  But if the file(s) that multiprocessing imports spawn additional Pythons unconditionally, without an if __name__ == '__main__' test, those additional Pythons spawn more Pythons, which spawn yet more Pythons, without end.
(Essentially, the problem here is recursion without a base case.)
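A minimal sketch of the guard (the square function and the pool size here are just illustrative placeholders):

from multiprocessing import Pool

def square(x):
    # Work that runs in the worker processes.
    return x * x

if __name__ == '__main__':
    # Only the controlling Python executes this block.  The workers that
    # multiprocessing spawns import this file, see a __name__ other than
    # '__main__', skip the block, and therefore never spawn workers of
    # their own.
    with Pool(4) as pool:
        print(pool.map(square, range(10)))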
Prune's answer, suggesting memoization, is also reasonable.  Note that memoization can be done without global variables; see What is memoization and how can I use it in Python? for prepackaged versions (a sketch using one of them, functools.lru_cache, follows the demo below).  One I like to use as a demo makes use of the fact that you can set attributes on functions:
def fibo(n):
    if n <= 1:
        # Base cases: return 0 for n < 1 and 1 for n == 1.
        return 0 if n < 1 else 1
    ns = str(n)  # attribute names must be strings
    if not hasattr(fibo, ns):
        # Memoize: store the result as an attribute on the function itself.
        setattr(fibo, ns, fibo(n - 1) + fibo(n - 2))
    return getattr(fibo, ns)
We handle the base cases up front to avoid recursion.  Then we turn the argument n (which is presumably a number) into a string ns, because getattr and setattr require string attribute names.  If the memoized answer is not yet available, we compute and store it with a recursive call; either way, we return the memoized answer.
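If you would rather use a prepackaged version, the standard library's functools.lru_cache (available since Python 3.2) does the same job as a decorator; a minimal sketch:

from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache: remember every result
def fibo(n):
    if n <= 1:
        return 0 if n < 1 else 1
    return fibo(n - 1) + fibo(n - 2)

print(fibo(100))  # 354224848179261915075, computed without exponential blowup

The decorator keys its cache on the arguments themselves, so no string conversion is needed, and it works unchanged on any hashable arguments.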