How can I share a nested object between Python processes with write access for tasklets (coroutines)?
Here is a simplified example, with an analogy, that I wrote just to ask this question properly.
First of all, please install the greenlet package with: sudo pip install greenlet
In the example below:
- An instance of the Nature class is referenced by the habitat variable.
- This instance of the Nature class has an instance variable called animals.
- During the initialization of this instance of Nature, 8 different instances of the Animal class are created and appended to the animals instance variable. Now, if I'm correct, this instance of Nature is a nested object.
- As the last step, the live instance methods of the Animal instances randomly switch to one another using the greenlet package's switch() function until global_counter reaches 1000. This live function randomly changes the value of the limbs instance variable of the Animal instances. (A minimal standalone sketch of this switching mechanism is shown right after this list.)
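For reference, here is a minimal, self-contained sketch of how greenlet's switch() hands control back and forth between two coroutines. The file name and the ping/pong functions are made up for illustration only; the full example (greentest.py) below uses the same mechanism, just with many greenlets chosen at random.
switch_demo.py:
from greenlet import greenlet
def ping():
    print("ping")
    gr_pong.switch()      # suspend ping, resume pong
    print("ping again")   # resumed when pong switches back
def pong():
    print("pong")
    gr_ping.switch()      # suspend pong, resume ping right after its switch() call
gr_ping = greenlet(ping)
gr_pong = greenlet(pong)
gr_ping.switch()          # prints: ping, pong, ping again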
greentest.py:
import random
from greenlet import greenlet
global_counter = 0
class Animal():
    def __init__(self,nature):
        self.limbs = 0
        nature.animals.append(self)
        self.tasklet = greenlet(self.live)  # each animal runs its live() loop in its own greenlet
    def live(self,nature):
        global global_counter
        while True:
            self.limbs = random.randint(1, 10)
            global_counter += 1
            if global_counter > 1000:
                break
            # hand control to a randomly picked animal's greenlet
            random.sample(nature.animals,1)[0].tasklet.switch(nature)
class Nature():
    def __init__(self,how_many):
        self.animals = []
        for i in range(how_many):
            Animal(self)
        print str(how_many) + " animals created."
        self.animals[0].live(self)  # start the switching chain in the current process
The result is:
>>> import greentest
>>> habitat = greentest.Nature(8)
8 animals created.
>>> habitat.animals[0].limbs
3
>>> greentest.global_counter
1002
It works as expected: the values of limbs and global_counter are changed (non-zero).
But when I add multiprocessing to the equation:
greentest2.py:
import random
import multiprocessing
from greenlet import greenlet
global_counter = 0
class Animal():
    def __init__(self,nature):
        self.limbs = 0
        nature.animals.append(self)
        self.tasklet = greenlet(self.live)
    def live(self,nature):
        global global_counter
        while True:
            self.limbs = random.randint(1, 10)
            global_counter += 1
            if global_counter > 1000:
                break
            random.sample(nature.animals,1)[0].tasklet.switch(nature)
class Nature():
    def __init__(self,how_many):
        self.animals = []
        for i in range(how_many):
            Animal(self)
        print str(how_many) + " animals created."
        #self.animals[0].live(self)
        jobs = []
        for i in range(2):
            # each Process gets its own copy of self (and of global_counter)
            p = multiprocessing.Process(target=self.animals[0].live, args=(self,))
            jobs.append(p)
            p.start()
The result is not as expected:
>>> import greentest2
>>> habitat = greentest2.Nature(8)
8 animals created.
>>> habitat.animals[0].limbs
0
>>> greentest2.global_counter
0
Both the values of limbs and global_counter are unchanged (zero). I think this is because the instances of the Animal class and global_counter are not shared between processes. So how can I share this instance of the Nature class, or these instances of the Animal class, between processes?
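To illustrate what I mean by "not shared", here is a minimal sketch (the file name and the Box class are made up for this question): a plain Python object mutated inside a child process stays unchanged in the parent, because each process works on its own copy.
copy_demo.py:
import multiprocessing
class Box(object):
    def __init__(self):
        self.value = 0
def mutate(box):
    box.value = 42                        # modifies only the child's copy
    print("child sees: %d" % box.value)   # prints 42
if __name__ == "__main__":
    box = Box()
    p = multiprocessing.Process(target=mutate, args=(box,))
    p.start()
    p.join()
    print("parent sees: %d" % box.value)  # still prints 0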
ADDITION, according to @noxdafox's answer:
greentest3.py:
import random
import multiprocessing
from greenlet import greenlet
global_counter = multiprocessing.Value('i', 0)  # shared integer ('i' = signed int), visible to all processes
class Animal():
    def __init__(self,nature):
        self.limbs = 0
        nature.animals.append(self)
        self.tasklet = greenlet(self.live)
    def live(self,nature):
        global global_counter
        while True:
            self.limbs = random.randint(1, 10)
            global_counter.value += 1
            if global_counter.value > 1000:
                break
            random.sample(nature.animals,1)[0].tasklet.switch(nature)
class Nature():
    def __init__(self,how_many):
        self.animals = []
        for i in range(how_many):
            Animal(self)
        print str(how_many) + " animals created."
        #self.animals[0].live(self)
        jobs = []
        for i in range(2):
            p = multiprocessing.Process(target=self.animals[0].live, args=(self,))
            jobs.append(p)
            p.start()
and then the result is:
>>> import greentest3
>>> habitat = greentest3.Nature(8)
8 animals created.
>>> habitat.animals[0].limbs
0
>>> greentest3.global_counter.value
1004
I was already aware that global_counter can be shared with this method, since it is an integer, but I'm actually asking how to share the instances of the Nature and Animal classes between processes.
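For illustration, this is the kind of thing I have been looking at, only as a sketch with made-up names (manager_demo.py, AnimalManager, worker, the getter/setter methods), not a working solution for my case: multiprocessing.managers.BaseManager can expose a custom class through a proxy, so that method calls made in a child process are forwarded to the one real instance living in the manager process. Plain attribute access (animal.limbs) does not go through such a proxy by default, and I don't see how to extend this to the whole nested Nature/Animal structure together with the greenlet switching above, which is exactly what I am asking.
manager_demo.py:
import multiprocessing
from multiprocessing.managers import BaseManager
class Animal(object):
    def __init__(self):
        self.limbs = 0
    def set_limbs(self, n):     # attribute changes must go through methods on a proxy
        self.limbs = n
    def get_limbs(self):
        return self.limbs
class AnimalManager(BaseManager):
    pass
AnimalManager.register('Animal', Animal)  # proxies for Animal can now be created
def worker(animal):
    animal.set_limbs(5)         # the call is forwarded to the manager process
if __name__ == "__main__":
    manager = AnimalManager()
    manager.start()
    shared_animal = manager.Animal()      # a proxy, usable from any process
    p = multiprocessing.Process(target=worker, args=(shared_animal,))
    p.start()
    p.join()
    print(shared_animal.get_limbs())      # prints 5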