I am using Squish to automate a Qt-based GUI application. I look up Qt objects in the application recursively. Since this is time-intensive, I would like to cache objects once they are found, for later reuse. I have the class below to maintain a cache of objects in a dictionary:
class ObjectCache:
    
    def __init__(self):
        self.object_store = {}
    
    @staticmethod
    def instance():
        if '_instance' not in ObjectCache.__dict__:
            ObjectCache._instance = ObjectCache()
        return ObjectCache._instance
    
    def set(self, object_name, obj):
        self.object_store[object_name] = obj
    
    def remove(self, object_name):
        del self.object_store[object_name]
    
    def exists(self, object_name):
        return object_name in self.object_store
    
    def get(self, object_name):
        return self.object_store[object_name]
        
    def get_all(self):
        return self.object_store
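For context, this is the singleton behavior the `instance()` method gives me. The sketch below repeats a condensed copy of the class so it runs standalone; the entry name `"loginButton"` and the stored value are made up for illustration:

```python
# Condensed copy of the ObjectCache class above, so this demo runs standalone.
class ObjectCache:
    def __init__(self):
        self.object_store = {}

    @staticmethod
    def instance():
        # Lazily create the single shared instance on first access.
        if '_instance' not in ObjectCache.__dict__:
            ObjectCache._instance = ObjectCache()
        return ObjectCache._instance

    def set(self, object_name, obj):
        self.object_store[object_name] = obj

    def get(self, object_name):
        return self.object_store[object_name]

# Every call to instance() yields the same object, so values stored by one
# helper are visible to any other code in the same interpreter.
cache_a = ObjectCache.instance()
cache_b = ObjectCache.instance()
cache_a.set("loginButton", "<qt-object-reference>")  # hypothetical entry
same_instance = cache_a is cache_b
fetched = cache_b.get("loginButton")
```

This sharing only holds within one interpreter, which is exactly what breaks when Squish starts each test script fresh.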
I have the decorator below for functions in my automation scripts to add/access/delete entries in this dictionary:
def object_caching_decorator(func):
    def wrapper(*args, **kwargs):
        object_cache = ObjectCache.instance()
        if object_cache.exists(func.__name__):
            cached = object_cache.get(func.__name__)
            try:
                # Squish's waitForObject raises LookupError if the cached
                # reference is no longer valid.
                if waitForObject(cached):
                    return cached
            except LookupError:
                object_cache.remove(func.__name__)
        obj = func(*args, **kwargs)
        object_cache.set(func.__name__, obj)
        return obj
    return wrapper
One might ask: why can't all scripts simply share this class object? Because Squish resets the global symbol table before starting every test script, so I need a way to persist this object across scripts.
How do I keep this class alive so that scripts running in another process (the Squish runner) can access it seamlessly?