I am calling a function that starts a long-running process in which many different things are done. This function chiefly handles instances of a particular class, Item. These items are categorized by three attributes: category1, category2 and category3.
Now, there is a different model, Rule, that applies rules to these categories through three many-to-many fields: categories1, categories2 and categories3. A rule applies to an Item via its categories; when more than one rule matches the same item, only one of them should be applied. Which one is chosen is decided by logic encapsulated in a function:
from django.db import models

class Rule(models.Model):
    warehouse = models.ForeignKey('Warehouse', on_delete=models.CASCADE)
    categories1 = models.ManyToManyField('Category1')
    categories2 = models.ManyToManyField('Category2')
    categories3 = models.ManyToManyField('Category3')

    @staticmethod
    def get_rules_that_applies(item):
        warehouse = item.warehouse  # Item has a FK to Warehouse
        rules = warehouse.rule_set.all()
        if not rules.exists():
            return None
        # ... determine which rule applies to the item by filtering, etc.
        return rule
The issue lies in the get_rules_that_applies method. Every time we need the rule that applies to a certain item (and, again, a great many items are involved in this process), warehouse.rule_set.all() is called and a fresh query is issued.
Since the rules will not change during this process, we could simply cache all the rules of the warehouse, but how? How can I make sure rules = warehouse.rule_set.all() is evaluated only once, and that all subsequent filtering and QuerySet operations on these rules do not hit the database?
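One direction I am considering is to evaluate the queryset once per warehouse and do all the per-item matching in Python. Stripped of the Django specifics, the pattern I have in mind looks like this (a sketch with plain Python stand-ins for the models; FakeRule, FakeWarehouse, fetch_rules and the single-category match are all simplifications I made up for illustration):

```python
from dataclasses import dataclass

# Stand-ins for the Django models, just to exercise the caching pattern.
@dataclass
class FakeRule:
    name: str
    categories1: frozenset  # the category1 values this rule points to

@dataclass
class FakeItem:
    category1: str

class FakeWarehouse:
    """Counts "queries" so we can verify the cache avoids repeated DB hits."""
    def __init__(self, rules):
        self._rules = rules
        self.query_count = 0

    def fetch_rules(self):
        # In real code this would be list(warehouse.rule_set.all())
        self.query_count += 1
        return list(self._rules)

class RuleCache:
    def __init__(self):
        self._cache = {}  # id(warehouse) -> cached rule list

    def rules_for(self, warehouse):
        key = id(warehouse)
        if key not in self._cache:
            self._cache[key] = warehouse.fetch_rules()  # single evaluation
        return self._cache[key]

    def rule_for_item(self, warehouse, item):
        # All matching happens in memory; no further queries are issued.
        for rule in self.rules_for(warehouse):
            if item.category1 in rule.categories1:
                return rule
        return None

wh = FakeWarehouse([FakeRule("r1", frozenset({"a"})),
                    FakeRule("r2", frozenset({"b"}))])
cache = RuleCache()
items = [FakeItem("b"), FakeItem("a"), FakeItem("a")]
matches = [cache.rule_for_item(wh, it).name for it in items]
print(matches, wh.query_count)  # three items matched, one "query"
```

In Django terms, fetch_rules would be list(warehouse.rule_set.all()), possibly with prefetch_related('categories1', 'categories2', 'categories3') so the category lookups are also loaded up front, and the in-memory check would compare the item's categories against the prefetched ones. I am not sure whether this is the idiomatic way to do it, hence the question.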