I'm looking for the object-oriented equivalent of this generator function:
def lazy_gen_func(path):
    for line in open(path):
        for token in line.split():
            yield token
Related answers suggest the following approach:
class eager_gen_obj(object):
    def __init__(self, path):
        with open(path) as f:
            self.text = [token for line in f for token in line.split()]
        self.index = 0
    def __iter__(self):
        return self
    def __next__(self):
        try:
            result = self.text[self.index]
        except IndexError:
            raise StopIteration
        self.index += 1
        return result
The downside is that the full source file has to be loaded into memory when __init__ is called.
How do I create a custom generator object to lazily flatten nested source data?
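A sketch of the shape I imagine (class and attribute names are placeholders): keep the open file handle and an iterator over the current line's tokens as state, and only advance to the next line when the current line's tokens are exhausted:

```python
class LazyGenObj:
    """Iterator yielding whitespace-separated tokens from a file, one line at a time."""
    def __init__(self, path):
        self._file = open(path)
        self._tokens = iter(())  # tokens of the current line, initially empty

    def __iter__(self):
        return self

    def __next__(self):
        while True:
            try:
                return next(self._tokens)
            except StopIteration:
                line = next(self._file, None)  # None signals end of file
                if line is None:
                    self._file.close()
                    raise StopIteration
                self._tokens = iter(line.split())
```

This only ever holds one line in memory, but the hand-written state management is what I'd like to avoid if there's a cleaner idiom.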