Consider the following hypothetical code:
class B(object):
    def __init__(self):
        self.b = 2
    def foo(self):
        out1 = [eval('self.b')]    # ok
        print(out1)                # prints: [2]
        out2 = [eval(cmd) for cmd in ['self.b']]    # raises NameError: name 'self' is not defined
        print(out2)                                 # never reached
b = B()
b.foo()
Why does the statement for out1 work, while the one for out2 fails with the error "name 'self' is not defined"?
I am learning Python, and I came across this problem whilst experimenting with eval. Yes, I know the use of eval in this example is inappropriate, but taking the example at face value, can someone explain why the statement for out2 raises this error? It seems to me that both statements should work and give the same result.
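For what it's worth, I found that passing the instance to eval explicitly makes the comprehension work, so the problem seems to be that eval cannot see the name self from inside the comprehension. A minimal sketch of what I tried (the method name bar and the extra dict argument are my own additions, not part of the original code):

def bar(self):
    # supplying an explicit namespace to eval, instead of relying on
    # the surrounding scope, works even inside the comprehension
    out3 = [eval(cmd, {'self': self}) for cmd in ['self.b']]
    print(out3)    # prints: [2]

Adding this method to B and calling b.bar() prints [2] as expected, but I still do not understand why the original version fails.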
Thank you for any guidance.