There isn't an actual infinite loop, because the __call__ method is not invoked for every one of those conceptual situations. It is only invoked directly when a function-like call is made on an object that provides a __call__ method.
Normal class instantiation Cls(...) and ordinary function invocation f() are known cases that the interpreter handles directly, without actually invoking __call__(). So only a finite number of __call__ method invocations can ever occur, even in complex cases with deep inheritance, metaclasses, etc.
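To make that concrete, here's a minimal illustration (Greeter is a made-up class): calling an instance invokes __call__ exactly once per call, and the interpreter does not then go hunting for __call__'s own __call__ behind your back.

```python
class Greeter:
    calls = 0  # counts how many times __call__ actually runs

    def __call__(self, name):
        # Invoked exactly once per g(...) call; no further
        # recursive dispatch through __call__ happens.
        Greeter.calls += 1
        return "hello, " + name

g = Greeter()
print(g("world"))                         # hello, world
print(g("world") == g.__call__("world"))  # True: same method either way
print(Greeter.calls)                      # 3: one per explicit call, no extras
```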
Because there was some dispute as to whether the short-circuiting of conceptual infinite loops was really happening, let's look at the disassembled bytecode. Consider the following code:
def f(x):
    return x + 1

class Adder(object):
    def something(self, x):
        return x + 19

    def __call__(self, x):
        return x + 1

def lotsacalls(y):
    u = f(1)
    a = Adder()
    z = u + a.something(y)
    return a(z * 10)
Sorry it's a little complex, as I want to show several instances of short-circuiting: normal def functions, __init__ calls, normal methods, and __call__ special methods. Now:

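You can reproduce the disassembly yourself with the dis module (the exact opcode names vary by CPython version: older releases show CALL_FUNCTION, 3.11+ shows CALL, but the point is the same):

```python
import dis

def f(x):
    return x + 1

class Adder(object):
    def something(self, x):
        return x + 19

    def __call__(self, x):
        return x + 1

def lotsacalls(y):
    u = f(1)
    a = Adder()
    z = u + a.something(y)
    return a(z * 10)

# Every call site -- f(1), Adder(), a.something(y), and a(...) --
# compiles to the same plain call opcode; nowhere does the bytecode
# load or invoke a __call__ attribute.
dis.dis(lotsacalls)
```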
So here are a range of places where, if Python were really, truly "walking the tree" of conceptual __call__ invocations, it would need to look up a Function class (and possibly a Method class) and invoke their __call__ methods. It doesn't. It uses the simple CALL_FUNCTION bytecode in all cases, short-circuiting the conceptual tree-walk. Logically you can imagine a class Function with a __call__ method that is invoked whenever a function (i.e. an instance of the Function class) is called. But it doesn't really work that way. The compiler, the bytecode interpreter, and the rest of the C-language underpinnings do not actually walk metaclass trees. They short-circuit like crazy.
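One way to see that the walk, if it happened, really would be unbounded: in CPython you can follow the __call__ chain by hand as far as you like, because even a bound __call__ method is itself a callable object with its own __call__. The interpreter never does this on its own; a plain f(1) is a single call opcode.

```python
def f(x):
    return x + 1

print(f(1))                             # 2: one call opcode, no tree-walk
print(f.__call__(1))                    # 2: walking one level by hand
print(f.__call__.__call__.__call__(1))  # 2: the chain never bottoms out
```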