Sure, your approach seems fine.
There is an alternative way of looking at the question.
```cpp
for (int i = 0; i < n; i++) {          // loop 1
    for (int j = i; j > 0; j /= 2) {   // loop 2
        std::cout << j << std::endl;
    }
}
```
Analysis of Loop 1
Nothing to see here: it runs n times, so it is O(n).
Analysis of Loop 2
A little more interesting. Given a number j (= i), it is a race to see how many times you can divide by 2 before you get to 0, which is of course nothing but the log base 2 of the number (floor(log2(j)) + 1 iterations, to be exact, for j >= 1).
Combining both analyses: for each value of i, loop 2 performs roughly log2(i) iterations, so the total work is on the order of n * log2(n),
i.e. O(n log n).
By something called Stirling's approximation you can prove that O(log(n!)) is the same as O(n log n). That fits here, because the exact total is the sum of log2(i) over all i, which is log2(n!).
Now, something that needs to be pointed out: you have asked for T(n), which is slightly different from O(n).
T(n) is a recurrence relation that you use to mathematically express the cost of the current step relative to the cost of the remaining steps.
For something like merge sort:

    T(n) = 2 * T(n/2) (sort the two halves) + O(n) (merge)
In our case I would write T(n) as:

    T(n) = T(n + 1) (next step) + O(log n) (race to zero),  for 0 <= n < N
    T(n) = 0,                                               for n >= N
(I'm aware T(n) is typically written in terms of previous terms rather than future ones; I'm just outlining the idea, not giving a rigorous mathematical definition.)