Assuming m <= n, can you reduce O(nm) to O(n)?
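For concreteness, the kind of O(nm) work I have in mind is a doubly nested loop like the hypothetical sketch below (the function itself is made up; only its shape matters):

```python
def count_equal_pairs(a, b):
    """Count pairs (i, j) with a[i] == b[j] -- a typical O(n*m) computation."""
    n, m = len(a), len(b)      # assume m <= n
    count = 0
    for i in range(n):         # n iterations
        for j in range(m):     # m iterations each, so n*m steps in total
            if a[i] == b[j]:
                count += 1
    return count
```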
I would think not, since m can be equal to n, in which case O(nm) becomes O(n^2).
In the case of m < n, I would think that O(nm) can be reduced to O(n), since m is strictly less than n and hence becomes insignificant as n approaches infinity.
Am I correct in my assumptions above? If so, why? What is a more formal way of showing this?
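For reference, the definition I am trying to work from is the standard single-parameter big-O definition (written in LaTeX below); I am not sure how it applies when the bound involves both n and m:

```latex
% Standard definition of big-O (single parameter):
% f is O(g) iff f is eventually bounded above by a constant multiple of g.
f(n) \in O(g(n)) \;\iff\; \exists\, c > 0,\ \exists\, n_0 \in \mathbb{N},\ \forall n \ge n_0:\ f(n) \le c \cdot g(n)
```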