I am preparing for my exam and having a little trouble with run-time analysis. I have two methods below whose run times I am confused about:
public boolean findDuplicates(String[] arr) {
    Hashtable<String, String> h = new Hashtable<String, String>();
    for (int i = 0; i < arr.length; i++) {
        if (h.get(arr[i]) == null)
            h.put(arr[i], arr[i]);   // first time seeing this value
        else
            return true;             // value seen before: duplicate
    }
    return false;
}
Assuming that the hash function takes only O(1) on any key, would the run time simply be O(n), since in the worst case we run through the entire array? Am I thinking along the right lines here, given that each hash operation takes constant time to evaluate?
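To check my reasoning, I wrote a rough sketch that counts the hash-table operations instead of timing them (the countOps method and the "key" + i test data are my own additions, just for illustration). If each get and put really is O(1), the total count should grow linearly with n:

import java.util.Hashtable;

public class DuplicateOps {
    // Same logic as findDuplicates, but counts get/put calls,
    // assuming each hash-table operation costs O(1).
    static int countOps(String[] arr) {
        Hashtable<String, String> h = new Hashtable<String, String>();
        int ops = 0;
        for (int i = 0; i < arr.length; i++) {
            ops++;                       // one get per iteration
            if (h.get(arr[i]) == null) {
                ops++;                   // one put when the value is new
                h.put(arr[i], arr[i]);
            } else {
                return ops;              // duplicate found, early exit
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000}) {
            String[] arr = new String[n];
            for (int i = 0; i < n; i++)
                arr[i] = "key" + i;      // all distinct: worst case, no early exit
            System.out.println("n = " + n + " -> ops = " + countOps(arr));
        }
    }
}

With all-distinct input this prints ops = 2n (one get and one put per element), which looks like O(n) to me; the early return on a duplicate can only make it cheaper.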
The other problem seems much more complicated, and I don't know exactly how to approach it. Assume these are ArrayLists.
public boolean makeTranslation(List<Integer> lst1, List<Integer> lst2) {
    // both lst1 and lst2 are the same size, and the size is positive
    int shift = lst1.get(0) - lst2.get(0);
    for (int i = 1; i < lst1.size(); i++)
        if ((lst1.get(i) - lst2.get(i)) != shift)
            return false;
    return true;
}
In this case, the get operations should be constant time, since we are simply retrieving the value at a particular index of an ArrayList. But in the for loop, we are both comparing against shift and iterating over all the elements. How exactly does this translate to a run time?
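I tried the same counting trick here (again, the countOps method and the test data are just my own sketch, assuming ArrayList.get is O(1)). Each iteration does two gets, one subtraction, and one comparison, so the work per iteration is a constant and the loop runs n - 1 times:

import java.util.ArrayList;
import java.util.List;

public class TranslationOps {
    // Same logic as makeTranslation, but counts the list accesses,
    // assuming each ArrayList.get(i) costs O(1).
    static int countOps(List<Integer> lst1, List<Integer> lst2) {
        int ops = 2;                     // two gets to compute the initial shift
        int shift = lst1.get(0) - lst2.get(0);
        for (int i = 1; i < lst1.size(); i++) {
            ops += 2;                    // two gets per iteration; the subtraction
                                         // and comparison are O(1) extras
            if ((lst1.get(i) - lst2.get(i)) != shift)
                return ops;              // mismatch, early exit
        }
        return ops;
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000}) {
            List<Integer> a = new ArrayList<Integer>();
            List<Integer> b = new ArrayList<Integer>();
            for (int i = 0; i < n; i++) {
                a.add(i + 5);            // constant shift of 5 everywhere:
                b.add(i);                // worst case, no early exit
            }
            System.out.println("n = " + n + " -> ops = " + countOps(a, b));
        }
    }
}

This prints ops = 2n, so my guess is that a constant amount of work per iteration times n - 1 iterations still collapses to O(n), and the comparison against shift adds nothing that grows with n. I would like to confirm that reasoning.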
A helpful explanation would be much appreciated, since I have a harder time understanding run-time analysis than anything else in this course, and my final is next week.