I am having trouble understanding these problems and how to get to the answers. How do you calculate the worst-case run time?
The input to each of the following programs is an array A containing n integers A[1], ..., A[n]. Bound the worst-case run time of each program using big-O notation.
Problem 1:
i = 1, total = 0
while i < n/2:
    total = total + A[i]
    i = i * 2
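
To get a feel for the growth, I translated Problem 1 into a small Python counter (the function name and this translation are mine, not part of the assignment):

def count_problem1(n):
    # Count how many times the loop body runs for a given n.
    i, steps = 1, 0
    while i < n / 2:
        steps += 1
        i *= 2
    return steps

# count_problem1(1000) returns 9, roughly log2(n/2)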
Problem 2:
total = 0
S = the set {1, 2, 3, ..., n}
for each subset T of S:
    for each element x in T:
        total = total + A[x]
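
I also tried brute-forcing the total inner-loop work for small n (my own helper; I'm assuming "each subset" includes the empty set):

from itertools import chain, combinations

def count_problem2(n):
    S = list(range(1, n + 1))
    # Every subset of S, from the empty set up to S itself.
    subsets = chain.from_iterable(combinations(S, k) for k in range(n + 1))
    # One unit of work per element of each subset.
    return sum(len(T) for T in subsets)

# count_problem2(4) returns 32, which matches n * 2**(n - 1)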
Problem 3:
int i = 1, j = 1;
for i = 1 to n:
    while (j < n && A[i] < A[j]):
        j++
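
Since j is never reset between outer iterations, I wrote a quick counter to see how much total work both loops do (my own 0-based translation, not the assignment's code):

def count_problem3(A):
    n = len(A)
    j = 0                                # 0-based version of j = 1
    steps = 0
    for i in range(n):
        steps += 1                       # one outer iteration
        while j < n - 1 and A[i] < A[j]:
            j += 1
            steps += 1                   # one inner increment
    return steps

# count_problem3([5, 4, 3, 2, 1]) returns 9 for n = 5, i.e. about 2n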
The prefix sum of an array of numbers A[1], ..., A[n] is a second array B[1], ..., B[n] where
B[i] = A[1] + A[2] + ... + A[i] (the sum of A[j] for j = 1 to i).
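For example, if A = [3, 1, 4, 1], then B = [3, 4, 8, 9].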
The following two programs calculate the prefix sum:
Problem 4:
for i = 1 to n:
    B[i] = 0
    for j = 1 to i:
        B[i] += A[j]
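
To see how the nested loops add up, I counted the inner additions (again my own helper, not from the assignment):

def count_problem4(n):
    steps = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            steps += 1               # one addition B[i] += A[j]
    return steps

# count_problem4(100) returns 5050, i.e. n * (n + 1) / 2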
Problem 5:
B[1] = A[1]
for i = 2 to n:
    B[i] = B[i-1] + A[i]
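
For comparison, here is my own 0-based Python translation of Problem 5 (it assumes A is non-empty); it does one addition per element:

def prefix_sum(A):
    n = len(A)
    B = [0] * n
    B[0] = A[0]
    for i in range(1, n):
        B[i] = B[i - 1] + A[i]       # reuse the previous prefix sum
    return B

# prefix_sum([3, 1, 4, 1]) returns [3, 4, 8, 9]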