I was asked to code the program below and have it working to a good extent, but I am not able to handle the cases involving 0. Where am I going wrong? Here are the question and my code:
Let 0 represent 'A', 1 represent 'B', and so on. Given a digit sequence, count the number of possible decodings of that sequence.
Input: digits[] = "121" Output: 3 (the possible decodings are "BCB", "BV", "MB"). Similarly, "200" can be interpreted as "caa" or "ua", and "007" has only one decoding ("aah").
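To pin down the rules, here is a small brute-force counter I used to confirm the expected answers. It encodes my reading of the statement (every single digit 0-9 is a letter since 0 maps to 'A', and a two-digit token is valid only for values 10-25, so "00" and "07" do not count as pairs); the function name is my own:

    def count_decodings_bruteforce(digits):
        """Count decodings by trying every split of the string."""
        if digits == "":
            return 1  # one way to decode the empty suffix
        # Take one digit: always a letter, because 0..9 map to 'A'..'J'.
        total = count_decodings_bruteforce(digits[1:])
        # Take two digits: valid only for values 10..25 (no leading zero).
        if len(digits) >= 2 and digits[0] in "12" and int(digits[:2]) <= 25:
            total += count_decodings_bruteforce(digits[2:])
        return total

This agrees with the examples: 3 for "121", 2 for "200", and 1 for "007".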
My code:
def countValidSequences(input_num):
    n = len(input_num)
    number = list(input_num)
    occurance = [0] * (n+1) 
    occurance[0] = 1
    occurance[1] = 1
    for i in range(2, n+1):
        occurance[i] = 0
        if (number[i-1] > '0'):
            occurance[i] += occurance[i-1]
        if (number[i-2] < '2' or (number[i-2] <= '2' and number[i-1] < '6') ):
            occurance[i] += occurance[i-2]
    return occurance[n]
print("Count ",countValidSequences("200"))
print("Count ",countValidSequences("2563"))
print("Count ",countValidSequences("123"))
print("Count ",countValidSequences("99"))
print("Count ",countValidSequences("100200300"))
Output:
Count  1
Count  2
Count  3
Count  1
Count  3
It works fine for inputs without a 0; any idea where I am going wrong?
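For reference, this is roughly how I would expect the recurrence to look under the 0-based mapping (a sketch, not verified against all edge cases; the function name is mine). Since 0 itself maps to 'A', the single-digit term should be added unconditionally, and the two-digit term should fire only for values 10-25:

    def count_valid_sequences_sketch(s):
        """DP sketch assuming 0 -> 'A', ..., 25 -> 'Z'."""
        n = len(s)
        dp = [0] * (n + 1)
        dp[0] = 1
        if n >= 1:
            dp[1] = 1  # a lone digit 0-9 is always a letter
        for i in range(2, n + 1):
            dp[i] = dp[i - 1]  # take s[i-1] alone; always decodable here
            if s[i - 2] in "12" and int(s[i - 2:i]) <= 25:
                dp[i] += dp[i - 2]  # two-digit token with value 10..25
        return dp[n]

With this variant, "200" gives 2 and "007" gives 1, matching the problem statement.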