I have a dataframe df1:
Date_1     Date_2       i_count c_book
01/09/2019  02/08/2019  2       204
01/09/2019  03/08/2019  2       211
01/09/2019  04/08/2019  2       218
01/09/2019  05/08/2019  2       226
01/09/2019  06/08/2019  2       234
01/09/2019  07/08/2019  2       242
01/09/2019  08/08/2019  2       251
01/09/2019  09/08/2019  2       259
01/09/2019  10/08/2019  3       269
01/09/2019  11/08/2019  3       278
01/09/2019  12/08/2019  3       288
01/09/2019  13/08/2019  3       298
01/09/2019  14/08/2019  3       308
01/09/2019  15/08/2019  3       319
01/09/2019  16/08/2019  4       330
01/09/2019  17/08/2019  4       342
01/09/2019  18/08/2019  4       354
01/09/2019  19/08/2019  4       366
01/09/2019  20/08/2019  4       379
01/09/2019  21/08/2019  5       392
01/09/2019  22/08/2019  5       406
01/09/2019  23/08/2019  6       420
01/09/2019  24/08/2019  6       435
01/09/2019  25/08/2019  7       450
01/09/2019  26/08/2019  8       466
01/09/2019  27/08/2019  9       483
01/09/2019  28/08/2019  10      500
01/09/2019  29/08/2019  11      517
01/09/2019  30/08/2019  12      535
01/09/2019  31/08/2019  14      554
I want to expand the dataset based on i_count, which is the number of times each row should be replicated. So if i_count = 2, that row should appear 2 times in the expanded dataframe.
I also want to create a new column c_book_i that splits c_book across those replicated rows. For example, if i_count = 2, the new dataframe should have 2 entries for that row, with c_book_i values such that sum(c_book_i) = c_book. The last constraint is that c_book_i > 10 must hold in every row.
So far I have:

import numpy as np
import pandas as pd

def f(x):
    # x is the repeated c_book series for one (Date_1, Date_2) group;
    # all values in the group are equal, so scaling by random fractions
    # that sum to 1 keeps the group total equal to c_book
    i = np.random.random(len(x))
    return i / i.sum() * x

joined_df2 = df1.reindex(df1.index.repeat(df1['i_count']))
joined_df2['c_book_i'] = joined_df2.groupby(['Date_1', 'Date_2'])['c_book'].transform(f)
This gives me the expanded dataframe with the split, but without the check that c_book_i should be greater than 10 — a lot of the resulting values come out below 10.
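One idea I am considering (a sketch, not verified to be the standard approach): reserve a floor of 10 for every replicated row up front, and randomly split only the remainder. This assumes c_book > 10 * i_count for every row, which holds for the data above; `split_with_floor` is a hypothetical helper name.

```python
import numpy as np
import pandas as pd

def split_with_floor(x, floor=10):
    """Split a group's c_book into len(x) parts, each above `floor`.

    x is the repeated c_book series for one (Date_1, Date_2) group,
    so every value in it equals that group's c_book.
    """
    n = len(x)
    total = x.iloc[0]
    remainder = total - floor * n  # assumes total > floor * n
    i = np.random.random(n)
    # each share is the floor plus a positive slice of the remainder
    return floor + i / i.sum() * remainder

# hypothetical two-group example mirroring the data above
df1 = pd.DataFrame({
    'Date_1': ['01/09/2019', '01/09/2019'],
    'Date_2': ['02/08/2019', '03/08/2019'],
    'i_count': [2, 3],
    'c_book': [204, 211],
})

joined_df2 = df1.reindex(df1.index.repeat(df1['i_count']))
joined_df2['c_book_i'] = (
    joined_df2.groupby(['Date_1', 'Date_2'])['c_book'].transform(split_with_floor)
)
```

Each group still sums to its c_book, and every share sits strictly above the floor (except in the measure-zero case where a random draw is exactly 0.0).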
Can anyone help with enforcing that constraint?
Thanks.