I am working on a dataset and I applied the to_json() method to export a pandas DataFrame to a JSON file, which I will then upload to MongoDB. However, I realised that the datetime values were converted to epoch timestamps. How do I retain the datetime format while exporting to the JSON file and uploading it to MongoDB? Also, I do not want null fields in the JSON output.
df:
    user_id     datetime
0   69490717    [{'checkin_date': 2021-02-01 00:00:00}]
1   67125777    [{'checkin_date': 2021-02-01 00:00:00}]
2   62747294    NaN
3   63216896    [{'checkin_date': 2021-02-01 00:00:00}]
4   51466797    [{'checkin_date': 2021-01-31 00:00:00}]
... ... ...
96  82758550    NaN
97  44662827    NaN
98  36376189    [{'checkin_date': 2021-01-18 00:00:00}]
99  71910948    [{'checkin_date': 2021-01-18 00:00:00}, {'checkout_date': 2021-01-20 00:00:00}]
100 54620533    NaN
Snippet of the JSON output (the datetimes became epoch milliseconds):
[{"user_id":62507249,"datetime":[{"checkin_date":1612051200000},{"checkout_date":1612051200000}]}, 
{"user_id":69546481,"datetime":[{"checkin_date":1612051200000}]}, ......]
Below is my code for converting to JSON:
# serialise each row, dropping its null fields
jsonresult = df.T.apply(lambda row: row[~row.isnull()].to_json())
json_wrapped = "[%s]" % ",".join(jsonresult)
# write JSON to file
with open('jsonresult.json', 'w') as f:
    f.write(json_wrapped)
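For context, this is a sketch of what I would expect to work: pre-converting the nested Timestamps to ISO-8601 strings before serialising, so to_json / json.dumps never falls back to epoch milliseconds, and dropping null fields per row as my snippet does. The DataFrame below is a toy version of mine (the ids and the helper name stringify_dates are just for illustration):

```python
import json
import pandas as pd

# Toy frame mirroring the structure above (ids made up for the example).
df = pd.DataFrame({
    "user_id": [69490717, 62747294],
    "datetime": [
        [{"checkin_date": pd.Timestamp("2021-02-01")}],
        float("nan"),
    ],
})

def stringify_dates(cell):
    """Convert Timestamps nested in list-of-dict cells to ISO-8601
    strings, so the JSON writer does not emit epoch milliseconds."""
    if isinstance(cell, list):
        return [
            {k: (v.isoformat() if isinstance(v, pd.Timestamp) else v)
             for k, v in d.items()}
            for d in cell
        ]
    return cell  # leave NaN (and anything else) untouched

df["datetime"] = df["datetime"].map(stringify_dates)

# Drop null fields per row, as in the original snippet, then dump once.
records = [row.dropna().to_dict() for _, row in df.iterrows()]
with open("jsonresult.json", "w") as f:
    json.dump(records, f)
```

The resulting file would hold readable strings like "2021-02-01T00:00:00" instead of 1612051200000, and rows whose datetime is NaN simply omit that key. (I gather pandas' to_json also accepts date_format='iso', but I am not sure it reaches Timestamps nested inside list-of-dict cells, hence the explicit conversion here.)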