I'm trying to write 15 MB worth of data to an RDS instance running PostgreSQL, but I'm finding it really slow: it took 15+ minutes to write all the data to the instance. Has anyone experienced this when writing a lot of data row by row to an RDS instance? Thank you!
import csv

# Assuming the table is already created
def handler(file_with_many_many_rows, con):
    cur = con.cursor()
    reader = csv.reader(file_with_many_many_rows)
    insert_query = "INSERT INTO table_test VALUES (%s, %s, %s);"
    for line in reader:
        # One round trip to the database per row
        cur.execute(insert_query, line)
    con.commit()
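The loop above issues one network round trip per INSERT, and with an RDS instance that per-statement latency usually dominates the total time. Batching many rows per call (or, fastest of all on PostgreSQL, streaming the file with `COPY`) typically cuts the time dramatically. Here is a minimal sketch of the batching idea, written against the stdlib `sqlite3` module so it runs anywhere; the table name, column count, and `batch_size` are hypothetical, and with psycopg2 you would use `%s` placeholders and `cursor.executemany`, `psycopg2.extras.execute_values`, or `cursor.copy_expert` instead:

```python
import csv
import io
import sqlite3

def batched(rows, size):
    """Yield lists of up to `size` rows from an iterable of rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def bulk_insert(csv_file, con, batch_size=1000):
    """Insert CSV rows in batches, committing once at the end."""
    cur = con.cursor()
    reader = csv.reader(csv_file)
    for batch in batched(reader, batch_size):
        # One executemany call per batch instead of one execute per row.
        cur.executemany("INSERT INTO table_test VALUES (?, ?, ?)", batch)
    con.commit()

# Demo with an in-memory database and a small fake CSV file.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE table_test (a TEXT, b TEXT, c TEXT)")
fake_csv = io.StringIO("1,foo,bar\n2,baz,qux\n3,spam,eggs\n")
bulk_insert(fake_csv, con)
print(con.execute("SELECT COUNT(*) FROM table_test").fetchone()[0])  # → 3
```

For 15 MB of data, `COPY table_test FROM STDIN WITH CSV` via psycopg2's `copy_expert` is usually the fastest path of all, since it streams the whole file in a single command.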