For the first time in my 7 years of development experience, I have come across a use case where I need to update thousands of rows at a time in a MySQL database. I'm using Knex.js as the query builder in Node.js.
My query is as follows:
await db.transaction(trx => {
  // One UPDATE per tuple, all inside a single transaction
  const queries = collection.map(tuple => {
    return db('table')
      .where('id', tuple.id) // match each row by its own id
      .update(tuple)
      .transacting(trx);
  });
  return Promise.all(queries)
    .then(trx.commit)
    .catch(trx.rollback);
});
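To keep `Promise.all` from firing every query at once, I have been considering splitting the collection into fixed-size batches and committing one transaction per batch. This is only a sketch of the helper I have in mind (the function name and batch size are my own, not from Knex):

```javascript
// Split a large collection into fixed-size chunks so each
// transaction only carries a bounded number of updates.
function chunk(collection, size) {
  const chunks = [];
  for (let i = 0; i < collection.length; i += size) {
    chunks.push(collection.slice(i, i + size));
  }
  return chunks;
}
```

Each chunk would then be mapped to one transaction like the one above, e.g. `for (const batch of chunk(collection, 1000)) { ... }`, so memory and connection usage stay bounded.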
Now suppose my collection is an array of a million tuples: the loop above takes forever to complete, and a user can't stay on the site that long. I am open to third-party tools such as Redis or Elasticsearch, although I doubt a cache helps in this use case; I have never used one before, but any solution is fine as long as I can insert/update thousands (if not millions) of rows within seconds.
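One direction I have been exploring (untested, so treat it as a sketch) is collapsing each batch into a single `INSERT ... ON DUPLICATE KEY UPDATE` statement, which MySQL supports when `id` is the primary key, so the whole batch is applied in one round trip. The helper below only builds the SQL and bindings; the table and column names are from my example above, and it assumes every tuple has the same columns:

```javascript
// Build one INSERT ... ON DUPLICATE KEY UPDATE statement for a
// batch of rows. Assumes `id` is the primary key and all tuples
// share the same set of columns.
function buildBulkUpsert(table, tuples) {
  const columns = Object.keys(tuples[0]);
  // One "(?, ?, ...)" group per tuple
  const placeholders = tuples
    .map(() => `(${columns.map(() => '?').join(', ')})`)
    .join(', ');
  // On key collision, overwrite every non-key column with the new value
  const updates = columns
    .filter(col => col !== 'id')
    .map(col => `${col} = VALUES(${col})`)
    .join(', ');
  const sql =
    `INSERT INTO ${table} (${columns.join(', ')}) VALUES ${placeholders} ` +
    `ON DUPLICATE KEY UPDATE ${updates}`;
  // Flatten tuples into a bindings array in column order
  const bindings = tuples.flatMap(t => columns.map(col => t[col]));
  return { sql, bindings };
}
```

The resulting `{ sql, bindings }` pair could then be passed to `knex.raw(sql, bindings)` inside the transaction, one call per batch instead of one per row.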
Clarity of what I want
In the end, I am sending JSON (containing the updated data of different records) from Postman to the Node.js backend, which results in a wait of several minutes. I want to reduce this to a couple of seconds (2-8 seconds). Any suggestion is appreciated. Thank you.
