Is it possible to perform minibatch gradient descent in sklearn for logistic regression? I know there are the LogisticRegression model and SGDClassifier (which can use the log loss). However, LogisticRegression is fitted on the whole dataset, while SGDClassifier is fitted sample by sample (feel free to correct that statement, but this is how I understand stochastic gradient descent).
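For reference, this is roughly what I mean by the two estimators (just a sketch; the loss name is "log_loss" in recent sklearn versions and "log" in older ones):

```python
from sklearn.linear_model import LogisticRegression, SGDClassifier

# Batch approach: fit() optimizes over the entire dataset at once
clf_batch = LogisticRegression()

# Stochastic gradient descent with the logistic loss
# ("log_loss" in recent sklearn versions, "log" in older ones)
clf_sgd = SGDClassifier(loss="log_loss")
```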
There is also a partial_fit method, but it is only available for SGDClassifier. I believe that if I use partial_fit with SGD, it will update the weights each time it processes the next sample (just like the normal fit method). So if I pass a chunk of 10 samples to partial_fit, it does 10 updates, but that is not what I want. A sketch of what I mean is below.
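In other words, I could already feed the data in chunks like this (the toy X, y and the chunk size are made up for illustration), but as far as I understand each call still performs one update per sample inside the chunk rather than a single update per chunk:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Toy data just for illustration
rng = np.random.RandomState(0)
X = rng.randn(1000, 5)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = SGDClassifier(loss="log_loss")
classes = np.unique(y)

chunk_size = 10
for start in range(0, len(X), chunk_size):
    X_chunk = X[start:start + chunk_size]
    y_chunk = y[start:start + chunk_size]
    # classes must be passed on the first call to partial_fit
    clf.partial_fit(X_chunk, y_chunk, classes=classes)
```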
What I need is to update the weights after every nth sample, just like in minibatch gradient descent. From what I have read, LogisticRegression can use something called warm_start, which means that the weights from the previous fit call are used as the initialization for the current fit.
If this understanding of warm_start is correct, could I just call fit multiple times, each time on a single minibatch, for example as sketched below? Or is there any other way to do minibatch gradient descent in sklearn?
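This is the kind of loop I have in mind (again only a sketch; the toy data, batch size and max_iter are arbitrary, and I am not sure whether repeatedly calling fit on small batches like this really behaves like proper minibatch gradient descent, since each fit still runs the full solver on that batch):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data just for illustration
rng = np.random.RandomState(0)
X = rng.randn(1000, 5)
y = (X[:, 0] - X[:, 2] > 0).astype(int)

# warm_start=True reuses the coefficients from the previous fit call
# as the starting point for the next one; max_iter is kept small here
# so a single batch is not optimized to full convergence
clf = LogisticRegression(warm_start=True, max_iter=5)

batch_size = 50
for start in range(0, len(X), batch_size):
    X_batch = X[start:start + batch_size]
    y_batch = y[start:start + batch_size]
    clf.fit(X_batch, y_batch)
```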
I found this question, which is very similar, except it does not discuss the warm_start idea, which is why I am asking again.