I have a model in PyTorch. The model can have any shape, but let's assume it's this one:
import torch
from torch.nn import (Sequential, Flatten, Linear, Dropout,
                      ReLU, BatchNorm1d, Softmax)

torch_model = Sequential(
    Flatten(),
    Linear(28 * 28, 256),
    Dropout(0.4),
    ReLU(),
    BatchNorm1d(256),
    ReLU(),
    Linear(256, 128),
    Dropout(0.4),
    ReLU(),
    BatchNorm1d(128),
    ReLU(),
    Linear(128, 10),
    Softmax(dim=1)
)
I am using the SGD optimizer, and I want to set the gradient of each layer's parameters myself, so that the SGD update moves the parameters in a direction I choose.
Say I want every gradient to be all ones, i.e. torch.ones_like(param) for each parameter — how can I do this?
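For context, here is a minimal sketch of what I'm trying to do, on a smaller stand-in model (the model and lr=0.1 are just example values, not my real setup):

```python
import torch
from torch.nn import Sequential, Flatten, Linear, Softmax

# small stand-in model, just for illustration
model = Sequential(Flatten(), Linear(28 * 28, 10), Softmax(dim=1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

opt.zero_grad()
for p in model.parameters():
    p.grad = torch.ones_like(p)  # set the gradient by hand, no backward() call
opt.step()                       # plain SGD: each parameter moves by -lr * 1
```

Is assigning to p.grad directly like this the right approach, or is there a cleaner way (e.g. hooks)?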
Thanks!