I'm working on a new optimizer and have managed to work out most of the process. The only thing I'm currently stuck on is finding `gen_training_ops`.
This file apparently matters, because the implementations of both the GradientDescent and Adagrad optimizers use functions imported from a wrapper around `gen_training_ops` (`training_ops.py` in the `python/training` folder).
I can't find this file anywhere, so I suspect I'm misunderstanding something and searching in the wrong place. Where can I find it? (Or, more specifically, the implementations of `apply_adagrad` and `apply_gradient_descent`.)
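For context, here is my understanding of what these two ops compute, so it's clear which implementations I'm after. This is just a pure-Python sketch, not TensorFlow code; the function names mirror the op names, and the Adagrad epsilon term is my assumption:

```python
# Pure-Python sketch of the updates I believe these ops perform
# (illustrative only; epsilon handling in Adagrad is my assumption).

def apply_gradient_descent(var, lr, grad):
    """Plain SGD step: var <- var - lr * grad, elementwise."""
    return [v - lr * g for v, g in zip(var, grad)]

def apply_adagrad(var, accum, lr, grad, eps=1e-10):
    """Adagrad step: accum <- accum + grad^2;
    var <- var - lr * grad / sqrt(accum)."""
    new_accum = [a + g * g for a, g in zip(accum, grad)]
    new_var = [v - lr * g / ((a ** 0.5) + eps)
               for v, g, a in zip(var, grad, new_accum)]
    return new_var, new_accum
```

I want to see how the real (presumably C++-backed) versions of these are implemented, not just the Python call sites.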
Thanks a lot :)