To converge to the optimum properly, different algorithms that use an adaptive learning rate have been invented, such as AdaGrad, RMSProp, and Adam. On the other hand, there are learning rate schedulers, such as power scheduling and exponential scheduling.
However, I don't understand in what situations you should use one over the other. I feel that using an adaptive learning rate optimization algorithm such as Adam is simpler and easier to implement than using a learning rate scheduler.
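To make the comparison concrete, here is a minimal PyTorch sketch of what I mean by the two approaches (the model, data, and hyperparameter values are placeholders I made up for illustration):

```python
import torch

# Placeholder model and data, just to make the sketch runnable
model = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss_fn = torch.nn.MSELoss()

# Option 1: adaptive optimizer -- Adam keeps a per-parameter step size,
# so a single global lr often works out of the box
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Option 2: plain SGD plus an exponential learning rate scheduler,
# which multiplies the global lr by gamma after every epoch
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
# scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # scheduler.step()  # only needed for option 2
```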
So how do you decide which one to use, depending on the kind of problem?