PyTorch LR Scheduler - Adjust The Learning Rate For Better Results
In this PyTorch Tutorial we learn how to use a Learning Rate (LR) Scheduler to adjust the LR during training.
#more
Models often benefit from reducing the learning rate once learning stagnates, so applying a scheduler can lead to better results. We will go over the different scheduling methods, and I'll show some code examples that apply a scheduler during training.
`torch.optim.lr_scheduler` provides several methods to adjust the learning rate based on the number of epochs. In addition, `torch.optim.lr_scheduler.ReduceLROnPlateau` allows dynamic learning rate reduction based on some validation measurement.
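As a minimal sketch of an epoch-based scheduler, here is `StepLR` applied to a dummy linear model (the model, optimizer, and hyperparameters are placeholders chosen just for illustration):

```python
import torch
import torch.nn as nn

# dummy model and optimizer, just to demonstrate the scheduler API
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# StepLR multiplies the LR by gamma every step_size epochs:
# lr = 0.1 for epochs 0-29, 0.01 for epochs 30-59, 0.001 afterwards
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # your training code goes here
    optimizer.step()       # update the weights first ...
    scheduler.step()       # ... then adjust the LR once per epoch
    print(epoch, scheduler.get_last_lr())
```

Note that `scheduler.step()` is called after `optimizer.step()`, once per epoch.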
Documentation:
https://pytorch.org/docs/stable/optim.html
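And here is a sketch of `ReduceLROnPlateau`; the tiny regression task is made up purely so the loss converges and plateaus. Unlike the epoch-based schedulers, its `step()` call takes the metric you want to monitor:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# reduce the LR by a factor of 0.1 once the monitored metric
# has stopped improving for `patience` epochs
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=10)

# made-up regression data so the loss actually plateaus
inputs = torch.randn(64, 10)
targets = torch.randn(64, 1)
criterion = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()

    # in a real setup you would pass a validation metric here
    scheduler.step(loss.item())
    print(epoch, optimizer.param_groups[0]['lr'])
```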