Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

Coverage Diff (main vs. #702):
Coverage: 64.17% (unchanged)
Files: 61 (unchanged)
Lines: 5892 (unchanged)
Hits: 3781 (unchanged)
Misses: 2111 (unchanged)
I would recommend reverting the changes to this notebook.
"Using Learning Rate Schedulers" as the title - avoid the "LR".
In the first sentence, let's go with "... configure simple learning rate (LR) schedulers with ..."
I would recommend putting the last sentence in a "Note" comment toward the end of the notebook to avoid making the user wonder about it before they even get to the notebook.
Second line: spelling error, "CIFAR10 dataset". Cut the "For more information" part.
Instead of "Assuming everything is working properly", let's go with "By using the default learning rate scheduler, the resulting ..."
Would it be worth doing this with a different scheduler? CosineAnnealing is popular: https://docs.pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.CosineAnnealingLR.html
Alternatively, perhaps we could do this in two sections.
1) Modifying parameters for the default scheduler
2) Defining a different scheduler
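For reference, a minimal plain-PyTorch sketch of those two options (this is not Hyrax's actual configuration interface; the model, optimizer, and epoch count are placeholders):

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import ExponentialLR, CosineAnnealingLR

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# 1) Modify a parameter of the default ExponentialLR scheduler (here, gamma).
exp_scheduler = ExponentialLR(optimizer, gamma=0.9)

# 2) Define a different scheduler entirely, e.g. CosineAnnealingLR over 20 epochs.
#    (In practice you would pick one of the two, not create both.)
cosine_scheduler = CosineAnnealingLR(optimizer, T_max=20)

for epoch in range(20):
    # ... train for one epoch ...
    optimizer.step()
    cosine_scheduler.step()  # advance the learning rate once per epoch
    print(epoch, cosine_scheduler.get_last_lr())
```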
Also please add the new notebook to the …
Still worth changing the heading for this section I think. Perhaps "Specify Learning Rate Parameters"?
Maybe a markdown section after this like:
"Now that we've updated the parameters, we'll retrain and infer with the model."
I don't want to be overly picky, but can we add the regex "train" so that we only see the training results in the list? Also, I think that "Ignore outliers" might be selected, since you can't see the last two epochs for the gamma=0.9 run. Might be worth changing the colors to be more distinct, and calling out that orange is the default and red is the gamma=0.9 run.
I really like the way that you have the two training runs separated here. Definitely keep that. In the second TensorBoard screenshot, let's layer on the new CosineAnnealing as well. Similar to above, call out in a comment what run each color corresponds to.
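If it helps when regenerating the screenshots, here is a hypothetical sketch of how the gamma=0.9 run's learning rate could be logged to its own TensorBoard run directory so it can be filtered (e.g. with the "train" regex) and compared against the default run. The run name, directory layout, and tag are assumptions, not what Hyrax actually writes:

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import ExponentialLR
from torch.utils.tensorboard import SummaryWriter

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = ExponentialLR(optimizer, gamma=0.9)  # the gamma=0.9 run

# One log directory per run; a name containing "train" matches the regex filter.
writer = SummaryWriter(log_dir="runs/train_gamma_0.9")
for epoch in range(20):
    # ... train for one epoch ...
    optimizer.step()
    scheduler.step()
    writer.add_scalar("train/learning_rate", scheduler.get_last_lr()[0], epoch)
writer.close()
```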
Hyrax offers support for more than just ExponentialLR; here we demonstrate this with CosineAnnealingLR. Any other PyTorch learning rate scheduler can be used in the same way. The list of other schedulers can be seen here: https://docs.pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.LRScheduler.html.
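To make that point concrete, a rough plain-PyTorch sketch (not Hyrax's configuration syntax; the scheduler name and kwargs are illustrative) showing that any scheduler in torch.optim.lr_scheduler can be swapped in by name:

```python
import torch
from torch import nn

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Hypothetical name/kwargs pair; in practice these would come from configuration.
scheduler_name = "CosineAnnealingLR"
scheduler_kwargs = {"T_max": 20}

# Look the class up in torch.optim.lr_scheduler and instantiate it.
scheduler_cls = getattr(torch.optim.lr_scheduler, scheduler_name)
scheduler = scheduler_cls(optimizer, **scheduler_kwargs)

for epoch in range(20):
    # ... train for one epoch ...
    optimizer.step()
    scheduler.step()
```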
In the same way that you called out that the model performance was worse with gamma=0.9, you could do so here, noting that the goal is to demonstrate the use of different learning rate schedulers rather than to tune the model for optimal performance.
Adds a notebook demonstrating the use of an LR scheduler.