pytorch-lightning: LR finder broken
#614 🐛 Bug
To Reproduce
Steps to reproduce the behavior:
```python
model = TestModel()
trainer = pl.Trainer(gpus=1, default_save_path=exp_path, max_epochs=100)

# In TestModel:
def configure_optimizers(self):
    optim = torch.optim.Adam(self.parameters(), lr=self.lr)
    sched = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, 'min')
    return [optim], [sched]

# Run learning rate finder
lr_finder = trainer.lr_find(model)

# Results can be found in
lr_finder.results

# Plot with
fig = lr_finder.plot(suggest=True)
fig.show()
```
The following error is raised consistently:

```
optimizer got an empty parameter list
```
The regular .fit method works as expected.
PL version: `0.7.6`
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 15 (9 by maintainers)
@Molaire Don't use
`prepare_data` and instead call `fit` and `lr_find` with a dataloader parameter that you've processed and initialized outside of the model (I actually like to put a static method in the model for it just to keep it tidy).
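A minimal sketch of that workaround. To stay self-contained, `TestModel` here subclasses plain `torch.nn.Module` (in the real code it would be a `pl.LightningModule`), and the dataset is random placeholder data; the `lr_find`/`fit` calls with an explicit `train_dataloader` argument follow the 0.7.x `Trainer` API:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Sketch only: in the real code this subclasses pl.LightningModule.
class TestModel(nn.Module):
    def __init__(self, lr=1e-3):
        super().__init__()
        self.lr = lr
        self.layer = nn.Linear(32, 1)

    def forward(self, x):
        return self.layer(x)

    def configure_optimizers(self):
        optim = torch.optim.Adam(self.parameters(), lr=self.lr)
        sched = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, 'min')
        return [optim], [sched]

    # Static helper on the model, as suggested: data setup lives
    # outside prepare_data but stays tidy next to the model code.
    @staticmethod
    def make_dataloader():
        x = torch.randn(256, 32)  # placeholder data
        y = torch.randn(256, 1)
        return DataLoader(TensorDataset(x, y), batch_size=32)

model = TestModel()
loader = TestModel.make_dataloader()

# With a Trainer, pass the dataloader explicitly instead of relying
# on prepare_data (0.7.x API):
#   trainer = pl.Trainer(gpus=1, max_epochs=100)
#   lr_finder = trainer.lr_find(model, train_dataloader=loader)
#   trainer.fit(model, train_dataloader=loader)

# Because the model is fully built before the Trainer touches it,
# the optimizer now sees a non-empty parameter list:
optims, scheds = model.configure_optimizers()
```

Since the dataloader is created before `lr_find` runs, the model's parameters exist when the optimizer is constructed, avoiding the "optimizer got an empty parameter list" failure.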