pytorch-lightning: LR finder broken

#614 πŸ› Bug

To Reproduce

Steps to reproduce the behavior:

# configure_optimizers is defined on TestModel (a LightningModule):
def configure_optimizers(self):
    optim = torch.optim.Adam(self.parameters(), lr=self.lr)
    sched = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, 'min')
    return [optim], [sched]

model = TestModel()
trainer = pl.Trainer(gpus=1, default_save_path=exp_path, max_epochs=100)

# Run learning rate finder
lr_finder = trainer.lr_find(model)
# Results can be found in
lr_finder.results
# Plot with
fig = lr_finder.plot(suggest=True)
fig.show()

The following error is raised consistently:

 optimizer got an empty parameter list

The regular .fit method works as expected.

PL version: 0.7.6

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 15 (9 by maintainers)

Most upvoted comments

@Molaire Don’t use prepare_data. Instead, call fit and lr_find with a dataloader argument that you’ve built and initialized outside of the model (I actually like to put a static method on the model for it, just to keep it tidy).