pytorch-lightning: Error in `load_from_checkpoint` when LightningModule init contains 'hparams'

❓ Questions and Help

What is your question?

Just pulled master today, and load_from_checkpoint no longer works. Is this a backwards-compatibility issue, or do I need to do something differently to load checkpoints now? I’ve provided the HEAD commits of the local repo I’m installing from in the environment section below.

Error in loading

Here’s the error I’m getting now without changing any code:

File "/home/chirag/miniconda3/envs/ml/lib/python3.8/site-packages/pytorch_lightning/core/saving.py", line 153, in load_from_checkpoint
    model = cls._load_model_state(checkpoint, strict=strict, **kwargs)
  File "/home/chirag/miniconda3/envs/ml/lib/python3.8/site-packages/pytorch_lightning/core/saving.py", line 193, in _load_model_state
    model = cls(**_cls_kwargs)
TypeError: __init__() missing 1 required positional argument: 'hparams'

The LightningModule has always had a member variable hparams, so everything should have been saved (and loading did work before this pull), in accordance with the docs on stable: https://pytorch-lightning.readthedocs.io/en/stable/hyperparameters.html#lightningmodule-hyperparameters. From the traceback, _load_model_state now calls cls(**_cls_kwargs) without the saved hparams, so __init__ fails on its required positional argument.
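
For context, here is a minimal sketch of the pattern that worked before the pull (the module, its fields, and the checkpoint path are illustrative, not my actual code):

from argparse import Namespace

import torch
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        # the pattern from the stable docs: assign the Namespace to self.hparams
        self.hparams = hparams
        self.l1 = torch.nn.Linear(28 * 28, hparams.out_dim)


model = MyModel(Namespace(out_dim=10))
# ... train and save a checkpoint, then later:
model = MyModel.load_from_checkpoint("example.ckpt")  # now raises the TypeError above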

What’s your environment?

❯ git rev-parse --short HEAD                                             
1d3c7dc8    # DOES NOT WORK

❯ git rev-parse --short HEAD@{1}                                          
90929fa4    # WORKS OKAY

About this issue

  • State: closed
  • Created 4 years ago
  • Reactions: 2
  • Comments: 27 (25 by maintainers)

Most upvoted comments

Aaah okay, that makes sense. Using your recommended method avoids the offending code altogether here:

https://github.com/PyTorchLightning/pytorch-lightning/blob/5c1eff351b035db5881d0cff81b1d9c9e150e2d0/pytorch_lightning/core/lightning.py#L1631-L1636

So it makes sense that it works. However, the code above doesn’t (and wouldn’t) work for the setup in the reproducible example. If the recommended way is to call save_hyperparameters() explicitly, then assigning to .hparams should be dropped altogether, in my opinion, and the code above removed to avoid unnecessary bloat.
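
In the meantime, load_from_checkpoint forwards any extra keyword arguments to __init__, so required arguments that are missing from the checkpoint can be supplied at load time (a sketch; my_hparams and another_value are placeholders):

model = LitModel.load_from_checkpoint(
    "example.ckpt",
    hparams=my_hparams,
    another_param=another_value,
)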

I want to point out the following in the reproducible code you posted:

from argparse import Namespace  # only needed if the dict conversion below is uncommented

import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, hparams, another_param):
        super().__init__()

        # if isinstance(hparams, dict):
        #     hparams = Namespace(**hparams)

        # this is the recommended Lightning way:
        # it saves everything passed into __init__ and makes it
        # accessible as self.hparams.hparams and self.hparams.another_param
        self.save_hyperparameters()

        # this is optional; only needed if you want self.hparams to be
        # the hparams argument itself, so you can write self.hparams.myparam
        self.hparams = hparams

        # this is optional; save_hyperparameters() already records another_param
        self.another_param = another_param

        self.l1 = torch.nn.Linear(28 * 28, 10)

With save_hyperparameters(), this works smoothly.
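
For completeness, a sketch of the full round trip with save_hyperparameters() (the trainer settings, dataloader, and checkpoint path are placeholders):

trainer = pl.Trainer(max_epochs=1)
trainer.fit(model, train_dataloader)
trainer.save_checkpoint("example.ckpt")

# both init arguments were stored by save_hyperparameters(),
# so nothing extra needs to be passed at load time:
loaded = LitModel.load_from_checkpoint("example.ckpt")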