pytorch-lightning: [Trainer] flush_logs_every_n_steps not working

πŸ› Bug

Hi all! Thanks a lot for this great module 😃

Trainer has this neat argument flush_logs_every_n_steps, and training_loop.py does indeed take it into account before calling logger.save(). Yet the LoggerConnector flushes on every log_metrics call by calling self.trainer.logger.save() unconditionally.

Is that the expected behaviour of flush_logs_every_n_steps? Shouldn’t it be something like

if self.trainer.logger is not None:
    if self.trainer.is_global_zero:
        self.trainer.logger.agg_and_log_metrics(scalar_metrics, step=step)
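        # only flush to disk every flush_logs_every_n_steps steps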
        if self.should_flush_logs:
            self.trainer.logger.save()

in logger_connector.py?
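
For context, the should_flush_logs gate used in the snippet above could be implemented roughly like this. This is an illustrative sketch, not the actual source; it assumes the trainer exposes global_step, flush_logs_every_n_steps and should_stop:

@property
def should_flush_logs(self):
    # Flush every flush_logs_every_n_steps global steps, and once more
    # when training stops so the last metrics are not lost.
    should_flush = (self.trainer.global_step + 1) % self.trainer.flush_logs_every_n_steps == 0
    return should_flush or self.trainer.should_stop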

To Reproduce

Add a print statement (or a counter) inside Logger.save and observe that it is called on every logging step rather than every flush_logs_every_n_steps steps.
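
A minimal script along the following lines shows the effect. It is a sketch assuming the 1.0.x API; TinyModel and the save-wrapping counter are illustrative, not part of Lightning:

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from pytorch_lightning.loggers import CSVLogger


class TinyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        loss = self.layer(batch[0]).mean()
        self.log("loss", loss)  # goes through LoggerConnector.log_metrics
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


logger = CSVLogger("logs")

# Wrap logger.save to count how often the logs are flushed to disk.
save_calls = []
original_save = logger.save

def counting_save():
    save_calls.append(trainer.global_step)
    original_save()

logger.save = counting_save

trainer = pl.Trainer(
    max_epochs=1,
    logger=logger,
    log_every_n_steps=1,
    flush_logs_every_n_steps=16,  # expect ~4 flushes over 64 steps
)
trainer.fit(TinyModel(), DataLoader(TensorDataset(torch.randn(64, 4)), batch_size=1))

# With the bug, save() is called on every logged step, not every 16th.
print(f"logger.save() called {len(save_calls)} times")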

Expected behavior

logger.save() should only be called every flush_logs_every_n_steps steps.

Environment

I’ve tried to reproduce the bug in Colab, but I get an error saying test_tube is not installed even though I did pip install test_tube.

  • CUDA:
    • GPU:
    • available: False
    • version: 10.2
  • Packages:
    • numpy: 1.19.4
    • pyTorch_debug: True
    • pyTorch_version: 1.7.0
    • pytorch-lightning: 1.0.6
    • tqdm: 4.51.0
  • System:
    • OS: Linux
    • architecture:
      • 64bit
    • processor: x86_64
    • python: 3.7.9
    • version: #1 SMP Tue Aug 11 16:36:14 UTC 2020

Cheers, Emile

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 18 (7 by maintainers)

Most upvoted comments

@emilemathieu Will look into it soon 😃 Thanks!

@emilemathieu logger.history is an attribute of a custom logger used in the test suite to record the calls made to the log_metrics method. It’s just for the test. https://github.com/PyTorchLightning/pytorch-lightning/blob/baa8558cc0/tests/loggers/test_all.py#L77

Yeah, I understand why it could be confusing. logger.agg_and_log_metrics and logger.save both end up calling logger.log_metrics: agg_and_log_metrics calls it whenever it has aggregated metrics ready to log, and save flushes whatever is still buffered. We would also need to account for the validation and test modes.
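
To illustrate the relationship, here is a simplified paraphrase of the base-logger aggregation flow, not the exact implementation:

class LoggerBaseSketch:
    """Simplified sketch: why both agg_and_log_metrics and save reach log_metrics."""

    def __init__(self):
        self._pending = {}          # metrics buffered for the current step
        self._pending_step = None

    def agg_and_log_metrics(self, metrics, step):
        if self._pending_step is not None and step != self._pending_step:
            # A new step arrived: emit the aggregated metrics for the previous one.
            self.log_metrics(self._pending, self._pending_step)
            self._pending = {}
        self._pending.update(metrics)
        self._pending_step = step

    def save(self):
        # Flushing also emits any still-buffered metrics,
        # so this path calls log_metrics as well.
        if self._pending:
            self.log_metrics(self._pending, self._pending_step)
            self._pending = {}

    def log_metrics(self, metrics, step):
        raise NotImplementedError  # actual write to the backend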

I think it would be better to do a mini refactor rather than trying to fit a fix into the current flow.