How to disable logging from PyTorch-Lightning logger?

The logger in PyTorch Lightning prints information about the model to be trained (or evaluated) and the progress of the training.

However, in my case I would like to hide all messages from the logger so that they do not flood the output in the Jupyter notebook.

I've looked into the API of the Trainer class on the official docs page https://pytorch-lightning.readthedocs.io/en/latest/common/trainer.html#trainer-flags and it seems like there is no option to turn off the messages from the logger.

There is a parameter log_every_n_steps which can be set to a large value, but nevertheless the logging output after each epoch is still displayed.

How can one disable the logging?

Revisory answered 16/8, 2021 at 18:52 Comment(0)

I am assuming that two things are particularly bothering you in terms of flooding the output stream:

First, the "weight summary":

  | Name | Type   | Params
--------------------------------
0 | l1   | Linear | 100 K 
1 | l2   | Linear | 1.3 K 
--------------------------------
...

Second, the progress bar:

Epoch 0:  74%|███████████   | 642/1874 [00:02<00:05, 233.59it/s, loss=0.85, v_num=wxln]

PyTorch Lightning provides clear and elegant solutions for turning them off: Trainer(progress_bar_refresh_rate=0) disables the progress bar, and Trainer(weights_summary=None) disables the weight summary.
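For illustration, a minimal sketch of a Trainer configured this way (assuming an older PyTorch Lightning release where both flags still exist; they were later deprecated and removed, as the comments below note):

import pytorch_lightning as pl

# Older releases: 0 disables progress bar updates, None disables the
# "| Name | Type | Params" weight summary table.
trainer = pl.Trainer(
    progress_bar_refresh_rate=0,
    weights_summary=None,
)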

Intermediacy answered 18/8, 2021 at 14:21 Comment(2)
progress_bar_refresh_rate seems to be deprecated now. It is recommended to use enable_progress_bar=False, or to add a TQDMProgressBar callback and adapt its refresh_rate. – Downfall
This doesn't work anymore; there is no weights_summary flag. – Linguistics
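A minimal sketch of the callback-based alternative mentioned in the comment above (TQDMProgressBar lives in pytorch_lightning.callbacks in newer releases; the refresh_rate value of 500 is an arbitrary example):

import pytorch_lightning as pl
from pytorch_lightning.callbacks import TQDMProgressBar

# Keep the progress bar, but refresh it only every 500 batches to
# reduce the amount of output written to the notebook.
trainer = pl.Trainer(callbacks=[TQDMProgressBar(refresh_rate=500)])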

Maybe try something like this?

logging.getLogger("package").propagate = False
Arbitrary answered 16/8, 2021 at 19:23 Comment(1)
Thanks for the suggestion. However, the problem still remains. – Revisory

The solution was a combination of @Artyrm Sergeev's suggestion and the answer suggested here: https://mcmap.net/q/87632/-how-do-you-suppress-output-in-jupyter-running-ipython.

  1. Get all pytorch_lightning loggers:

    import logging
    pl_loggers = [logging.getLogger(name) for name in logging.root.manager.loggerDict if 'pytorch_lightning' in name]

  2. Put the trainer.fit call inside the following construction (a combined sketch follows after these steps):

    from IPython.utils import io
    with io.capture_output() as captured: trainer.fit(...)
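Putting the two steps together, a minimal sketch (assuming the IPython.utils.io import used by the linked answer is available in your IPython version, and that model stands for an already-defined LightningModule):

import logging
import pytorch_lightning as pl
from IPython.utils import io

# 1. Stop every pytorch_lightning logger from propagating its records
#    to the root logger (the propagate=False suggestion above).
for name in list(logging.root.manager.loggerDict):
    if 'pytorch_lightning' in name:
        logging.getLogger(name).propagate = False

# 2. Swallow whatever is still written to stdout/stderr during training.
trainer = pl.Trainer()
with io.capture_output() as captured:
    trainer.fit(model)  # "model" is a hypothetical LightningModule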

Revisory answered 17/8, 2021 at 6:39 Comment(0)

In newer versions, the model summary and progress reporting can be suppressed using these two Trainer arguments:

enable_progress_bar (Optional[bool]) – Whether to enable the progress bar by default. Default: True.

enable_model_summary (Optional[bool]) – Whether to enable model summarization by default. Default: True.

See the Trainer API for more details.
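For example, a minimal sketch using these two arguments (whether the package is imported as pytorch_lightning or lightning.pytorch depends on the installed version):

import pytorch_lightning as pl  # or: import lightning.pytorch as pl

# Newer API: turn off the progress bar and the model summary table.
trainer = pl.Trainer(
    enable_progress_bar=False,
    enable_model_summary=False,
)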

However, I'm still wondering how to suppress these device-usage logs:

GPU available: False, used: False
TPU available: False, using: 0 TPU cores
HPU available: False, using: 0 HPUs

---Update #1---

It turns out that, to disable the device availability messages above, you can use:

import logging
logging.getLogger("lightning.pytorch.utilities.rank_zero").setLevel(logging.FATAL)
Example answered 13/8 at 13:58 Comment(0)
