jupyter: Output does not update after closing and reopening the notebook page

Hi,

I am using a library named Keras with Jupyter Notebook. It’s nice that Keras uses a text progress bar to indicate progress, like this:

Train on 381734 samples, validate on 20092 samples
Epoch 1/1
381734/381734 [==============================] - 257s - loss: 0.4094 - val_loss: 0.2863

But the problem is: while the script is running with the progress bar, I can’t close the webpage I used to start the script, or I won’t see the progress bar update anymore. When I open the notebook again while it’s still running, the title becomes something like “(Starting) XXXXXX”, but I can’t see any updates on the page.

The progress bar used by Keras is implemented here: https://github.com/fchollet/keras/blob/b126b6328a44fa3332d2d7fd011da3ff196a669a/keras/utils/generic_utils.py

Can anyone look into this?

Thanks in advance.

About this issue

  • Original URL
  • State: closed
  • Created 9 years ago
  • Reactions: 42
  • Comments: 93 (14 by maintainers)

Most upvoted comments

For anyone looking for a quick workaround:

  • Open the Chrome dev tools
  • Go to the Network tab
  • Click WS (WebSockets)
  • Click on any of the active websockets
  • Go to the Messages tab
  • Click on any of the messages after the first
  • Expand the content; you should see your output there

I am facing the same issue. Is it possible to reopen a notebook and get the output of the running kernel?

Unfortunately no, the cell IDs change whenever the page is refreshed so the mapping between kernel messages and cells is lost. I don’t have time to work on this at the moment but I think it’s an important feature and I will support anyone that wants to take it on.

please. add. this. feature.

My app has been running for hours and I can’t tell how far it is 😕

It’s been more than 4 years since this issue was reported, yet there is still no update.
I feel it should be treated as a must-have core feature of the notebook rather than a nice-to-have.

Closing this issue as it’s been inactive for several months. Please feel free to open a new issue at the jupyter/help repo or reopen this issue. Thanks! 🌻

The issue is still here

Any update on this issue? It is impractical to work this way on remote platforms without output updates after closing a notebook. For example, Google Cloud instances run Jupyter notebooks and the connection gets lost from time to time; the kernel keeps running, but without progress updates.

Could somebody mark this issue as a priority bug? It should really be resolved…

I have the same problem. It’s annoying that we can’t see the progress anymore after closing the tab or getting disconnected for a while, especially during training runs that can last hours or days. A fix for this bug would be very helpful. Thanks.

Same problem

I have the same issue too. Reconnecting the kernel doesn’t help either.

This problem is still unsolved 😕 It’s difficult to use Jupyter with remote environments, having to keep the tab open all the time.

Still an issue! This is critical when working with remote environments like AWS with long-running cells, for example for ML.

Same issue. It would be good if this could be fixed.

I was running NumPy code, not Keras. I closed a notebook tab and opened it again, but it doesn’t show updated output while its kernel is still running. Note that I was running Jupyter in a client-server setup.

My environment is below:

  • Jupyter: 4.2.0
  • Browser: Google Chrome 53.0.2785.154
  • Host OS: Ubuntu 16.04.1 LTS (Xenial Xerus)
  • Client OS: Chrome OS

The early part of my configuration file ~/.jupyter/jupyter_notebook_config.py:

# Configuration file for jupyter-notebook.

#------------------------------------------------------------------------------
# Application(SingletonConfigurable) configuration
#------------------------------------------------------------------------------

## This is an application.

c = get_config()
c.NotebookApp.open_browser = False
c.NotebookApp.password = 'sha1:6cea77f:673802e58d129779298e501a2bc40'
c.NotebookApp.certfile = '/home/raviqqe/mycert.pem'
c.NotebookApp.keyfile = '/home/raviqqe/mykey.key'
c.NotebookApp.ip = '*'
c.NotebookApp.port = 8888

## The date format used by logging formatters for %(asctime)s
#c.Application.log_datefmt = '%Y-%m-%d %H:%M:%S'
...

I’m not sure that this problem is caused by Jupyter or Chrome.

Same problem

PLEASE CREATE A SOLUTION FOR THIS!!!

@sambbhavgarg I’m using PyTorch – by chance, are you aware of an equivalent there? My simple workaround is to set up a Python logger that logs to stdout and to a file. However, I lose the ability to display images inline in the notebook.
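For reference, the stdout-plus-file logger described above takes only a few lines with the standard library. A minimal sketch (the logger name and `train.log` path are placeholders, not anything from this thread):

```python
import logging
import sys

def make_logger(name="train", logfile="train.log"):
    """Create a logger that writes to both stdout and a file,
    so progress survives a closed notebook tab."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    fmt = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
    for handler in (logging.StreamHandler(sys.stdout),
                    logging.FileHandler(logfile)):
        handler.setFormatter(fmt)
        logger.addHandler(handler)
    return logger

logger = make_logger()
logger.info("epoch 1/50 loss=0.409")
```

You can then `tail -f train.log` in a terminal to follow progress after the tab closes.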

We need this feature!!!

Same issue. It’s really frustrating that whenever the notebook is closed or disconnected, the cells are no longer updated. Could you please add this feature soon?

At least for nteract and JupyterLab, that seems like a reasonable thing to store in localStorage, since they both have cell IDs.

7 years lol

same issue 7 years later, what a shame

I kind of love the occasional messages I get from this thread though

I’ve been working on a platform to solve exactly this, called Noteable. You can try it yourself at https://app.noteable.io/


The notebook outputs are updated even if your tabs are closed. You can also open multiple tabs and work on the same document, by yourself or with others. No worries about clobbering your work, and you can let long-running jobs run. Reach out if you’d like wider support at your lab or organization.

Hopefully someday we can bring back this tech into Jupyter open source.

There’s a simple workaround using the IPython %%capture magic.

%%capture cap
# whatever you want to run goes here

Then, in a later cell, save the captured output to a file:

with open('cell_output.txt', 'w') as f:
    f.write(cap.stdout)

When you reopen your notebook, you can either load the text file or just print the variable you saved the output to (e.g. cap.stdout).

Not ideal, since it doesn’t display the contents as the cell runs, but better than nothing, I suppose.

I’d like to rephrase this question as: how can I redirect the output of a Jupyter notebook to a file?

For anyone looking for a quick workaround:

  • Open the Chrome dev tools
  • Go to the Network tab
  • Click WS (WebSockets)
  • Click on any of the active websockets
  • Go to the Messages tab
  • Click on any of the messages after the first
  • Expand the content; you should see your output there

Do you have a similar solution for Firefox?

My guess: when a notebook is closed and reopened, the kernel may continue running, but the cells’ parent header values change. The notebook therefore reconnects to the kernel and resumes receiving messages, but the cell outputs don’t update because the cells don’t recognize the parent headers in those messages.

@rgbkrk Do you concur?

The simple test case:

import time

for i in range(101):
    time.sleep(1)
    print(i)

Then reload the notebook; if you’re logging messages, you’ll see that the stream messages continue to come in, but they’re not appended to any cell’s output.

What I observe:

  • The cell IDs change after reloading or closing/reopening (e.g. Jupyter.notebook.get_cells())
  • The msg.parent_header.msg_id is consistent for stream messages originating from the same cell, so even after the notebook is reloaded or closed/reopened, the parent_id is the same

I’m not sure where in the source messages are mapped to cell outputs (and specifically where the parent_id is used to do that), but I assume it has something to do with the cell’s ID, which changes after a reload or close/reopen.

For anyone looking for a quick workaround:

  • Open the Chrome dev tools
  • Go to the Network tab
  • Click WS (WebSockets)
  • Click on any of the active websockets
  • Go to the Messages tab
  • Click on any of the messages after the first
  • Expand the content; you should see your output there

Another workaround, with nicer output: use the workaround above together with this script in the Chrome dev tools console.

// copy the current websocket URL and change the session_id by appending -2
const socket = new WebSocket('ws://jupyterdomain.com/api/kernels/c0d04b6d-4a6b-4e21-9e5a-52d4cd7c93f7/channels?session_id=a6ac43c6-4211-4699-b1da-43af501df1b1-2');

// Listen for messages and print just the text of each stream message
socket.addEventListener('message', (event) => {
    console.log(JSON.parse(event.data).content.text);
});

We are in 2022, the problem is still unsolved 😦

Schrödinger’s kernel lmao

Whether the process is running as intended or not

Possible workaround for people using Keras in a Jupyter notebook: save your model weights using Keras callbacks every N epochs. If you happen to refresh the window, wait for the saved weights to update, then load the freshly saved weights, compile, and start training again.

from keras.callbacks import ModelCheckpoint

checkpoint_filepath = 'ckpts'
# fit on the first 80% of the data, rounded down to a whole number of batches
fitsize = int(batch_size * ((len(X_train) * 0.8) // batch_size))
model_checkpoint_callback = ModelCheckpoint(
    filepath=checkpoint_filepath,
    save_weights_only=True,
    monitor='val_acc',
    mode='max',
    verbose=1,
    save_best_only=True)
history = model.fit(X_train[:fitsize],
                    y_train[:fitsize],
                    batch_size=batch_size,
                    epochs=50,
                    verbose=1,
                    validation_data=(X_train[fitsize:],
                                     y_train[fitsize:]),
                    callbacks=[model_checkpoint_callback])

Thank you, great job!!!

On Tue, Apr 28, 2020, 17:03, Matt Maciejewski notifications@github.com wrote:

Same issue here. Very unfortunate for long-running calculations.


@gnestor, this is sad. Could you please expand on some specific parts to refactor?

This also exists in JupyterLab and makes working with it really hard. The simple way out is to move the code into a Python file and run it in the shell.

I also found a workaround: just don’t refresh the browser tab. If you’re using JupyterLab, there are multiple panes; close the pane for that notebook, and then reopen the notebook from the file browser on the left.

You can also use Weights & Biases for logging.

Just run jupyter nbconvert <notebook> --to notebook --execute --debug in a terminal inside a tmux session. This runs in the background and renders everything in the notebook. The downside is that the notebook output is only written when it finishes, but the upside is that you get the plots and metrics from your notebook, and if you write it correctly you also get the model checkpoints.

Annoying issue but who knows when it will be fixed.

Here to keep the issue alive 😃. Same problem. Guess they’ll never fix it lol.

Another workaround is to create a custom Keras callback that saves your metrics and losses to a CSV file; that way the CSV keeps updating after every epoch as the model trains.
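A framework-agnostic sketch of that idea (the `CSVMetricsLogger` class and `metrics.csv` filename are made up; with Keras you would subclass keras.callbacks.Callback and call this from on_epoch_end, or simply use the built-in keras.callbacks.CSVLogger):

```python
import csv
import os

class CSVMetricsLogger:
    """Append one row of metrics per epoch, so training progress
    can be followed with the notebook closed (e.g. via `tail -f`)."""
    def __init__(self, path, fields):
        self.path = path
        self.fields = fields
        if not os.path.exists(path):
            # write the header once, on first use
            with open(path, "w", newline="") as f:
                csv.DictWriter(f, fieldnames=fields).writeheader()

    def on_epoch_end(self, epoch, logs):
        # open/close per epoch so each row is on disk immediately
        row = {"epoch": epoch, **logs}
        with open(self.path, "a", newline="") as f:
            csv.DictWriter(f, fieldnames=self.fields).writerow(row)

log = CSVMetricsLogger("metrics.csv", ["epoch", "loss", "val_loss"])
log.on_epoch_end(0, {"loss": 0.4094, "val_loss": 0.2863})
```

Opening the file in append mode on every epoch, rather than holding it open, means the latest row is always readable from another terminal even if the kernel later dies.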

That’s what I’m doing now!

Possible workaround for people using Keras in a Jupyter notebook: save your model weights using Keras callbacks every N epochs. If you happen to refresh the window, wait for the saved weights to update, then load the freshly saved weights, compile, and start training again.


This might ruin your random states though, and botch the experiment in the process.

Another workaround is to create a custom Keras callback that saves your metrics and losses to a CSV file; that way the CSV keeps updating after every epoch as the model trains.

I think we have to wait for https://github.com/jupyterlab/rtc/ to fix this issue.

I didn’t have this problem before (Feb 2019), but now I somehow do. Training for 2 days and no way to see the current status…