vscode-jupyter: VS Code crashes when running a cell that produces a lot of output

I am trying to print a model summary in TensorFlow, and I think the model is so large that it crashes my notebook. The model is ResNet101.


The whole computer comes to a halt, memory usage goes up to 99%, and VS Code crashes. I have 16 GB of RAM, so I didn’t think printing something large would actually eat all of it. Also, because the kernel crashes, all the variables are lost, such as history = model.fit(), which I need to fine-tune the model afterwards. Moreover, I need to print the base_model summary in order to choose which layer to start fine-tuning from.

Is there a way to print the summary some other way and save the entire notebook with its variables, so I can continue working? I have checkpoints for the model weights, but I need to keep track of past epochs through history to resume training afterwards.
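One possible workaround for the summary itself (a sketch, not an official fix for the crash): Keras’ Model.summary accepts a print_fn callback, so the summary can be written to a file instead of flooding the notebook output. The summary_to_file helper and the file name below are illustrative.

```python
def summary_to_file(model, path: str) -> None:
    """Write a Keras model summary to a text file instead of the cell output.

    Keras' Model.summary() accepts a print_fn callback; each formatted
    line of the summary is passed to it rather than printed to stdout.
    """
    with open(path, "w", encoding="utf-8") as f:
        model.summary(print_fn=lambda line: f.write(line + "\n"))

# Usage (assuming base_model is a Keras model such as ResNet101):
# summary_to_file(base_model, "resnet101_summary.txt")
```

This keeps the cell output empty, which sidesteps the large-output rendering path entirely.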

I will try using .py files, but I want to know if there is a way to solve this problem for Jupyter.
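On keeping past epochs across kernel crashes: the history object returned by model.fit exposes history.history, a plain dict of metric lists, so it can be dumped to JSON after each training run and reloaded after a restart. A minimal sketch; the helper names and file name are illustrative:

```python
import json

def save_history(history_dict: dict, path: str) -> None:
    # history_dict is what Keras exposes as history.history,
    # e.g. {"loss": [0.9, 0.5], "val_loss": [1.0, 0.7]}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(history_dict, f)

def load_history(path: str) -> dict:
    with open(path, encoding="utf-8") as f:
        return json.load(f)

# After model.fit(...):
#     save_history(history.history, "history.json")
# After a crash/restart:
#     past_epochs = load_history("history.json")
```

Combined with the weight checkpoints mentioned above, this is enough bookkeeping to resume training without the crashed kernel’s variables.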

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Reactions: 2
  • Comments: 60 (24 by maintainers)


Most upvoted comments

Hi everyone! I’ve run into the white walls of doom just after a recent VS Code update, on my Mac M1 Pro. Due to compliance reasons I’m unable to provide code examples, but some of the issues others have reported keep popping up:

  • It happens when running exhaustive calculations/displaying long outputs
  • Other notebooks that previously ran fine now seem to be “corrupted”, or at least the issue persists even after rebooting
  • I’ve purged vscode and reinstalled everything, to no avail

I hope this information is at least a little bit useful!

@livan3li
Unfortunately this hasn’t been fixed yet. Please could you share this notebook? I’d like to try it at my end.

Also, please could you disable all extensions, including Jupyter, and see if you run into this issue. If you do, then the problem is different, as this issue focuses on crashes due to running notebooks. However, I do want to address your issue too, but would like to deal with that separately; for now, let’s chat here.

@mariosconsta @codeananda @dokutoshi I’m sorry you are still running into this issue.

  • Assuming you save the notebook and open it again in VS Code, does VS Code then crash again?
  • Assuming you run this notebook in regular Jupyter Notebook/Lab and then open the notebook in VS Code, does it crash again?

Basically I’m trying to figure out whether it’s caused by:

  • The regular streaming of data
  • Or because the output in a cell is too large

Please could you help us repro this issue?

  • Could you perhaps create a Dockerfile with the necessary dependencies, with instructions, so we can test this at our end?
  • Please remember to include the same code (notebook) with instructions for installing all dependencies.

Just tried to do a big calculation in a notebook and the White Walls of Doom™ came back. It seems like VS Code + Notebooks + Big Calculations = Bad Idea.

@codeananda Is this in Docker or WSL? Please could you share some sample code?

Basically I’m looking for some sample code that would generate enough output to cause this issue at our end.

The issue remains in the latest VS Code. I am running a long ResNet model on a remote server using VS Code. VS Code crashed twice, and the VS Code Jupyter window shows a white wall as described earlier. Just to remind you, I am not running the code locally but remotely.

Hi everyone, this should be fixed in the next release (we were unable to get the fix into last month’s release).

You can try installing the latest pre-release version of the Jupyter extension, and that should work.

Currently the extension host seems to crash due to the size of the output generated.

@DonJayamanne by the way, if you saw the recording in demo2.zip, I printed only the layers of the model using a for loop, which produces a very long list of text, and it runs instantly.
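For reference, iterating over model.layers (as in that recording) is also enough to pick a fine-tune boundary without printing the full summary. A sketch, where freeze_up_to and the boundary index are illustrative, not from the original report:

```python
def freeze_up_to(model, fine_tune_at: int) -> None:
    """Print each layer's index and name, freezing layers before fine_tune_at.

    Keras layers expose a .name attribute and a .trainable flag; only
    layers at index >= fine_tune_at are left trainable.
    """
    for i, layer in enumerate(model.layers):
        layer.trainable = i >= fine_tune_at
        print(i, layer.name, "trainable" if layer.trainable else "frozen")

# freeze_up_to(base_model, fine_tune_at=140)  # illustrative boundary
```

The per-layer print output is small compared to a full summary, so it should not trigger the large-output path.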

I will test it out first thing in the morning!

Thanks for your patience & I’m sorry about this issue, hopefully we can get to the bottom of this.

I’m assuming you are referring to the cell with the code base_model.summary. Is it possible to try the following:

  • Do not print any output at all.
  • Run the first cell, then clear all output using the command or the icon in the toolbar.
  • Run the next cell and clear all of its output.
  • Continue similarly until you get to the problematic cell; again, comment out the code where you are printing the model summary.

I have a suspicion the output from the second cell is very large, and that could be chewing up resources. Again, please ensure you clear all of the output after running each cell.

What I’d like to figure out is:

  • Whether the memory usage/crash is related to the output or to execution.
  • If we ensure the output is always cleared and things work as expected (i.e. it does not crash), then we’ve narrowed down the issue to outputs causing increased memory usage and crashing VS Code.

Finally

  • Also, please could you share the output from the Jupyter output panel (use the command Jupyter: Show Output). Run the first cell in the notebook, copy the logs, and send them here (please ensure you send the whole log). It shows information about versions, the Python used, and the like.
  • Are you running the code against a local or remote kernel?
  • Regarding “memory usage goes up to 99% and VS Code crashes”: please could you take a screenshot of this and upload it here? I would like to check which VS Code process is chewing up the memory. Thanks.

Thanks for filing this issue and I’m sorry you are running into this.

  • Is it VS Code or the kernel that crashes?
  • If you were to run this outside VS Code, in classic Jupyter via a terminal, do you experience the same problem?
  • Can you run one cell at a time to check exactly when VS Code crashes? That would help narrow down the issue.