vscode-jupyter: VS Code crashes when running a cell that produces a lot of output
I am trying to print a model summary in TensorFlow, and I think the model is so large that it's crashing my notebook. The model is ResNet101.
The whole computer comes to a halt, memory usage goes up to 99%, and VS Code crashes. I have 16 GB of RAM, so I didn't think printing something large would actually eat all of it. Also, because the kernel crashes, all the variables are lost, such as history = model.fit(), which I need in order to fine-tune the model afterwards. Moreover, I need to print the base_model summary in order to choose which layer to fine-tune from.
Is there another way to print the summary, and a way to save the entire notebook with its variables so I can continue working? I have checkpoints for the model weights, but I need to keep track of past epochs through history to resume training afterwards.
I will try using .py files, but I want to know if there is a way to solve this problem in Jupyter.
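One workaround (a minimal sketch, not from the thread itself) is to keep the heavy text out of the cell output entirely: route the summary to a file via Keras's print_fn argument, and serialize history.history, a plain dict of per-epoch metrics, so it survives a kernel crash. A tiny stand-in model is used here so the sketch is runnable:

```python
import json
import numpy as np
import tensorflow as tf

# Tiny stand-in model so the sketch runs quickly; in the issue this is ResNet101.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# 1) Write the summary to a file instead of the cell output, so the huge
#    text never reaches the notebook renderer.
with open("model_summary.txt", "w") as f:
    model.summary(print_fn=lambda line: f.write(line + "\n"))

# 2) Persist the per-epoch metrics; they can be reloaded after a crash to
#    keep track of past epochs alongside the weight checkpoints.
history = model.fit(np.random.rand(64, 4), np.random.rand(64, 1),
                    epochs=2, verbose=0)
with open("history.json", "w") as f:
    json.dump({k: [float(v) for v in vs] for k, vs in history.history.items()}, f)
```

With something like this, the notebook never has to render the summary, and the metrics can be reloaded with json.load to resume bookkeeping next to the weight checkpoints.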
About this issue
- Original URL
- State: closed
- Created 2 years ago
- Reactions: 2
- Comments: 60 (24 by maintainers)
Commits related to this issue
- Speed an transfer of outputs to notebook webviews For https://github.com/microsoft/vscode-jupyter/issues/11031 The VS Buffer for the output items has a short byte length but a very large backing buf... — committed to mjbvz/vscode by mjbvz a year ago
- Speed up transfer of outputs to notebook webviews (#178719) Speed an transfer of outputs to notebook webviews For https://github.com/microsoft/vscode-jupyter/issues/11031 The VS Buffer for the ... — committed to microsoft/vscode by mjbvz a year ago
Hi everyone! I've run into the white walls of doom just after a recent VS Code update, on my Mac M1 Pro. Due to compliance reasons I'm unable to provide code examples, but some of the issues others have reported keep popping up for me as well.
I hope this information is at least a little bit useful!
@livan3li
Unfortunately this hasn't been fixed yet. Please could you share this notebook? I'd like to try it at my end.
Also, please could you disable all extensions, including Jupyter, and see if you run into this issue. If you do, then the problem is different, as this issue focuses on crashes due to running notebooks. However, I do want to address your issue too, but would like to deal with that separately; for now, let's chat here.
@mariosconsta @codeananda @dokutoshi I’m sorry you are still running into this issue.
Basically, I'm trying to figure out what it's caused by.
Please could you help us repro this issue?
@codeananda Is this in Docker or WSL? Please could you share some sample code?
Basically, I'm looking for some sample code that would generate large enough content to cause this issue at our end.
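For anyone trying to reproduce, a hypothetical sketch along these lines (not the reporters' actual code) floods a single cell with a few hundred MB of plain text, similar in shape to an oversized model summary:

```python
# Hypothetical repro: emit roughly 260 MB of text output from one notebook cell.
for i in range(2_000_000):
    print(f"line {i:07d}: " + "x" * 120)
```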
The issue remains in the latest VS Code. I am running a long ResNet model on a remote server using VS Code, and it crashed twice. The VS Code Jupyter view is a white wall as described earlier. Just to remind you, I am not running the code locally but remotely.
Hi everyone, this should be fixed in the next release (we were unable to take the fix last month).
You can try installing the latest pre-release version of the Jupyter extension, and that should work.
@DonJayamanne By the way, if you watched the recording in demo2.zip: I printed only the layers of the model using a for loop, which produces a very long list of text, and it runs instantly.
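That loop was presumably something like the following sketch (base_model stands in for the issue's ResNet101; the exact code wasn't shared):

```python
import tensorflow as tf

base_model = tf.keras.applications.ResNet101(weights=None)  # as in the issue

# Still a very long list of text, yet it reportedly rendered instantly.
for i, layer in enumerate(base_model.layers):
    print(i, layer.name, type(layer).__name__)
```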
I will test it out first thing in the morning!
Thanks for your patience & I’m sorry about this issue, hopefully we can get to the bottom of this.
I’m assuming you are referring to the cell with the code
base_model.summary
Is it possible to try the following? I have a suspicion the output from the second cell is very large & that could be chewing up resources. Again, please ensure you clear all of the output after running each cell.
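One way to test that suspicion without rendering anything heavy in the notebook is to capture the summary into a string and print only its size; a minimal sketch, again assuming a Keras model named base_model:

```python
import io
import tensorflow as tf

base_model = tf.keras.applications.ResNet101(weights=None)

# Capture the summary text instead of printing it, then report just its
# size: a single short line that cannot overwhelm the output renderer.
buf = io.StringIO()
base_model.summary(print_fn=lambda line: buf.write(line + "\n"))
print(f"summary length: {len(buf.getvalue()):,} characters")
```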
What I’d like to figure out is:
Finally, to open the Jupyter output panel, use the command Jupyter: Show Output.
Run the first cell in the notebook, then copy and send the logs (please ensure you send the whole log); they show information about versions, the Python interpreter used, and the like.
Thanks for filing this issue, and I'm sorry you are running into this.