vscode: Jupyter over remote ssh sometimes becomes slow and/or unresponsive
Environment data
- VS Code version: 1.62.3 (Universal)
- Jupyter Extension version (available under the Extensions sidebar): v2021.10.1101450599
- Python Extension version (available under the Extensions sidebar): v2021.11.1422169775
- OS (Windows | Mac | Linux distro) and version: Mac
- Python and/or Anaconda version: 3.7.12
- Type of virtual environment used (N/A | venv | virtualenv | conda | …): conda
- Jupyter server running: Remote
Expected behaviour
I expect code cells to begin executing immediately and (for simple code snippets, e.g. `print('hello')`) to finish executing immediately.
Actual behaviour
Once the jupyter notebook starts to contain a non-trivial amount of stuff, I start to observe the following behaviors: (1) code cells that contain simple tasks (e.g. `print('hello')`) start taking several seconds or more to complete, (2) it may take several seconds or even minutes for vscode to even visually show that it will start executing a code cell, and (3) in severe cases, the whole vscode editor will become unresponsive and I will need to force shut it down.
Steps to reproduce:
Below is code for a jupyter notebook that contains a minimal example. The problems start to appear once you've cycled through the code cells 2 or 3 times. I think one of the main contributing factors to this issue is notebooks that have a lot of "stuff" in them. To demonstrate that, the last code block prints 50 scatter plots as pngs, which the native jupyter lab server over the browser handles just fine… but vscode seems to have issues. Moreover, even in this simple example I have observed instances where saving the notebook starts to take a while and the editor may become unresponsive.
Other things to note:
- I am running this jupyter notebook on an AWS EC2 instance with ~90GB memory and 48 CPUs (using the AWS linux AMI `ami-083ac7c7ecf9bb9b0`).
- I have not yet observed these same issues when the kernel is running on my local mac machine.
Jupyter example
```python
# %%
# %load_ext autoreload
# %autoreload 2
import pandas as pd
import plotly.express as px
import numpy as np
import plotly.io as pio

pio.renderers.default = 'jupyterlab+notebook'
pd.set_option("display.max_columns", 20)

# %%
data = pd.DataFrame(np.random.random(size=(1000, 1000)))

# %%
data

# %%
print('hello')

# %%
for i in range(50):
    fig = px.scatter(x=data[i], y=data[i + 1])
    fig.show(renderer='png')
```
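Since the symptoms above seem tied to how much output the notebook accumulates, it may help to quantify what each cell embeds on disk. This is a minimal stdlib-only sketch (an `.ipynb` file is plain JSON; the path in the commented usage is a placeholder, not a file from this report):

```python
import json

def cell_output_sizes(nb_path):
    """Return the serialized size (bytes) of each code cell's outputs."""
    with open(nb_path, encoding="utf-8") as f:
        nb = json.load(f)
    sizes = []
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            # Outputs (including base64-encoded PNGs) are stored inline as JSON,
            # so their serialized length approximates their on-disk footprint.
            sizes.append(len(json.dumps(cell.get("outputs", []))))
    return sizes

# Example usage (hypothetical path):
# sizes = cell_output_sizes("example.ipynb")
# print(f"total embedded output: {sum(sizes)} bytes")
```

With the repro above, the 50 embedded PNGs should dominate the totals, which would be consistent with the notebook-size hypothesis.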
Logs
Here is a screenshot where it takes almost 3 seconds for a simple print statement. I’ve seen worse in some of my code related to real projects, though. This is just what I was able to reproduce with a minimal example.

About this issue
- Original URL
- State: closed
- Created 3 years ago
- Reactions: 34
- Comments: 43 (11 by maintainers)
Commits related to this issue
- Fix uri transformer slowness over large buffers Fix #138784 — committed to microsoft/vscode by roblourens a year ago
- Fix uri transformer slowness over large buffers (#171126) Fix #138784 — committed to microsoft/vscode by roblourens a year ago
experiencing similar issues when working on jupyter via ssh to an ubuntu server from an ubuntu pc
Also having this issue with jupyter over ssh, in a notebook that has a decent number of plots.
I also experience the same. I observed that:
I hope these may help diagnose the problem.
I pushed a change which should mostly fix this slowness with large outputs. I think there are more optimizations we could do. The issue comes when a save happens, and this blocks the remote extension host process for some period of time. Then a symptom of that might be that an execution appears to hang or take a long time.
Would appreciate anyone trying this out in tomorrow’s vscode insiders build.
If you’re seeing an issue related to the variables view, that wouldn’t be fixed here, however I recently pushed a different fix that may help with that, and you could try it out in insiders + the jupyter pre-release extension.
@BounharAbdelaziz Happy to help. I saw in one of the other tickets that this bug correlates with notebook size, and since clearing all outputs reduces notebook size, this might be a temporary fix until the size grows again and we need to do it again. I would love to see this fixed, as this bug is the biggest hindrance to my day-to-day vscode experience.
This issue happened with me also (especially when I am inside a docker container on the remote machine).
Having the same exact issue and it renders the jupyter unresponsive:
And it’s still running.
A temporary solution I found is to click on “Clear outputs for all cells” and more often than not it works smooth again. Only works if the cell stops executing though…
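For anyone hitting the state where the editor itself is too unresponsive to run "Clear outputs for all cells", the same workaround can be applied outside VS Code. A minimal stdlib-only sketch (the notebook path is a placeholder; this edits the file in place, so keep a backup):

```python
import json

def clear_outputs(nb_path):
    """Strip outputs and execution counts from every code cell, in place."""
    with open(nb_path, encoding="utf-8") as f:
        nb = json.load(f)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    with open(nb_path, "w", encoding="utf-8") as f:
        json.dump(nb, f, indent=1)

# clear_outputs("example.ipynb")  # hypothetical path
```

If nbconvert is installed, `jupyter nbconvert --clear-output --inplace notebook.ipynb` does the same thing.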
Also had the issue where running a cell lags (or starts after some time) for jupyter notebooks when using Remote SSH.
Thanks for the feedback.
Please could you:
- Enable Jupyter logging and change the logging level for the Jupyter extension to `Verbose`
- Run the `Measure Extension Host Latency` command; once you see `Roundtrip latency....`, please copy all of that & paste it into this issue
- Copy the contents of the `Jupyter` output panel (use the command `Jupyter: View Output` to get to the output panel)

Please note, I'll need two logs: one without any data frames, no plots, just print statements; the other is the one with dataframes and the like.
@rebornix The improvements are noticeable, thank you for your great work! However, the latency is still there and appears to be significant from time to time.
I love this ssh-notebook function, and I believe there must be more folks like me. Hope your team can prioritize this issue.
Appreciate!
@bo44arov @paul-brenner We have added experimental saving logic for Remote SSH; it would be great if you could give this a try and see if it improves the performance on large notebooks. Add
"notebook.experimental.remoteSave": true
to your Remote/User settings. Thanks in advance!
Thanks for the logs. Transferring to VS Code due to connectivity issues with remote SSH.