vscode: Jupyter over remote ssh sometimes becomes slow and/or unresponsive

Environment data

  • VS Code version: 1.62.3 (Universal)
  • Jupyter Extension version (available under the Extensions sidebar): v2021.10.1101450599
  • Python Extension version (available under the Extensions sidebar): v2021.11.1422169775
  • OS (Windows | Mac | Linux distro) and version: Mac
  • Python and/or Anaconda version: 3.7.12
  • Type of virtual environment used (N/A | venv | virtualenv | conda | …): conda
  • Jupyter server running: Remote

Expected behaviour

I expect code cells to begin executing immediately and, for simple code snippets (e.g. print('hello')), to finish executing immediately.

Actual behaviour

Once the Jupyter notebook starts to contain a non-trivial amount of content, I start to observe the following behaviours:

  • Code cells containing simple tasks (e.g. print('hello')) start taking several seconds or more to complete.
  • It may take several seconds, or even minutes, for VS Code to even visually show that it has started executing a code cell.
  • In severe cases, the whole VS Code window becomes unresponsive and I have to force-quit it.

Steps to reproduce:

Below is code for a Jupyter notebook containing a minimal example. The problems start to appear once you've cycled through the code cells 2 or 3 times. I think one of the main contributing factors is simply notebooks that have a lot of "stuff" (i.e. output) in them. To demonstrate that, the last code cell prints 50 scatter plots as PNGs, which the native JupyterLab server in the browser handles just fine, but which VS Code seems to have issues with. Even in this simple example I have observed cases where saving the notebook starts to take a while and the editor may become unresponsive.

Other things to note:

  • I am running this Jupyter notebook on an AWS EC2 instance with ~90 GB of memory and 48 CPUs (using an Amazon Linux AMI: ami-083ac7c7ecf9bb9b0).
  • I have not yet observed the same issues when the kernel is running on my local Mac.
Jupyter example

# %%
# %load_ext autoreload
# %autoreload 2

import pandas as pd
import plotly.express as px
import numpy as np
import plotly.io as pio

pio.renderers.default = 'jupyterlab+notebook'
pd.set_option("display.max_columns", 20)

# %%
# A 1000x1000 DataFrame of random floats, so later cells produce non-trivial output.
data = pd.DataFrame(np.random.random(size=(1000, 1000)))

# %%
# Display the DataFrame (a large rich output in the notebook).
data

# %%
# Trivial cell that should complete instantly.
print('hello')

# %%
# Render 50 scatter plots as PNG images; this is the cell that fills the notebook with output.
for i in range(50):
    fig = px.scatter(x=data[i], y=data[i + 1])
    fig.show(renderer='png')
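
As a rough, optional check that cell output is what dominates the file size here, the sketch below reads the saved notebook back with nbformat and compares the total size on disk to the bytes stored as output; repro.ipynb is a placeholder for wherever the notebook above was saved:

import json
import os

import nbformat

path = "repro.ipynb"  # placeholder: the saved notebook from the example above
nb = nbformat.read(path, as_version=4)

total_bytes = os.path.getsize(path)
# Approximate the bytes contributed by stored outputs (PNGs are base64 text in the JSON).
output_bytes = sum(
    len(json.dumps(cell.outputs))
    for cell in nb.cells
    if cell.cell_type == "code"
)
print(f"notebook: {total_bytes / 1e6:.1f} MB, of which ~{output_bytes / 1e6:.1f} MB is cell output")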
    

Logs

Here is a screenshot where a simple print statement takes almost 3 seconds. I've seen worse in some of my code for real projects, though; this is just what I was able to reproduce with a minimal example.

[Screenshot: a simple print('hello') cell taking almost 3 seconds to execute]

About this issue

  • State: closed
  • Created 3 years ago
  • Reactions: 34
  • Comments: 43 (11 by maintainers)

Most upvoted comments

Experiencing similar issues when working with Jupyter over SSH to an Ubuntu server from an Ubuntu PC.

Also having this issue with Jupyter over SSH, in a notebook that has a decent number of plots.

I also experience the same. I observed that:

  • It happens with Jupyter notebooks, but not in the Interactive window.
  • It does not happen if you connect to a Jupyter kernel that is not managed by VS Code.
  • It happens when your notebook has many figures.

I hope these observations help diagnose the problem.

I pushed a change which should mostly fix this slowness with large outputs; I think there are more optimizations we could do. The issue arises when a save happens: the save blocks the remote extension host process for some period of time, and a symptom of that can be that an execution appears to hang or take a long time.

Would appreciate anyone trying this out in tomorrow's VS Code Insiders build.

If you're seeing an issue related to the variables view, that wouldn't be fixed here; however, I recently pushed a different fix that may help with that, and you could try it out in Insiders plus the Jupyter pre-release extension.
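
As a purely illustrative sketch (this is not VS Code's code, just an analogy for a single-threaded host), the following Python script shows how one long blocking operation on an event loop, here standing in for the notebook save, stalls every other handler, which is why a running cell can look hung:

import asyncio
import time

START = time.perf_counter()

async def handle_kernel_messages():
    # Stands in for the host processing execution updates coming from the kernel.
    for i in range(6):
        print(f"{time.perf_counter() - START:4.1f}s  handled execution update {i}")
        await asyncio.sleep(0.2)

async def save_notebook():
    # Stands in for a large synchronous save: it never yields to the event loop,
    # so every other handler on this single thread stalls until it finishes.
    await asyncio.sleep(0.1)  # let the first update through
    time.sleep(2)             # the blocking part
    print(f"{time.perf_counter() - START:4.1f}s  save finished")

async def main():
    await asyncio.gather(handle_kernel_messages(), save_notebook())

asyncio.run(main())

Running it shows the execution updates pause for the full two seconds of the blocking "save" before resuming.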

@BounharAbdelaziz Happy to help. I saw in one of the other tickets that this bug scales with notebook size, and since clearing all outputs reduces the notebook size, that can serve as a temporary fix until the size grows again and it has to be done once more. I would love to see this fixed, as this bug is the biggest hindrance to my day-to-day VS Code experience.

This issue happened to me as well (especially when I am inside a Docker container on the remote machine).

Having the exact same issue, and it renders Jupyter unresponsive while the cell is still running. A temporary solution I found is to click "Clear outputs for all cells", and more often than not it works smoothly again. It only works if the cell stops executing, though…
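
As a minimal sketch of that same workaround applied from outside the editor (useful if the notebook has already made VS Code unresponsive), assuming the nbformat package is available and with notebook.ipynb as a placeholder path:

import nbformat

path = "notebook.ipynb"  # placeholder: path to the affected notebook
nb = nbformat.read(path, as_version=4)

for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []              # drop stored figures / text output
        cell.execution_count = None

nbformat.write(nb, path)

If I recall the flag correctly, jupyter nbconvert --clear-output --inplace notebook.ipynb should achieve the same from the command line.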

Also had the issue where running a cell lags (or starts only after some time) in Jupyter notebooks when using Remote SSH, on:

  • AWS
  • Azure

Thanks for the feedback.

Please could you:

  • Install VS Code Insiders
  • Go into your settings, search for Jupyter logging, and change the logging level for the Jupyter extension to Verbose
  • Reload VS Code
  • Run the command Measure Extension Host Latency
    • An editor will open containing some text such as Roundtrip latency....; please copy all of that and paste it into this issue
  • Try simple print statements (without any data frames or figures; let's keep it very simple)
    • Please ensure the notebook is empty and does not contain any outputs, only 1-2 cells with simple print statements
  • Provide the logs from the Jupyter output panel (use the command Jupyter: View Output to get to the output panel)
  • Reload VS Code
  • Repeat the earlier steps, this time with data frames, plots, and the like
  • Provide the logs from the Jupyter output panel again (Jupyter: View Output)

Please note, I'll need two sets of logs: one without any data frames or plots, just print statements; the other with data frames and the like.

@rebornix The improvements are noticeable, thank you for your great work! However, the latency is still there and appears to be significant from time to time.

I love this SSH-notebook feature, and I believe there are many more folks like me. I hope your team can prioritize this issue.

Much appreciated!

@bo44arov @paul-brenner We have added experimental saving logic for Remote SSH; it would be great if you could give this a try and see whether it improves performance on large notebooks:

  • Install the latest VS Code Insiders
  • Remote SSH into your remote machine
  • Add "notebook.experimental.remoteSave": true to your Remote/User settings
  • Reload the window
  • Then test the scenarios that used to be slow or blocking

Thanks in advance!

Thanks for the logs. Transferring to VS Code due to connectivity issues with Remote SSH.

[2021-12-03 13:55:39.949] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] received socket timeout event.
[2021-12-03 13:55:39.991] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] starting reconnecting loop. You can get more information with the trace log level.
[2021-12-03 13:55:39.991] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] resolving connection...
[2021-12-03 13:55:39.992] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] connecting to 127.0.0.1:53902...
[2021-12-03 13:55:45.574] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] reconnected!
[2021-12-03 13:56:07.839] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] received socket timeout event.
[2021-12-03 13:56:07.881] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] starting reconnecting loop. You can get more information with the trace log level.
[2021-12-03 13:56:07.882] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] resolving connection...
[2021-12-03 13:56:07.882] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] connecting to 127.0.0.1:53902...
[2021-12-03 13:56:15.008] [renderer3] [info] [remote-connection][ExtensionHost][548fa…][reconnect] reconnected!