vscode-python: Debugging pyspark applications no longer works after August update

Environment data

  • VS Code version: 1.27
  • Extension version (available under the Extensions sidebar): 2018.8.0
  • OS and version: Linux Mint 18.1 x64
  • Python version (& distribution if applicable, e.g. Anaconda): 2.7.12
  • Type of virtual environment used (N/A | venv | virtualenv | conda | …): N/A
  • Relevant/affected Python packages and their versions: PySpark 2.2.1

Actual behavior

Debugging a PySpark application no longer works after updating to 2018.8.0. After starting the debugger, the terminal shows the following command and error:

cd /home/user/etl ; env "PYSPARK_PYTHON=python" "PYTHONPATH=/home/user/etl:/home/user/.vscode/extensions/ms-python.python-2018.8.0/pythonFiles/experimental/ptvsd" "PYTHONIOENCODING=UTF-8" "PYTHONUNBUFFERED=1" /home/user/Spark/spark-2.2.1-bin-hadoop2.7/bin/spark-submit -m ptvsd --host localhost --port 39763 /home/user/etl/etl/jobs/process_data.py 

Error: Unrecognized option: -m

Usage: spark-submit [options] <app jar | python file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]
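
The error comes from spark-submit itself: it parses the command line before any Python code runs, accepts only [options] followed by an application file, and therefore rejects the interpreter flag -m that the new ptvsd-based debugger prepends. Until the extension is fixed, one possible workaround (an illustrative sketch, not an official fix; it assumes the experimental ptvsd 4.x bundled with the extension is importable by the Spark driver, and port 5678 is an arbitrary placeholder) is to start the debug server from inside the job script and use an "attach"-type launch configuration instead of launching through spark-submit:

import ptvsd

# Start a debug server inside the driver process; the host and port are
# placeholders and must match the "attach" configuration in launch.json.
ptvsd.enable_attach(address=("localhost", 5678))
ptvsd.wait_for_attach()  # block until VS Code connects

With this in place, spark-submit can be invoked normally and VS Code attaches over the chosen port.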

Expected behavior

With version 2018.7.0, PySpark debugging works fine. The following command is displayed in the terminal after starting the debugger:

cd /home/user/etl ; env "PYSPARK_PYTHON=python" "PYTHONPATH=/home/user/etl" "PYTHONIOENCODING=UTF-8" "PYTHONUNBUFFERED=1" /home/user/Spark/spark-2.2.1-bin-hadoop2.7/bin/spark-submit /home/user/.vscode/extensions/ms-python.python-2018.7.0/pythonFiles/PythonTools/visualstudio_py_launcher.py /home/user/etl 46508 34806ad9-833a-4524-8cd6-18ca4aa74f14 RedirectOutput,RedirectOutput /home/user/etl/etl/jobs/process_data.py
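
The difference between the two command lines explains the regression: the 2018.7.0 launcher passes a plain Python file (visualstudio_py_launcher.py) as the application, which spark-submit accepts, while 2018.8.0 injects -m ptvsd, which spark-submit cannot parse. A hypothetical wrapper script (file name and port are illustrative assumptions) could emulate the old pattern with the new debugger:

# debug_wrapper.py -- hand this file to spark-submit in place of the job
# script, followed by the real job path; it starts ptvsd, waits for the
# debugger to connect, then runs the job, mirroring the old launcher.
import runpy
import sys

import ptvsd

ptvsd.enable_attach(address=("localhost", 5678))
ptvsd.wait_for_attach()

# Shift argv so the real job sees its own arguments as sys.argv.
sys.argv = sys.argv[1:]
runpy.run_path(sys.argv[0], run_name="__main__")

For example: spark-submit debug_wrapper.py /home/user/etl/etl/jobs/process_data.py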

About this issue

  • State: closed
  • Created 6 years ago
  • Comments: 20 (3 by maintainers)

Most upvoted comments

I’ll have a fix today.

I ran into this issue today. I can confirm that the fix worked for me. Thanks.