mlflow: Unable to run mlflow behind jupyter-server-proxy

System information

Describe the problem

In a hosted environment such as Amazon SageMaker or JupyterHub, a user might want to run mlflow behind the popular jupyter-server-proxy extension. This allows them to use Jupyter alongside mlflow on the same host.

Unfortunately, the way this works is that mlflow is proxied to a path such as http://myjupyter-server.com/proxy/5000 instead of http://127.0.0.1:5000. Because there are a lot of references to paths that don’t exist, the UI doesn’t load.

I almost got mlflow up and running: I get as far as the “Oops! Something went wrong.” page, but nothing beyond that.

Source code / logs

  1. Create a SageMaker Notebook Instance.
  2. !pip install mlflow
  3. !mlflow
  4. Open http://mynotebook.notebook.region.sagemaker.aws/proxy/5000/
  5. “Oops! Something went wrong.” (Niagara Falls.)
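To check whether the proxied page is hitting this problem, one can inspect the served HTML for root-relative references, since those are exactly the ones a reverse proxy at a sub-path breaks. A minimal diagnostic sketch (the regex and sample HTML are illustrative assumptions, not mlflow’s actual markup):

```python
import re

def absolute_asset_paths(html: str) -> list[str]:
    # Root-relative references (leading "/") are resolved by the browser
    # against the origin only, dropping the /proxy/5000/ prefix that
    # jupyter-server-proxy needs, so these are the suspects.
    return re.findall(r'(?:src|href)="(/[^"]+)"', html)

# Hypothetical fragment of the UI's index page:
page = '<script src="/static-files/main.js"></script><a href="./local">x</a>'
print(absolute_asset_paths(page))  # ['/static-files/main.js']
```

Any path in that list would be requested from the server root instead of from under /proxy/5000/.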

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Reactions: 5
  • Comments: 18 (3 by maintainers)

Most upvoted comments

+1

Nope, but I have a workaround for it: download the generated files (zipped) locally and run mlflow locally.

It should be fixable by removing the leading / from the JavaScript calls. Sadly, I’m missing documentation on how mlflow/server/js/src/sdk/MlflowService.js is generated; it is not covered by generate-protos.sh.

The reason the website fails is that there are API calls in the JavaScript code which are executed on the client side. These API calls are defined in mlflow/server/js/src/sdk/MlflowService.js, which is automatically generated from the proto definitions.

When you load the website, the API endpoints (/ajax-api/…) in the JavaScript code are combined with the hostname and port number, but not with the path of the URL.
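This resolution behavior can be reproduced with Python’s urljoin, which follows the same RFC 3986 rules a browser does (the host and path here are illustrative):

```python
from urllib.parse import urljoin

base = "http://myjupyter-server.com/proxy/5000/"

# A root-relative path (leading "/") is resolved against the origin only,
# so the /proxy/5000/ prefix is lost:
print(urljoin(base, "/ajax-api/2.0/mlflow/experiments/list"))
# -> http://myjupyter-server.com/ajax-api/2.0/mlflow/experiments/list

# A relative path (no leading "/") keeps the proxy prefix:
print(urljoin(base, "ajax-api/2.0/mlflow/experiments/list"))
# -> http://myjupyter-server.com/proxy/5000/ajax-api/2.0/mlflow/experiments/list
```

This is why stripping the leading / from the generated API paths would make the UI work behind a sub-path proxy.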

I haven’t run it on SageMaker, and without the client-side logs it’s hard to tell whether it’s the same issue. If it is, you can change the API path in the mlflow source code and compile a custom mlflow package. As for the port number, you could try running mlflow server on port 80.

If you’re running mlflow behind an nginx reverse proxy, you can add a filter that rewrites the path in the JavaScript code sent to the client when serving the files:

location /mlflow/ {
    proxy_pass http://localhost:5000/;
    sub_filter '/ajax-api' '/mlflow/ajax-api';
    sub_filter_once off;
    sub_filter_types *;
}
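What sub_filter does here can be sketched in a few lines of Python: it rewrites every occurrence of the absolute API path in the proxied response body before it reaches the browser (the function name and sample HTML are illustrative, not part of nginx or mlflow):

```python
def rewrite_body(body: bytes, prefix: bytes = b"/mlflow") -> bytes:
    # Mirrors the nginx config above: replace every occurrence
    # (sub_filter_once off) of /ajax-api with /mlflow/ajax-api,
    # so client-side calls go back through the proxy location.
    return body.replace(b"/ajax-api", prefix + b"/ajax-api")

html = b'<script>fetch("/ajax-api/2.0/mlflow/experiments/list")</script>'
print(rewrite_body(html))
# -> b'<script>fetch("/mlflow/ajax-api/2.0/mlflow/experiments/list")</script>'
```

Note that sub_filter_types * is needed because the paths live in JavaScript files, not just text/html, which nginx filters by default.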