notebook: Large notebook fails to save

When I was using the IPython notebook to analyze our experiment data, I noticed that I could not save the notebook. The console (from which I started ipython notebook) reported:

[I 17:37:11.736 NotebookApp] Malformed HTTP message from ::1: Content-Length too long

So I guessed the problem was related to notebook size. I was using the "bokeh" library to plot my data, and the notebook file was about 100 MB on disk.

To reproduce this, I prepared a new notebook and generated many plots to produce a notebook with a large file size.

[Screenshot, 2015-10-24: notebook with many repeated plots] This notebook draws a 30001-point plot repeatedly (e.g. 100 plots in the screenshot above). I could not save the notebook above: when I repeatedly saved while increasing the number of plots, saving again failed above about 100 MB, with the same console message.
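The growth mechanism can be simulated without bokeh: each plot's output is embedded as JSON inside the .ipynb file, so file size grows roughly linearly with the number of plots, and any fixed HTTP body limit is crossed at a predictable plot count. A minimal stdlib-only sketch (the payload size is a made-up stand-in for one plot's embedded output, not bokeh's actual format):

```python
import json

def notebook_size(n_plots, payload_bytes=1_100_000):
    """Approximate the on-disk size of a notebook whose cells each
    embed one large plot output (payload size is hypothetical)."""
    cells = [
        {
            "cell_type": "code",
            "source": "plot(data)",
            "outputs": [{"data": {"text/html": "x" * payload_bytes}}],
        }
        for _ in range(n_plots)
    ]
    nb = {"cells": cells, "nbformat": 4, "nbformat_minor": 0}
    return len(json.dumps(nb).encode("utf-8"))

# Size grows linearly with the number of plots, so a fixed
# request-size limit is reached at a predictable cell count.
sizes = [notebook_size(n) for n in (1, 10, 100)]
```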

In a little more detail, I could save the notebook up to 88 plots, when the notebook file size was 104756892 bytes (99.904 MB), but I could not save it with 89 plots. Each additional plot increased the file size by about 1.1 MB.
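The arithmetic above can be checked directly: assuming roughly 1.1 MB per plot, a 100 MB limit is crossed between the 88th and 89th plot.

```python
# Reported notebook size at 88 plots, in bytes
size_88 = 104_756_892
MB = 1024 * 1024

size_88_mb = size_88 / MB      # about 99.904 MB, just under 100 MB
size_89_mb = size_88_mb + 1.1  # one more ~1.1 MB plot crosses 100 MB
```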

I searched the issue list but could not find anything about this. Is this limit intentional? Is there some workaround for this problem (other than removing cells from the notebook)?


My environment is:

$ python -c "import IPython; print(IPython.sys_info())"
{'commit_hash': u'2d95975',
 'commit_source': 'installation',
 'default_encoding': 'UTF-8',
 'ipython_path': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython',
 'ipython_version': '3.2.1',
 'os_name': 'posix',
 'platform': 'Darwin-13.4.0-x86_64-i386-64bit',
 'sys_executable': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python',
 'sys_platform': 'darwin',
 'sys_version': '2.7.10 (default, Aug 26 2015, 18:15:57) \n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)]'}

OS: Mac OS X 10.9.5 (Mavericks)
Browser: Safari 9.0 (9537.86.1.56.2)
matplotlib: 1.4.3
numpy: 1.9.2
bokeh: 0.10.0

About this issue

  • Original URL
  • State: closed
  • Created 9 years ago
  • Reactions: 7
  • Comments: 27 (12 by maintainers)

Most upvoted comments

@takluyver I think these default limits should suffice for now. Let's close this. For reference, if any users encounter this issue (not being able to save a notebook due to its file size), you can increase the limit by editing these lines: https://github.com/jupyter/notebook/blob/master/notebook/notebookapp.py#L237-L238
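If your notebook version exposes these limits as configurable traits (the trait names below are taken from the linked notebookapp.py lines; the values are examples), editing the source should not be necessary. A sketch of a jupyter_notebook_config.py fragment:

```python
# jupyter_notebook_config.py -- example values, not defaults.
# Raise both the allowed request body size and tornado's buffer size
# so large notebooks can be PUT back to the server on save.
c.NotebookApp.max_body_size = 200 * 1024 * 1024    # 200 MB
c.NotebookApp.max_buffer_size = 200 * 1024 * 1024  # 200 MB
```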

Having the same problem here…

We temporarily fixed this by modifying the tornado/iostream.py file, as suggested before by @takluyver, e.g. by setting self.max_buffer_size = 1048576000
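Rather than patching tornado's source, the same limit can be raised where the server is constructed, since tornado's HTTPServer accepts a max_buffer_size keyword argument. A sketch (the 1000 MB value mirrors the number in the comment above; the server setup is shown as comments because it depends on your application object):

```python
# 1000 MB, i.e. the 1048576000 bytes used in the workaround above
MAX_BUFFER_SIZE = 1000 * 1024 * 1024

# Hypothetical server setup -- tornado's HTTPServer takes
# max_buffer_size (and max_body_size) keyword arguments:
#
#   from tornado.httpserver import HTTPServer
#   server = HTTPServer(app, max_buffer_size=MAX_BUFFER_SIZE)
```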

@davidcortesortuno and I are also having this problem with HoloViews HoloMaps, where it's quite easy to go over 100 MB.