jupyter notebook takes forever to open and then pages unresponsive - [MathJax] issue
I'm trying to open a Jupyter notebook and it takes a long time. At the bottom left of the Chrome window I can see it trying to load various [MathJax] extensions, e.g.:

Loading [MathJax]/extensions/safe.js

Eventually the notebook loads, but it's frozen, and the bottom left keeps showing that it's trying to load other [MathJax] .js files.

Meanwhile, the "pages unresponsive, do you want to kill them?" dialog keeps popping up.

I have no equations or plots in my notebook so I can't understand what is going on. My notebook never did this before.

I googled this and some people suggested deleting the IPython checkpoints. Where would those be? I'm on macOS and using Anaconda.
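For reference, the checkpoint copies Jupyter keeps are stored in hidden ".ipynb_checkpoints" folders next to each notebook. A small sketch to list them (run from your notebooks directory; the function name is just for illustration):

```python
from pathlib import Path

def find_checkpoints(root="."):
    """List the hidden checkpoint copies Jupyter keeps next to each notebook."""
    return [
        ckpt
        for ckpt_dir in Path(root).rglob(".ipynb_checkpoints")
        for ckpt in ckpt_dir.glob("*.ipynb")
    ]

for path in find_checkpoints():
    print(path)
```

Deleting the files it prints is safe; they are only auto-saved backups.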

Peduncle answered 5/2, 2018 at 3:34 Comment(0)

I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared output and that seemed to do the trick!

If Jupyter crashes while opening the .ipynb file, try "using nbstripout to clear output directly from the .ipynb file via command line" (bndwang). Install it with pip install nbstripout.

Peduncle answered 5/2, 2018 at 17:15 Comment(6)
Confirmed this was the problem for me too - I had to use nbstripout to clear output directly from the .ipynb file via command line because I couldn't open it in Jupyter to 'Clear Output' in the first place.Kozlowski
Restarting the kernel didn't work for me (nor did removing the MathJax cookie). I was about to open the notebook in Firefox to see what error might be raised, as suggested in this thread; oddly, just opening Firefox on its own (without even entering the URL) seemed to cause my ipynb to load in Chrome.Quality
profhoff, I downvoted because restarting the kernel is only available on a working notebook! "nbstripout" is not part of jupyter but is available on PyPI (and through pip).Leopold
For me, Jupyter Notebook is seriously slow in Chrome, while the Edge browser is fine. Maybe I have too many bookmarks and too much history in Chrome?!Wellfound
I confirm that clearing output from the menu worked for an extreme 0.5 GB notebook produced by some stress tests (it reduced it to 200 kB; no nbstripout was needed). Make sure you save it before closing, or you're back to square one :)Softhearted
You just saved me.Anceline
  1. conda install -c conda-forge nbstripout

  2. nbstripout filename.ipynb. Make sure that there is no whitespace in the filename.

Trehalose answered 2/1, 2021 at 18:43 Comment(3)
Make sure that there are NO white spaces in the filename.Trehalose
Just for clarity, the nbstripout filename.ipynb command must be typed at the Anaconda prompt. And the filename cannot have spaces, as pointed out by @Anas.Beachlamar
what does this solution do?Peluso

I was having the same problem with Jupyter Notebook. My recommendations to you are as follows:

First, check the size of the .ipynb file you are trying to open. It is probably several MB. One common cause is a cell output in which you previously displayed every row of a dataset.

For example, to inspect a dataset I sometimes use pd.set_option('display.max_rows', None) instead of the .head() function, so that all rows of the data set are shown. Large outputs like that increase the file size and make the notebook slower. Try deleting such outputs.

I think this will solve your problem.

Bedraggled answered 10/12, 2020 at 16:27 Comment(1)
Your first recommendation solved the issue for me. Indeed I had two cells where I ran functions with high verbosity, and the notebook reached 44 MB. I didn't realize it would have this effect. Thank you!Giddens

Restarting your kernel will not help here. Instead, use nbstripout to strip the output from the command line: nbstripout FILE.ipynb. Install nbstripout if you don't have it: https://pypi.org/project/nbstripout/

Lavona answered 9/7, 2020 at 3:49 Comment(0)

It happened to me when I decided to print a matrix 100000 times. The notebook file grew to 150 MB and Jupyter (in Chrome) could not open it: it showed all the symptoms you describe and then the page died saying it was "OutOfMemory".

I solved the issue by opening it in Visual Studio Code, which has a "Clear All Output" button; after saving, the notebook was back to a few hundred KB and I could open it normally.

If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit on Linux or Notepad++ on Windows) and delete the output cells by hand. This is trickier, since you have to pay close attention to what you are deleting, otherwise the notebook will stop working.
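Since a notebook is plain JSON, the manual deletion described above can also be done with a short script, which avoids the risk of breaking the file in a text editor (the function and filenames below are just for illustration):

```python
import json

def clear_outputs(src, dst):
    """Strip all stored cell outputs from a notebook file (.ipynb is JSON)."""
    with open(src) as f:
        nb = json.load(f)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []           # drop the saved output
            cell["execution_count"] = None
    with open(dst, "w") as f:
        json.dump(nb, f, indent=1)

# Hypothetical usage:
# clear_outputs("big_notebook.ipynb", "big_notebook.stripped.ipynb")
```

This does essentially what the "Clear All Output" button or nbstripout does, with only the standard library.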

Furnary answered 26/10, 2021 at 8:18 Comment(0)

I got the same issue: the file size had grown to 150 MB and the page showed "Out Of Memory". I solved the problem using PyCharm. Steps:

  1. Download the file from Jupyter Notebook and open it in the PyCharm editor in "Light Edit Mode".
  2. In "Light Edit Mode" you can pick out the code cells more easily than in Notepad++.
  3. Copy, paste, and save the code into a new file. The file will now be much smaller, and you can load and run the program in Jupyter Notebook again.
Macaronic answered 7/12, 2023 at 12:28 Comment(0)
