How can I release memory after creating matplotlib figures
I have several matplotlib functions rolled into some django-celery tasks.

Every time the tasks are called, more RAM is dedicated to Python. Before too long, Python is taking up all of the RAM.

QUESTION: How can I release this memory?

UPDATE 2 - A Second Solution:

I asked a similar question specifically about the memory locked up when matplotlib errors, and I got a good answer to this question: .clf(), .close(), and gc.collect() aren't needed if you use multiprocessing to run the plotting function in a separate process, whose memory is automatically freed once the process ends.

Matplotlib errors result in a memory leak. How can I free up that memory?
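A rough sketch of that approach, where make_plot and the output file name are just placeholders rather than code from the linked answer:

import multiprocessing as mp
import numpy as np


def make_plot(path):
    # Import matplotlib inside the worker so all of its state lives in the child process
    import matplotlib
    matplotlib.use('Agg')
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots(figsize=(10, 7))
    ax.plot(np.arange(1000000), np.random.randn(1000000))
    fig.savefig(path)


if __name__ == '__main__':
    # Everything the child process allocated is returned to the OS when it exits
    p = mp.Process(target=make_plot, args=('plot.png',))
    p.start()
    p.join()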

UPDATE - The Solution:

These Stack Overflow posts suggested that I can release the memory used by matplotlib objects with the following commands:

.clf(): Matplotlib runs out of memory when plotting in a loop

.close(): Python matplotlib: memory not being released when specifying figure size

import gc
gc.collect()

Here is the example I used to test the solution:

import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
from pylab import figure, savefig
import numpy as np
import gc

a = np.arange(1000000)
b = np.random.randn(1000000)

fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
fig.set_size_inches(10, 7)
ax = fig.add_subplot(111)
ax.plot(a, b)

fig.clf()       # clear the figure's axes and artists
plt.close()     # close the current figure and remove it from pyplot's registry
del a, b        # drop the references to the large arrays
gc.collect()    # force collection of anything left in reference cycles
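In my setup the same cleanup ends up inside the task itself. A rough sketch of that wrapping (the task name render_plot, the shared_task decorator, and the output path are illustrative, not the actual task):

from celery import shared_task
import gc

import matplotlib
matplotlib.use('Agg')   # non-interactive backend, no GUI figure manager holding references
import matplotlib.pyplot as plt
import numpy as np


@shared_task
def render_plot(path):
    """Hypothetical task body: plot, save, then release everything before returning."""
    a = np.arange(1000000)
    b = np.random.randn(1000000)

    fig = plt.figure(dpi=100, facecolor='w', edgecolor='w')
    fig.set_size_inches(10, 7)
    ax = fig.add_subplot(111)
    ax.plot(a, b)
    fig.savefig(path)

    fig.clf()          # drop the axes and artists held by the figure
    plt.close(fig)     # deregister the figure from pyplot's internal state
    del a, b
    gc.collect()       # collect anything left behind in reference cycles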
Simonides answered 18/8, 2011 at 1:28 Comment(1)
In my use case, del a, b; gc.collect(); were not necessary to avoid the memory leak. – Reparation

Did you try running your task function several times (in a for loop) to make sure it isn't your function that is leaking, regardless of Celery? Also make sure that django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).
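For example, something like this rough sketch, where plot_task is a placeholder for your real task body (the resource module is Unix-only):

import resource


def plot_task():
    ...  # placeholder for your actual plotting code


for i in range(100):
    plot_task()
    # Peak resident set size so far (KB on Linux); if it keeps climbing on
    # every iteration, the leak is in plot_task itself, not in celery.
    print(i, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)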

Phyl answered 18/8, 2011 at 1:40 Comment(1)
Good suggestions. DEBUG was set to False. I ran the function several times in a for loop, and it seems the memory leak is associated with my code. Previously I had this function as a Django view, but the HTTP request time was too long, so I moved it over to a django-celery task. I think Django has good memory cleanup for all HTTP requests, which I no longer benefit from now that it is a task. Unfortunately, I don't see where the memory leak is coming from. I try to delete the variables I declared in the function to free up that memory when I am done, but this does not seem to have any effect. – Simonides
import matplotlib.pyplot as plt
from datetime import datetime
import gc

class MyClass:
    def plotmanytimesandsave(self):
        plt.plot([1, 2, 3])
        ro2 = datetime.now()
        f = ro2.second
        name = str(f) + ".jpg"
        plt.savefig(name)
        plt.draw()
        plt.clf()            # clear the current figure
        plt.close("all")     # close every figure pyplot is tracking


for y in range(1, 10):
    k = MyClass()
    k.plotmanytimesandsave()
    del k
    k = "now our class object is a string"
    print(k)
    del k
    gc.collect()             # note the parentheses: gc.collect without them does nothing

With this program you can save figures directly, as many times as you want, without calling plt.show(), and the memory consumption stays low.

Halland answered 23/7, 2020 at 19:58 Comment(0)