How to get PyTorch's memory stats on CPU / main memory?
Sometimes you need to know how much memory your program needs at its peak, without caring much about when exactly that peak occurs or how long it lasts. PyTorch has a nice tool for reporting memory usage when running on a GPU, which you only have to call once at the end of the program:

# Peak number of bytes allocated on the GPU since the start of the program
# (or since the last reset); call once at the end of the run.
memory_usage = torch.cuda.memory_stats()["allocated_bytes.all.peak"]
torch.cuda.reset_peak_memory_stats()

This is extremely convenient, because it relieves you from running a separate thread that polls your memory every millisecond and records the peak.
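For reference, the kind of manual polling this saves you from would look roughly like the sketch below. It is only an illustration of the workaround, not something PyTorch provides; psutil and the millisecond polling interval are assumptions, since any way of reading the process's resident set size would do.

import threading
import time

import psutil  # assumption: psutil is used to read the process RSS


def track_peak_rss(stop_event, result, interval=0.001):
    """Poll this process's resident set size until stop_event is set."""
    process = psutil.Process()
    peak = 0
    while not stop_event.is_set():
        peak = max(peak, process.memory_info().rss)
        time.sleep(interval)
    result["peak_rss_bytes"] = peak


stop_event = threading.Event()
result = {}
watcher = threading.Thread(target=track_peak_rss, args=(stop_event, result))
watcher.start()

# ... run the actual workload here ...

stop_event.set()
watcher.join()
print(f"peak RSS: {result['peak_rss_bytes'] / 1e6:.1f} MB")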

Now my question is: why does this only work for the GPU? I couldn't find anything like torch.cpu.memory_stats(). What is the equivalent when running on a CPU?

Chalet answered 19/2, 2022 at 20:23
For this you want to use the PyTorch Profiler, which gives you details on both CPU time and memory consumption (a minimal sketch follows the links below).

For more details:

https://pytorch.org/blog/introducing-pytorch-profiler-the-new-and-improved-performance-tool/

https://pytorch.org/tutorials/recipes/recipes/profiler_recipe.html
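A minimal sketch of the profiler-based approach from the recipe linked above, using profile_memory=True to record CPU allocations; `model` and `inputs` here are placeholders for your own module and data:

import torch
from torch.profiler import profile, ProfilerActivity

model = torch.nn.Linear(1024, 1024)   # placeholder workload
inputs = torch.randn(64, 1024)

# Profile CPU activity and record memory allocations per operator.
with profile(activities=[ProfilerActivity.CPU], profile_memory=True) as prof:
    model(inputs)

# Table sorted by how much CPU memory each operator allocated itself.
print(prof.key_averages().table(sort_by="self_cpu_memory_usage", row_limit=10))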

Linoleum answered 6/10, 2022 at 9:22
