How do I determine the size of an object in Python?

1046

How do I get the size occupied in memory by an object in Python?

Lovage answered 16/1, 2009 at 5:7 Comment(0)
936

Just use the sys.getsizeof function from the sys module.

sys.getsizeof(object[, default]):

Return the size of an object in bytes. The object can be any type of object. All built-in objects will return correct results, but this does not have to hold true for third-party extensions as it is implementation specific.

Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to.

The default argument allows you to define a value that will be returned if the object type does not provide means to retrieve its size; otherwise a TypeError is raised.
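
For illustration, here is a minimal sketch with a made-up class whose __sizeof__ raises TypeError, standing in for a third-party type that cannot report its size; with a default supplied, getsizeof returns it instead of raising:

>>> import sys
>>> class Opaque:
...     def __sizeof__(self):
...         raise TypeError('size unavailable')  # mimics a misbehaving extension type
...
>>> sys.getsizeof(Opaque(), -1)  # the default is returned instead of a TypeError
-1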

getsizeof calls the object’s __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector.

See recursive sizeof recipe for an example of using getsizeof() recursively to find the size of containers and all their contents.

Usage example, in Python 3.0:

>>> import sys
>>> x = 2
>>> sys.getsizeof(x)
24
>>> sys.getsizeof(sys.getsizeof)
32
>>> sys.getsizeof('this')
38
>>> sys.getsizeof('this also')
48

If you are on Python < 2.6 and don't have sys.getsizeof, you can use this extensive module instead, though I have never used it.

Ootid answered 16/1, 2009 at 10:42 Comment(7)
Please add to the disclaimer that it will not hold true for nested objects or nested dicts or dicts in lists etc.Lockwood
Umm... sys.getsizeof(c) is 32 even when the class instance has 50 attributes! Something seems off! Try this: d = {k: v for k, v in zip('ABCDEFGHIJKLMNOPQRSTUVWXYabcdefghijklmnopqrstuvwxy', range(50))} class C(object): def __init__(self, **kwargs): _ = {setattr(self, k, v) for k, v in kwargs.items()} c = C(**d) sys.getsizeof(d) 1676 sys.getsizeof(c) 32Severn
@Severn that's because every object only uses 32 bytes!! The rest are references to other objects. If you want to account for the referenced objects you have to define __sizeof__ method for your class. The built-in dict python class does define it, that's why you get the correct result when using object of type dict.Ootid
The disclaimer and exceptions to this working cover almost all use cases making the getsizeof function of little value out of the box.Nickeliferous
why is the integer 2 stored in 24 bytes?Agustinaah
@SaherAhwal it is not just an integer, but a full object with methods, attributes, addresses...Ootid
For complex objects, it is not right.Committal
623

How do I determine the size of an object in Python?

The answer, "Just use sys.getsizeof", is not a complete answer.

That answer works for builtin objects directly, but it does not account for what those objects may contain: specifically, what container types such as custom objects, tuples, lists, dicts, and sets hold. They can contain instances of each other, as well as numbers, strings, and other objects.

A More Complete Answer

Using 64-bit Python 3.6 from the Anaconda distribution, with sys.getsizeof, I have determined the minimum size of the following objects. Note that sets and dicts preallocate space, so empty ones don't grow until a set number of items is exceeded (thresholds may vary by implementation of the language):

Python 3:

Empty
Bytes  type        scaling notes
28     int         +4 bytes about every 30 powers of 2
37     bytes       +1 byte per additional byte
49     str         +1-4 per additional character (depending on max width)
48     tuple       +8 per additional item
64     list        +8 for each additional
224    set         5th increases to 736; 21st, 2272; 85th, 8416; 341st, 32992
240    dict        6th increases to 368; 22nd, 1184; 43rd, 2280; 86th, 4704; 171st, 9320
136    func def    does not include default args and other attrs
1056   class def   no slots 
56     class inst  has a __dict__ attr, same scaling as dict above
888    class def   with slots
16     __slots__   seems to store in mutable tuple-like structure
                   first slot grows to 48, and so on.

How do you interpret this? Say you have a set with 10 items in it, where each item is 100 bytes. How big is the whole data structure? The set itself is 736 bytes because it has sized up once, to 736. Then you add the size of the items, so that's 1736 bytes in total.
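
Checking the container part against the table above (these exact numbers assume the same 64-bit CPython 3.6 build, where small ints are 28 bytes each):

>>> import sys
>>> s = set(range(10))
>>> sys.getsizeof(s)                                     # the container alone
736
>>> sys.getsizeof(s) + sum(sys.getsizeof(i) for i in s)  # plus the contained ints
1016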

Some caveats for function and class definitions:

Note each class definition has a proxy __dict__ (48 bytes) structure for class attrs. Each slot has a descriptor (like a property) in the class definition.

Slotted instances start out with 48 bytes on their first element, and increase by 8 each additional. Only empty slotted objects have 16 bytes, and an instance with no data makes very little sense.
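
To take these measurements yourself (exact values vary by build), compare a dict-backed instance against a slotted one:

import sys

class Plain:
    pass

class Slotted:
    __slots__ = ('a', 'b')

p, s = Plain(), Slotted()
print(sys.getsizeof(p), sys.getsizeof(p.__dict__))  # instance, plus its separate __dict__
print(sys.getsizeof(s))                             # no __dict__; slot storage is inline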

Also, each function definition has code objects, maybe docstrings, and other possible attributes, even a __dict__.

Also note that we use sys.getsizeof() because we care about the marginal space usage, which includes the garbage collection overhead for the object, from the docs:

getsizeof() calls the object’s __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector.

Also note that resizing lists (e.g. repetitively appending to them) causes them to preallocate space, similarly to sets and dicts. From the CPython listobject.c source code:

    /* This over-allocates proportional to the list size, making room
     * for additional growth.  The over-allocation is mild, but is
     * enough to give linear-time amortized behavior over a long
     * sequence of appends() in the presence of a poorly-performing
     * system realloc().
     * The growth pattern is:  0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
     * Note: new_allocated won't overflow because the largest possible value
     *       is PY_SSIZE_T_MAX * (9 / 8) + 6 which always fits in a size_t.
     */
    new_allocated = (size_t)newsize + (newsize >> 3) + (newsize < 9 ? 3 : 6);
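
You can watch this over-allocation from Python (a small sketch; the printed byte counts depend on your build, but the jump pattern follows the comment above):

import sys

lst = []
prev = sys.getsizeof(lst)
for i in range(20):
    lst.append(i)
    cur = sys.getsizeof(lst)
    if cur != prev:  # size only jumps at the reallocation points
        print(f'len {len(lst)}: {prev} -> {cur} bytes')
        prev = cur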

Historical data

Python 2.7 analysis, confirmed with guppy.hpy and sys.getsizeof:

Bytes  type        empty + scaling notes
24     int         NA
28     long        NA
37     str         + 1 byte per additional character
52     unicode     + 4 bytes per additional character
56     tuple       + 8 bytes per additional item
72     list        + 32 for first, 8 for each additional
232    set         sixth item increases to 744; 22nd, 2280; 86th, 8424
280    dict        sixth item increases to 1048; 22nd, 3352; 86th, 12568 *
120    func def    does not include default args and other attrs
64     class inst  has a __dict__ attr, same scaling as dict above
16     __slots__   class with slots has no dict, seems to store in 
                    mutable tuple-like structure.
904    class def   has a proxy __dict__ structure for class attrs
104    old class   makes sense, less stuff, has real dict though.

Note that dictionaries (but not sets) got a more compact representation in Python 3.6.

Eight bytes per additional item makes a lot of sense on a 64-bit machine: each 8 bytes is a pointer to the place in memory where the contained item lives. The 4 bytes per character are the fixed width for unicode in Python 2, if I recall correctly; in Python 3, str becomes a unicode whose per-character width equals the max width of its characters.
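
That per-character scaling is easy to confirm (a quick check on 64-bit CPython 3.3+; the deltas are the point, not the absolute sizes):

>>> import sys
>>> sys.getsizeof('aaaa') - sys.getsizeof('aaa')   # 1 byte per extra Latin-1 char
1
>>> sys.getsizeof('😀😀') - sys.getsizeof('😀')      # 4 bytes once any char needs full width
4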

And for more on slots, see this answer.

A More Complete Function

We want a function that searches the elements in lists, tuples, sets, dicts, obj.__dict__'s, and obj.__slots__, as well as other things we may not have yet thought of.

We want to rely on gc.get_referents to do this search because it works at the C level (making it very fast). The downside is that get_referents can return redundant members, so we need to ensure we don't double count.

Classes, modules, and functions are singletons - they exist one time in memory. We're not so interested in their size, as there's not much we can do about them - they're a part of the program. So we'll avoid counting them if they happen to be referenced.

We're going to use a blacklist of types so we don't include the entire program in our size count.

import sys
from types import ModuleType, FunctionType
from gc import get_referents

# Custom objects know their class.
# Function objects seem to know way too much, including modules.
# Exclude modules as well.
BLACKLIST = type, ModuleType, FunctionType


def getsize(obj):
    """sum size of object & members."""
    if isinstance(obj, BLACKLIST):
        raise TypeError('getsize() does not take argument of type: '+ str(type(obj)))
    seen_ids = set()
    size = 0
    objects = [obj]
    while objects:
        need_referents = []
        for obj in objects:
            if not isinstance(obj, BLACKLIST) and id(obj) not in seen_ids:
                seen_ids.add(id(obj))
                size += sys.getsizeof(obj)
                need_referents.append(obj)
        objects = get_referents(*need_referents)
    return size
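
Usage is straightforward; the totals it reports depend on your interpreter build:

d = {'a': [1, 2, 3], 'b': 'text'}
print(getsize(d))  # dict + keys + values + list items, each counted once by id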

To contrast this with the whitelisted function that follows: most objects know how to traverse themselves for the purposes of garbage collection, which is approximately what we're looking for when we want to know how expensive certain objects are in memory (this functionality is what gc.get_referents uses). However, this measure is going to be much more expansive in scope than we intended if we are not careful.

For example, functions know quite a lot about the modules they are created in.

Another point of contrast is that strings that are keys in dictionaries are usually interned so they are not duplicated. Checking for id(key) will also allow us to avoid counting duplicates, which we do in the next section. The blacklist solution skips counting keys that are strings altogether.

Whitelisted Types, Recursive visitor

To cover most of these types myself, instead of relying on the gc module, I wrote this recursive function to try to estimate the size of most Python objects, including most builtins, types in the collections module, and custom types (slotted and otherwise).

This sort of function gives much more fine-grained control over the types we're going to count for memory usage, but has the danger of leaving important types out:

import sys
from numbers import Number
from collections import deque
from collections.abc import Set, Mapping


ZERO_DEPTH_BASES = (str, bytes, Number, range, bytearray)


def getsize(obj_0):
    """Recursively iterate to sum size of object & members."""
    _seen_ids = set()
    def inner(obj):
        obj_id = id(obj)
        if obj_id in _seen_ids:
            return 0
        _seen_ids.add(obj_id)
        size = sys.getsizeof(obj)
        if isinstance(obj, ZERO_DEPTH_BASES):
            pass # bypass remaining control flow and return
        elif isinstance(obj, (tuple, list, Set, deque)):
            size += sum(inner(i) for i in obj)
        elif isinstance(obj, Mapping) or hasattr(obj, 'items'):
            size += sum(inner(k) + inner(v) for k, v in getattr(obj, 'items')())
        # Check for custom object instances - may subclass above too
        if hasattr(obj, '__dict__'):
            size += inner(vars(obj))
        if hasattr(obj, '__slots__'): # can have __slots__ with __dict__
            size += sum(inner(getattr(obj, s)) for s in obj.__slots__ if hasattr(obj, s))
        return size
    return inner(obj_0)

And I tested it rather casually (I should unittest it):

>>> getsize(['a', tuple('bcd'), Foo()])
344
>>> getsize(Foo())
16
>>> getsize(tuple('bcd'))
194
>>> getsize(['a', tuple('bcd'), Foo(), {'foo': 'bar', 'baz': 'bar'}])
752
>>> getsize({'foo': 'bar', 'baz': 'bar'})
400
>>> getsize({})
280
>>> getsize({'foo':'bar'})
360
>>> getsize('foo')
40
>>> class Bar():
...     def baz():
...         pass
>>> getsize(Bar())
352
>>> getsize(Bar().__dict__)
280
>>> sys.getsizeof(Bar())
72
>>> getsize(Bar.__dict__)
872
>>> sys.getsizeof(Bar.__dict__)
280

This implementation breaks down on class definitions and function definitions because we don't go after all of their attributes, but since they should only exist once in memory for the process, their size really doesn't matter too much.

Booker answered 19/5, 2015 at 4:26 Comment(3)
Any custom object implemented in C that doesn't properly implement __sizeof__ will not work with sys.getsizeof, and this is not well-documented because it is considered an implementation detail (see bugs.python.org/issue15436). Don't expect this function to cover everything - modify it as needed to best suit your use-cases.Booker
strs that contain non-ASCII characters have more overhead, for example, sys.getsizeof('я') is 76 and sys.getsizeof('😀') is 80.Classify
I bet the difference in 2 bytes is because the terminating null also takes 4 bytes in the second caseClassify
177

The Pympler package's asizeof module can do this.

Use as follows:

from pympler import asizeof
asizeof.asizeof(my_object)

Unlike sys.getsizeof, it works for your self-created objects. It even works with numpy.

>>> asizeof.asizeof(tuple('bcd'))
200
>>> asizeof.asizeof({'foo': 'bar', 'baz': 'bar'})
400
>>> asizeof.asizeof({})
280
>>> asizeof.asizeof({'foo':'bar'})
360
>>> asizeof.asizeof('foo')
40
>>> asizeof.asizeof(Bar())
352
>>> asizeof.asizeof(Bar().__dict__)
280
>>> A = rand(10)
>>> B = rand(10000)
>>> asizeof.asizeof(A)
176
>>> asizeof.asizeof(B)
80096

And if you need another view on live data, Pympler's

module muppy is used for on-line monitoring of a Python application and module Class Tracker provides off-line analysis of the lifetime of selected Python objects.
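
A minimal sketch of muppy's summary view (muppy.get_objects, summary.summarize, and summary.print_ are part of Pympler's documented API):

from pympler import muppy, summary

all_objects = muppy.get_objects()               # snapshot of objects currently alive
summary.print_(summary.summarize(all_objects))  # per-type counts and total sizes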

Wien answered 10/11, 2015 at 14:1 Comment(2)
@serv-inc: Let me know if you need me to test it, because otherwise my program does not appear to have a memory leak so there is no urgent need for me.Sperry
@ihavenoidea: bytes (just imagine every python object taking 280 kbytes)Wien
89

For numpy arrays, getsizeof doesn't work - for me it always returns 40 for some reason:

from pylab import *
from sys import getsizeof
A = rand(10)
B = rand(10000)

Then (in ipython):

In [64]: getsizeof(A)
Out[64]: 40

In [65]: getsizeof(B)
Out[65]: 40

Happily, though:

In [66]: A.nbytes
Out[66]: 80

In [67]: B.nbytes
Out[67]: 80000
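
nbytes counts only the data buffer, and for a view it reports the view's extent, not the base array's allocation (on recent NumPy versions, getsizeof also gives sensible results for arrays that own their data, as noted in the comments):

import numpy as np

A = np.random.rand(10)
print(A.nbytes)              # 80: 10 elements x 8 bytes (float64)
print(A.itemsize * A.size)   # the same figure, spelled out

view = A[2:4]
print(view.nbytes)           # 16: the view's extent, while A's buffer stays alive
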
Repulsion answered 30/7, 2010 at 16:33 Comment(4)
>All built-in objects will return correct results, but this does not have to hold true for third-party extensions as it is implementation specific. docs.python.org/library/sys.html#sys.getsizeofPrecautious
"If you are using a numpy array (docs.scipy.org/doc/numpy/reference/arrays.ndarray.html) then you can use the attribute 'ndarray.nbytes' to evaluate its size in memory." https://mcmap.net/q/18892/-how-can-i-check-the-memory-usage-of-objects-in-ipythonOmari
I would guess 40 bytes is correct, however getsizeof() only gives you the size of the object (the header of the array), not of the data inside. Same for python containers where sys.getsizeof([1,2,4]) == sys.getsizeof([1,123**456,4]) == 48, while sys.getsizeof(123**456) = 436Gandhiism
It appears the getsizeof() function was changed at some point to return the expected value.Laurilaurianne
86

You can serialize the object to derive a measure that is closely related to the size of the object:

import pickle

## let o be the object whose size you want to measure
size_estimate = len(pickle.dumps(o))

If you want to measure objects that cannot be pickled (e.g. because of lambda expressions), dill or cloudpickle can be a solution.
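
For example, a lambda cannot be pickled by the standard library, but dill's dumps mirrors pickle.dumps (a small sketch, assuming dill is installed):

import dill

f = lambda x: x + 1
print(len(dill.dumps(f)))  # serialized length as a rough size proxy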

Incriminate answered 7/12, 2019 at 16:25 Comment(4)
I find this simplest and most useful, especially because I care about Python object size the most when I need to serialize it (for multiprocess etc)Handler
Does not work when a numpy slice is hogging memory. Like in import numpy as np; a = np.arange(100000000); b = a[2:4]; del a; len(pickle.dumps(b)) # 150, but the array is 100MB or more depending on the dtypeBittersweet
Another case when this does not work: TypeError: cannot pickle '_thread.lock' object -- will try dill/cloudpickle as suggested!Outpouring
What is the unit of measure here? bytes?Introduce
36

Use sys.getsizeof() if you DON'T want to include sizes of linked (nested) objects.

However, if you want to count sub-objects nested in lists, dicts, sets, tuples - and usually THIS is what you're looking for - use the recursive deep sizeof() function as shown below:

import sys
def sizeof(obj):
    size = sys.getsizeof(obj)
    if isinstance(obj, dict): return size + sum(map(sizeof, obj.keys())) + sum(map(sizeof, obj.values()))
    if isinstance(obj, (list, tuple, set, frozenset)): return size + sum(map(sizeof, obj))
    return size
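
For example (the exact totals are build-specific):

data = {'a': [1, 2, 3], 'b': ('x', 'y')}
print(sizeof(data))          # container + keys + values, recursing into nested containers
print(sys.getsizeof(data))   # the dict shell alone, for comparison

Note that this simple version counts a shared object once per reference and does not descend into custom objects' attributes.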

You can also find this function in the nifty toolbox, together with many other useful one-liners:

https://github.com/mwojnars/nifty/blob/master/util.py

Harleigh answered 21/11, 2019 at 16:24 Comment(1)
Does not work when a numpy slice is hogging memory. Like in import numpy as np; a = np.arange(100000000); b = a[2:4]; del a; len(pickle.dumps(b)) # 150, but the array is 100MB or more depending on the dtypeBittersweet
29

Python 3.8 (Q1 2019) will change some of the results of sys.getsizeof, as announced here by Raymond Hettinger:

Python containers are 8 bytes smaller on 64-bit builds.

tuple ()   48 -> 40
list  []   64 -> 56
set()     224 -> 216
dict  {}  240 -> 232

This comes after issue 33597 and Inada Naoki (methane)'s work on Compact PyGC_Head, and PR 7043.

This idea reduces PyGC_Head size to two words.

Currently, PyGC_Head takes three words: gc_prev, gc_next, and gc_refcnt.

  • gc_refcnt is used when collecting, for trial deletion.
  • gc_prev is used for tracking and untracking.

So if we can avoid tracking/untracking while trial deletion, gc_prev and gc_refcnt can share same memory space.

See commit d5c875b:

Removed one Py_ssize_t member from PyGC_Head.
All GC tracked objects (e.g. tuple, list, dict) size is reduced 4 or 8 bytes.
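
You can verify this on your own interpreter; a 64-bit CPython 3.8 build reports the smaller figures from the table above (a 3.7 build reports 64 and 224 here):

>>> import sys
>>> sys.getsizeof([]), sys.getsizeof(set())
(56, 216)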

Vacant answered 19/2, 2019 at 17:4 Comment(0)
16

This can be more complicated than it looks depending on how you want to count things. For instance, if you have a list of ints, do you want the size of the list containing the references to the ints (i.e. the list only, not what is contained in it)? Or do you want to include the actual data pointed to? In that case you need to deal with duplicate references and prevent double-counting when two objects contain references to the same object.

You may want to take a look at one of the Python memory profilers, such as pysizer, to see if they meet your needs.

Statics answered 16/1, 2009 at 13:0 Comment(0)
14

Having run into this problem many times myself, I wrote up a small function (inspired by @aaron-hall's answer), plus tests, that does what I would have expected sys.getsizeof to do:

https://github.com/bosswissam/pysize

If you're interested in the backstory, here it is.

EDIT: Attaching the code below for easy reference. To see the most up-to-date code, please check the github link.

import sys

def get_size(obj, seen=None):
    """Recursively finds size of objects"""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    # Important mark as seen *before* entering recursion to gracefully handle
    # self-referential objects
    seen.add(obj_id)
    if isinstance(obj, dict):
        size += sum([get_size(v, seen) for v in obj.values()])
        size += sum([get_size(k, seen) for k in obj.keys()])
    elif hasattr(obj, '__dict__'):
        size += get_size(obj.__dict__, seen)
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum([get_size(i, seen) for i in obj])
    return size
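
For example, with a self-referential structure, which the seen set handles gracefully (the reported total depends on your build):

class Node:
    def __init__(self):
        self.children = []

a, b = Node(), Node()
a.children.append(b)
b.children.append(a)   # reference cycle
print(get_size(a))     # terminates; each object counted exactly once
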
Rhapsodist answered 21/7, 2016 at 22:21 Comment(1)
Crashes with "TypeError: 'Int64Index' object is not callable" on pd.SeriesMaas
7

Here is a quick script I wrote, based on the previous answers, to list the sizes of all variables:

import sys

for i in dir():
    print(i, sys.getsizeof(eval(i)))
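
A variant that avoids eval and sorts by size (a sketch over globals(); swap in whatever namespace you care about):

import sys

for name, value in sorted(globals().items(), key=lambda kv: -sys.getsizeof(kv[1])):
    print(name, sys.getsizeof(value))
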
Barrera answered 4/3, 2013 at 17:34 Comment(2)
It is not wrong, it is ambiguous. sys.getsizeof will always return the value needed, so there is no need to lose performance with try..except.Premium
oh, that's a good point and I didn't think about it - the code in the form it is right now just shows how it was chronologically written - first I knew about numpy (hence nbytes), then I looked up a more generic solution. Thank you for the explanation _/\_Barrera
6

Use the following function to get the actual size of a Python object:

import sys
import gc

def actualsize(input_obj):
    memory_size = 0
    ids = set()
    objects = [input_obj]
    while objects:
        new = []
        for obj in objects:
            if id(obj) not in ids:
                ids.add(id(obj))
                memory_size += sys.getsizeof(obj)
                new.append(obj)
        objects = gc.get_referents(*new)
    return memory_size

actualsize([1, 2, [3, 4, 5, 1]])

Reference: https://towardsdatascience.com/the-strange-size-of-python-objects-in-memory-ce87bdfbb97f

Fuge answered 31/7, 2021 at 9:45 Comment(5)
This seemed to give a more meaningful answer for class instances than other answers. However, with an array of class instances, this reported almost the same size for a single item as for all items -- not sure why.Breeks
actualsize() for just the simplest NamedTuple you can think of gives 19+ MB(!). Any idea what the function is counting here?Lennielenno
Can you give an example NamedTupleFuge
@AmanGupta from collections import namedtuple; nt = namedtuple("nt", ["a", "b"]); print(f"{actualsize(nt(3, 'Hello')):,}") # 19,264,817 seems to count the module code, too...Lennielenno
It probably goes quite deep into references of all libraries used etc. I got also insane sizes for simple objects. actualsize(namedtuple("a", "a b c")(1, 2, 3)) or actualsize(Path())Portemonnaie
5

If you don't need the exact size of the object, but roughly want to know how big it is, one quick (and dirty) way is to let the program run, sleep for an extended period of time, and check the memory usage (e.g. in Mac's Activity Monitor) of this particular Python process. This is effective when you are trying to find the size of one single large object in a Python process.

For example, I recently wanted to check the memory usage of a new data structure and compare it with that of Python's set data structure. First I wrote the elements (words from a large public-domain book) to a set, then checked the size of the process, and then did the same thing with the other data structure. I found that the Python process with the set was taking twice as much memory as the new data structure. Again, you wouldn't be able to say exactly that the memory used by the process equals the size of the object. As the object gets large, the estimate gets close, because the memory consumed by the rest of the process becomes negligible compared to the size of the object you are trying to monitor.
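
The same check can be scripted with psutil instead of eyeballing Activity Monitor (a rough sketch; RSS covers the whole process, so treat the delta as an estimate):

import os
import psutil

proc = psutil.Process(os.getpid())
before = proc.memory_info().rss

big = {str(i): i for i in range(1_000_000)}   # the object under test

after = proc.memory_info().rss
print(f'approx. {after - before:,} bytes')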

Adulterant answered 10/4, 2019 at 20:58 Comment(2)
The question asks how to do it in python, not just finding memory usage of python objects, and using a Mac's activity monitor or any other similar software isn't programmatically using python. That being said, checking memory usage of python processes in this way generally is a good way to make sure nothing has gone wrong...Devolution
@TomWyllie, Thanks, but downvoting this answer carries the negative connotation that the answer itself is wrong and accomplishes nothing. The method I mention might not be implemented in Python, but it is a handy way to get a rough estimate of a size of a Python object. I knew I am not answering the exact question, however, the method could be useful for someone else to get a similar result.Phonsa
4

If performance is not an issue, the easiest solution is to pickle and measure:

import pickle

data = ...
len(pickle.dumps(data))
Empale answered 24/12, 2021 at 13:51 Comment(3)
does this work ? why not any upvote to this ?Maudiemaudlin
@Maudiemaudlin - Why no upvotes? Because this solution was already posted two years prior. Therefore the original answer is (rightfully) getting the votes.Asiaasian
usually I upvote every solution that works. Shouldnt I ?Maudiemaudlin
2

I use this trick... It may not be accurate for small objects, but I think it's much more accurate for a complex object (like a pygame surface) than sys.getsizeof().

import pygame as pg
import os
import psutil
import time


process = psutil.Process(os.getpid())
pg.init()    
vocab = ['hello', 'me', 'you', 'she', 'he', 'they', 'we',
         'should', 'why?', 'necessarily', 'do', 'that']

font = pg.font.SysFont("monospace", 100, True)

dct = {}

newMem = process.memory_info().rss  # don't mind this line
Str = f'store ' + f'Nothing \tsurface use about '.expandtabs(15) + \
      f'0\t bytes'.expandtabs(9)  # don't mind this assignment too

usedMem = process.memory_info().rss

for word in vocab:
    dct[word] = font.render(word, True, pg.Color("#000000"))

    time.sleep(0.1)  # wait a moment

    # get total used memory of this script:
    newMem = process.memory_info().rss
    Str = f'store ' + f'{word}\tsurface use about '.expandtabs(15) + \
          f'{newMem - usedMem}\t bytes'.expandtabs(9)

    print(Str)
    usedMem = newMem

On my Windows 10, Python 3.7.3, the output is:

store hello          surface use about 225280    bytes
store me             surface use about 61440     bytes
store you            surface use about 94208     bytes
store she            surface use about 81920     bytes
store he             surface use about 53248     bytes
store they           surface use about 114688    bytes
store we             surface use about 57344     bytes
store should         surface use about 172032    bytes
store why?           surface use about 110592    bytes
store necessarily    surface use about 311296    bytes
store do             surface use about 57344     bytes
store that           surface use about 110592    bytes
Acquittal answered 21/4, 2020 at 5:11 Comment(0)
1

This might not be the most relevant answer, but I was interested only in object storage and retrieval. So dumping the object as a pickle and checking the pickle's size was sufficient.

Serafina answered 15/10, 2022 at 7:16 Comment(0)
0
import io
import torch

def get_size(obj):
    buffer = io.BytesIO()
    torch.save(obj, buffer)
    # Return the length of the serialized byte stream; sys.getsizeof(buffer)
    # would measure the BytesIO object itself (including allocation overhead),
    # not the exact payload size.
    return buffer.getbuffer().nbytes


# Let's test by creating some unusual object

obj = []
import types
import numpy
for i in range(5):
    namespace = types.SimpleNamespace()
    namespace.text = 'hi stack overflow'
    namespace.array= numpy.arange(100000)
    namespace.torchy=torch.randn((2,5,6,7,7,3,4))
    obj.append(namespace)


print(get_size(obj))
Thaliathalidomide answered 20/2 at 12:31 Comment(0)
-1

You can make use of sys.getsizeof() as shown below to determine the size of an object:

import sys
str1 = "one"
int_element=5
print("Memory size of '"+str1+"' = "+str(sys.getsizeof(str1))+ " bytes")
print("Memory size of '"+ str(int_element)+"' = "+str(sys.getsizeof(int_element))+ " bytes")
Confute answered 17/11, 2019 at 12:52 Comment(0)
