Time complexity of delete[] operator [duplicate]
What is the Time Complexity of the delete[] operator?

I mean, how is it implemented - does it iterate over all the elements in the array and call the destructor for every element?

Does this operator do the same for primitive types (int, etc.) and user defined types?

Salvatore answered 12/1, 2014 at 16:27 Comment(8)
Awesome question. Love it.Doctor
Voted for reopening! 1. The question has more upvotes than the 'duplicate' 2. @BasileStarynkevitch's answer cites standard documentation and is more sound (IMHO) than the answers in the 'duplicate'Placeman
@πάνταῥεῖ then flag to have the answers merged. Reopening this one and trying to get the other closed is too cumbersome.Humfried
@πάνταῥεῖ I don't agree. Sorry. The number of votes or the "quality" of the answers don't change the rules. Answers should be merged with the dupe question as per standard procedure.Denationalize
@πάνταῥεῖ: The accepted answer to the other question has the advantage of actually correctly answering the question, a virtue not shared by Basile's answer (which makes some true comments and dodges the complexity).Moll
For primitive types this is the same as free()'s complexity. As for other types, add to this the cost of each destructor. So to improve delete[] efficiency be sure to: 1) optimize your destructors, avoid costly operations and try to factor them out of the class if many instances get deleted. 2) allocate space wisely to prevent memory fragmentation; if you allocate objects[512] and then objects[2], when freeing objects[512] you will get an overhead if you need to allocate objects[513] (this is speculation, it depends on free()'s behaviour, but you get my point)Aliber
@Jefffrey Sorry, I didn't know that such standard procedure exists to merge answers on dupes, and how to initiate it.Placeman
This can't possibly be answered without specifying a particular runtime and perhaps even compiler.Snick

::operator delete[] is documented on cplusplus.com (which is sometimes frowned upon) as:

operator delete[] can be called explicitly as a regular function, but in C++, delete[] is an operator with a very specific behavior: An expression with the delete[] operator, first calls the appropriate destructors for each element in the array (if these are of a class type), and then calls function operator delete[] (i.e., this function) to release the storage.

so the destructor is called n times (once for each element), and then the memory freeing "function" is called once.

Notice that each destruction might take a different time (or even have a different complexity) than the others. Generally most destructions are quick and have the same complexity... but that won't be the case if each destroyed element is a complex tree, node, or graph...

For primitive types like int, the fictitious destructor is a no-op. The compiler will probably optimize that away (if asked to).

You should check the real C++11 standard, or at least its late n3337 working draft, which says (thanks to Matteo Italia for pointing this out in a comment) in §5.3.5.6, page 110 of n3337:

If the value of the operand of the delete-expression is not a null pointer value, the delete-expression will invoke the destructor (if any) for the object or the elements of the array being deleted. In the case of an array, the elements will be destroyed in order of decreasing address (that is, in reverse order of the completion of their constructor; see 12.6.2)

If you use -and trust enough- GCC 4.8 or better, you can use the g++ compiler with the -fdump-tree-phiopt or -fdump-tree-all options (beware, they dump a lot of files!), or the MELT plugin, to query the intermediate Gimple representation of some example. Or use -S -fverbose-asm to get the assembler code. You will also want to add optimization flags like -O1 or -O2 ...

NB: IMHO, cppreference.com is also an interesting site about C++; see its page about delete (as commented by Cubbi)

Bristow answered 12/1, 2014 at 16:27 Comment(10)
+1, but although that quotation is probably accurate, people will surely point out that cplusplus.com is both non-normative and often error-ridden, you should probably replace it with one from the actual standard.Ilocano
I can't in a few minutes find the exact sentence inside C++11 n3337 ; I'm sure it is there (just a lack of time & motivation from my part)Bristow
It's at §5.3 ¶6: "[...] the delete-expression will invoke the destructor (if any) for [...] the elements of the array being deleted. In the case of an array, the elements will be destroyed in order of decreasing address (that is, in reverse order of the completion of their constructor; see 12.6.2)."Ilocano
@MatteoItalia: Thanks, improved answer (and cited your help). To be picky it is in §5.3.5.6Bristow
-1 for a cplusplus.com reference.Nefen
@Abyx: why, since I also cited the standard (which is more precise, but probably less understandable for a newbie).Bristow
well, that "is documented on cplusplus.com" just made my hand instantly move the mouse pointer to the "downvote" button, because I maybe could tolerate a reference to this infamous site, but saying that it documents something - that's a bit too much. Also all that "very specific behavior" milk-and-water is so... cplusplus. I can't see how it can be useful in this answer, especially when you quote the Standard.Nefen
the cppreference page for delete-expression is en.cppreference.com/w/cpp/language/deleteCardona
@Cubbi: Thanks. Improved answer.Bristow
what is the time complexity of delete for very fragmented memory in common implementations (gcc or visual c)? Assume I did 100000 new and 90000 delete; is it dependent on fragmentation like new? I don't see answers to this... Also, can I use it for hard real-time applications? Is it defined in the C++ standard or implementation-dependent?Incongruent

The implementation of delete and delete[] is composed of two phases:

  1. recursive call to destructors (if any)
  2. memory deallocation for deleted object

Setting aside the chain of calls to destructors, whose complexity is essentially governed by you, we are left with considering how the memory is freed.

The second point is not covered by the C++ specification. So, any compiler suite/OS is free to adopt its own strategy.

A common memory allocation/deallocation strategy is allocating a whole memory page when needed from the OS, then at each new/new[], returning a chunk of the appropriate size, whose length and attributes are then stored inside the page as a header/footer. A corresponding delete/delete[] can be as simple as marking that same chunk as "free", which is clearly O(1).

If the complexity of memory deallocation is O(1), then the complexity of a delete is essentially governed by the calls to destructors. The default implementation does (almost) nothing, and is O(1) for a single call, thus an overall O(n), where n is the total number of calls (e.g. if the object being destructed has two fields whose destructors are called, then n = 1 (object) + 2 (fields) = 3).

Putting all the pieces together: you can arbitrarily increase the complexity by performing operations in the destructor (which you can write), but you cannot "perform better"¹ than O(n) (with n defined as in the previous paragraph). The formally correct way to state this is: "the complexity of delete is Ω(n)".


¹ allow me to be a bit informal on this point

Acridine answered 12/1, 2014 at 16:34 Comment(5)
I believe that the C++ specification does say that the destructor of every element should be called.Bristow
@BasileStarynkevitch But this does not apply to fundamental types, in which case O(1) is very likelyTella
@BasileStarynkevitch Yes, I agree. I was more focusing on the part which is not governed by the user itself. Anyway, I fixed the answer.Acridine
I think that the varying complexity of destructors is the key here and should really be highlighted.Moll
@BenVoigt I expanded the answer to cover that point :)Acridine

For class types, the theoretical complexity is O(n). The destructor is called for each element. Of course, it's up to the implementation to adhere to the observable behavior, so if the destructor is a no-op or the behavior is the same as with just marking the whole chunk as freed, the complexity could be just O(1).

For primitive types, the compiler will likely just release the whole chunk of memory at once, thus the complexity O(1).

Astatine answered 12/1, 2014 at 16:37 Comment(3)
Note: given that the complexity of free can vary widely depending on the implementation (compaction of chunks? thread-caching?), it is difficult to interpret releasing a block of memory as O(1) without specifying what we are counting...Seepage
I suppose you meant "types with trivial/non-trivial destructors" rather than primitive/class types. But the complexity of a non-trivial destructor can be very large, as Stefano points out in his answer.Moll
Nope. The complexity of the individual destructors or the underlying heap can dominate.Snick

What is the Time Complexity of the delete[] operator?

The amount of time required is of course implementation-defined. However, the operator applies only to the pointer to the 1D array and thus it's O(1).

I mean how is it implemented - does it iterate over all the elements in the array and calls destructor for every element?

Yes.
Provided that it's called only on the exact pointer that was assigned memory created using new[]. For primitive types there are no user-defined destructors.

Does this operator do the same for primitive types (int, etc.) and user defined types?

The comparison is not fair, because primitive types don't have user-defined destructors and they cannot be polymorphic.
For example, delete[] on a polymorphic class is undefined behavior, i.e.

Base* p1 = new Derived, *p2 = new Derived[2];
delete p1; // ok
delete[] p2;  // bad
Cayuga answered 12/1, 2014 at 18:14 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.