Are delete and delete[] equivalent for basic data types?

So during a code review, a colleague of mine wrote double* d = new double[foo]; and then called delete d;. I told them that they should change it to delete [] d;. They stated that the compiler doesn't need the brackets for basic data types. I disagreed.

So I thought I'd prove my point with an experiment:

#include <iostream>

#define NUM_VALUES 10000
int main(int argc, char** argv)
{
  int i = 0;
  while (true)
  {
    std::cout << i << "|";
    double* d = new double[NUM_VALUES];
    std::cout << ((void*)d) << std::endl;
    for (int j = 0; j < NUM_VALUES; j++)
      d[j] = j;

    delete d; // deliberately mismatched: the allocation used new[]
    i++;
  }

  return 0;
}

Not only does the memory usage not grow, but d is allocated at the same address every time! (Visual Studio 2010). Is this a quirk of the Visual Studio compiler, or is it part of the standard?

Guilbert answered 21/11, 2013 at 16:42 Comment(3)
Your colleague is wrong and he should feel bad.Landahl
possible duplicate of delete vs delete[]Rollet
possible duplicate of Is delete[] equal to delete?Lorenzen

If you use new, you need to use delete.

If you use new [], you have to use delete [].

They mean different things. double* d = new double(); means allocate and construct a single double. delete d; means destroy and deallocate that single double. double* d = new double[NUM_VALUES]; means allocate NUM_VALUES doubles, and delete [] d; means deallocate the whole array of NUM_VALUES doubles.
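
A minimal sketch of the matched pairs (the size and values here are just examples):

#include <cstddef>

int main()
{
  // Single object: new pairs with delete.
  double* single = new double(3.14);
  delete single;

  // Array: new [] pairs with delete [].
  const std::size_t count = 10;
  double* array = new double[count];
  delete [] array;

  return 0;
}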

Rollet answered 21/11, 2013 at 16:43 Comment(6)
I agree with you Andre, not disputing that. However, that doesn't explain why my experiment doesn't explode. Is it because the behavior isn't defined?Guilbert
@MadScienceDreams Exactly. delete-ing a new[]-ed object causes undefined behavior.Landahl
Ah so this is a compiler quirk and should NEVER be relied upon. Thanks.Guilbert
@MadScienceDreams, Undefined means undefined. The compiler is free to output a program that does this when you run it in the morning and something else in the afternoon. You would not believe the absolutely bizarre things compilers will come up with in the name of optimization when behavior is "undefined".Deth
Follow up question: they asked about the underlying implementation: malloc(foo) is always matched by free(foo), regardless of size. Why should delete and delete [] differ? (I didn't have an answer, but I did get them to admit I was right, wooo I-told-you-so moments)Guilbert
@MadScienceDreams: They're different to allow implementations - and, perhaps more importantly, user-defined replacements - to optimise them separately. A single-object allocator can be significantly more efficient than one which needs to deal with arbitrarily sized allocations.Wingfooted

I told them that they should change it to delete [] d

You were right. Using the wrong form of delete gives undefined behaviour.

Is this a quirk of the visual studio compiler? Or is this part of the standard?

It's quite likely that an implementation will get the memory from the same place for both "single object" and "array" allocations, in which case using the wrong form of delete for trivial types will appear to work. It's also quite likely that freeing memory and then allocating the same amount will reuse the same memory. None of this is guaranteed by the standard, though.
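
You can observe the reuse with plain malloc/free; a small sketch (nothing about the printed addresses is guaranteed, they merely tend to coincide on common allocators):

#include <cstdlib>
#include <iostream>

int main()
{
  void* a = std::malloc(10000 * sizeof(double));
  std::cout << a << std::endl;
  std::free(a);
  void* b = std::malloc(10000 * sizeof(double)); // same size, freshly freed block
  std::cout << b << std::endl;                   // often the same address as a
  std::free(b);
  return 0;
}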

Wingfooted answered 21/11, 2013 at 16:48 Comment(0)

You are absolutely right in your disagreement.

As it has been stated many times, new/delete and new[]/delete[] are two separate, completely independent ways of dynamic memory management. They cannot be mixed (as in using delete for memory allocated by new[]), regardless of what type they are being used with.

These two mechanisms can easily be physically different, i.e. they can use completely different memory layouts, completely different memory pools and even work with completely different kinds of system-level memory.

And that's before even mentioning that, at the language level, the raw-memory allocation functions used by these mechanisms (the operator new() family and the rest) are independently replaceable, which means that even if thoughtlessly mixing these operators appears to "work" for basic types in your implementation, it can easily be broken by a raw-memory allocation replacement.
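
A hypothetical sketch of such a replacement, not taken from any real implementation: here the array forms shift the returned pointer past a private header, so passing an array allocation to the single-object delete hands the allocator an address it never gave out:

#include <cstdlib>
#include <new>

// Hypothetical replacement of the global array forms only.
void* operator new[](std::size_t size)
{
  void* raw = std::malloc(size + 16); // reserve a 16-byte private header
  if (!raw) throw std::bad_alloc();
  return static_cast<char*>(raw) + 16;
}

void operator delete[](void* p) noexcept
{
  if (p) std::free(static_cast<char*>(p) - 16); // undo the offset
}

int main()
{
  double* d = new double[10];
  delete [] d; // correct: the offset is undone before free
  // delete d; // would call free() on an address malloc never returned
  return 0;
}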

Marybethmaryellen answered 21/11, 2013 at 16:51 Comment(0)

Firstly, the accepted answer to this question, delete vs delete[], quotes the standard, saying that the behavior is undefined. This should satisfy your coworker. There is more discussion here: Is delete[] equal to delete?

If he is not convinced, you may remind him that the global new/delete, as well as the global new[]/delete[], can be replaced as pairs, so even if the new[]/delete mismatch (for basic types) does not crash in the VS2010 implementation, there is absolutely no guarantee that another implementation will not fail.

For example, we have replaced the global new[]/delete[] in debug mode to hide 'end of array' markers that let us verify usage. We certainly expect delete[] to be called on arrays of doubles created with new[] for this to work.

However, because I am an old-time C++er, I know the source of your coworker's confusion. Think of the simplest implementation of new/delete and new[]/delete[]: C's malloc/free as the backend and direct calls to the constructors/destructors. It is easy to see that in such a simple implementation, as the original C++ implementations were, delete and delete[] were equivalent for types without destructors. A certain folklore has built up around this, which may be the source of your coworker's statement, but it does not hold up in reality and in fact never did.
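
A sketch of what that folklore amounts to; the function names are illustrative stand-ins for what an early compiler might have emitted for a destructor-less type like double:

#include <cstdlib>

double* lower_new_array(std::size_t n)   // new double[n]
{
  return static_cast<double*>(std::malloc(n * sizeof(double))); // no count stored
}

void lower_delete_array(double* p)       // delete [] p
{
  std::free(p); // no destructors to run
}

void lower_delete_single(double* p)      // delete p
{
  std::free(p); // identical in effect, hence the folklore
}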

Niello answered 21/11, 2013 at 16:56 Comment(1)
Good find. That answer explains it very well.Rollet

For reference, the reason it appears to work is that in your implementation:

  • They both get their memory from the same source
  • As an optimization, for double (and presumably some other trivially destructible types), new[] doesn't spend any extra memory storing the requested size of the array. For types with destructors, delete[] needs to know how many objects to destroy, so that count must be stored somewhere.

The fact that this is how a particular implementation behaves is not a good reason to rely on that behavior.

The main reason that the code is likely to go wrong on some implementation, somewhere, is if new[] allocates space for the array plus space for its size, and returns a pointer to the array. delete[] then would subtract the space used for the size before freeing the allocation, whereas delete would not. Your colleague has chosen to bet on never encountering an implementation that does this with new double[].
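
A sketch of that scheme, sometimes called an 'array cookie'; the layout and types here are illustrative, not what any particular compiler does:

#include <cstdlib>
#include <new>

// Layout: [ count ][ element 0 ][ element 1 ] ...
void* allocate_array(std::size_t n, std::size_t elem_size) // what new [] might do
{
  void* raw = std::malloc(sizeof(std::size_t) + n * elem_size);
  if (!raw) throw std::bad_alloc();
  *static_cast<std::size_t*>(raw) = n;       // store the count up front
  return static_cast<std::size_t*>(raw) + 1; // return a pointer past it
}

void deallocate_array(void* p) // what delete [] must then do
{
  std::free(static_cast<std::size_t*>(p) - 1); // step back over the count
}

// A plain delete would free p itself -- an address malloc never returned.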

This count cannot always be deduced by the C++ implementation from the value returned by non-standard functions like malloc_size(), since that returns the size of the block used to satisfy the allocation, not the size requested. So in general one should expect new[] and delete[] to use some trick or other, and it's purely a matter of implementation quality which types they can avoid it for.

Even if the implementation is smart enough to avoid the need for extra space with an array of double, it might have a debugging mode that deliberately detects the error anyway.
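
Modern toolchains do exactly that. For instance, building with AddressSanitizer (Clang or GCC) flags the mismatch at runtime; roughly, with the exact wording and numbers varying by version:

$ clang++ -g -fsanitize=address mismatch.cpp && ./a.out
==12345==ERROR: AddressSanitizer: alloc-dealloc-mismatch (operator new [] vs operator delete)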

Tenno answered 21/11, 2013 at 17:00 Comment(0)
