Overhead of using std::vector?

I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?

To give a crude example, suppose I had to store an array of n integers, where, say, n <= 16. I could implement it using

int* data = new int[n]; //assuming n is set beforehand

or using a vector:

std::vector<int> data;

Is it absolutely always a better idea to use a std::vector or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?

Enrollee answered 8/3, 2013 at 12:35 Comment(2)
You don't have to push_back. std::vector<int>(n) is almost equivalent to your dynamic array version, except that the n integers are value-initialized (hence zero-initialized) in the vector.Nicknickel
@juanchopanza: Fair point. I removed the push_back part. It wasn't supposed to be part of the comparison.Enrollee

It is always better to use std::vector/std::array, at least until you can conclusively prove (through profiling) that the T* a = new T[100]; solution is considerably faster in your specific situation. This is unlikely to happen: vector/array is an extremely thin layer around a plain old array. There is some overhead to bounds checking with vector::at, but you can circumvent that by using operator[].
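
To illustrate the at/operator[] point, here is a minimal sketch (the size and values are arbitrary):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v(16);        // 16 zero-initialized ints, one allocation
    v[3] = 42;                     // operator[]: unchecked, same cost as a raw array
    std::cout << v.at(3) << '\n';  // at(): bounds-checked, throws std::out_of_range on a bad index
    // v.at(99);                   // would throw; v[99] would be undefined behavior
    return 0;
}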

Springhead answered 8/3, 2013 at 12:40 Comment(4)
The usual reason for using C style arrays has nothing to do with speed; it's for static initialization, and for the compiler to determine the size according to the number of initializers. (Which, of course, never applies to dynamically allocated arrays).Confucius
@James If I'm reading your comment correctly, you are objecting to the fact that I seem to be bashing C-style arrays without saying that I mean dynamically allocated ones? If so, I have edited my answer regarding this. (Also, +1 to your answer.)Springhead
That clears it up. I didn't know that vector/array is a thin layer. I kinda assumed that with all the functionality, it must have a significant overhead.Enrollee
You said "It is always...until...solution is considerably faster". I didn't read it as being restricted to dynamic allocation. (As I said in my answer, I have never used an array new. Before std::vector and std::string, the first thing one did was to write something equivalent.) But while I never use array new, there are cases where C style arrays are justified (some, but not all of which can be replaced by std::array in C++11).Confucius

I can't think of any case where dynamically allocating a C style array makes sense. (I've been working in C++ for over 25 years, and I've yet to use new[].) Usually, if I know the size up front, I'll use something like:

std::vector<int> data( n );

to get an already sized vector, rather than using push_back.

Of course, if n is very small and is known at compile time, I'll use std::array (if I have access to C++11), or even a C style array, and just create the object on the stack, with no dynamic allocation. (Such cases seem to be rare in the code I work on; small fixed size arrays tend to be members of classes, where I do occasionally use a C style array.)
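
A sketch of the compile-time-size case; the size 16 here is just an assumed example:

#include <array>
#include <cstddef>

int main() {
    constexpr std::size_t n = 16;  // size known at compile time (example value)
    std::array<int, n> data{};     // lives on the stack, zero-initialized, no dynamic allocation
    data[0] = 1;
    return 0;
}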

Confucius answered 8/3, 2013 at 12:58 Comment(0)

If you know the size in advance (especially at compile time), and don't need the dynamic re-sizing abilities of std::vector, then using something simpler is fine.

However, that something should preferably be std::array if you have C++11, or something like boost::scoped_array otherwise.

I doubt there'll be much efficiency gain unless it significantly reduces code size or something, but it's more expressive, which is worthwhile anyway.
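
For reference, a minimal sketch of the pre-C++11 option mentioned above; boost::scoped_array simply takes care of the delete[] for you:

#include <cstddef>
#include <boost/scoped_array.hpp>

void use_buffer(std::size_t n) {                  // hypothetical function, for illustration only
    boost::scoped_array<int> data(new int[n]());  // value-initialized buffer
    data[0] = 1;                                  // indexed access like a raw array
}                                                 // delete[] runs here, even if an exception is thrown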

Inapprehensible answered 8/3, 2013 at 12:38 Comment(0)

You should try to avoid C-style arrays in C++ whenever possible. The STL provides containers which suffice for almost every need. Just imagine having to reallocate an array, or to delete elements out of its middle. The container shields you from handling this yourself, which is quite error-prone unless you have done it a hundred times before.
An exception, of course, is if you are addressing low-level issues where STL containers cannot be used.
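
As an illustration of the "deleting elements out of its middle" point, the container does the shuffling for you (values are arbitrary):

#include <vector>

int main() {
    std::vector<int> v{1, 2, 3, 4, 5};
    v.erase(v.begin() + 2);  // removes the 3; the later elements are shifted down automatically
    v.push_back(6);          // any reallocation needed for growth is handled for you as well
    return 0;
}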

There has already been some discussion about this topic; see here on SO.

Verdi answered 8/3, 2013 at 12:41 Comment(1)
+1 for the link at the end; that should destroy, once and for all, the myth that accessing vector elements is somehow slow.Springhead

Is it absolutely always a better idea to use a std::vector or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?

Call me a simpleton, but 99.9999...% of the time I would just use a standard container. The default choice should be std::vector, but std::deque<> could also be a reasonable option sometimes. If the size is known at compile time, opt for std::array<>, which is a lightweight, safe wrapper around C-style arrays that introduces zero overhead.

Standard containers expose member functions to specify the initial reserved amount of memory, so you won't have trouble with reallocations, and you won't have to remember to delete[] your array. I honestly do not see why one should use manual memory management.

Efficiency shouldn't be an issue, since you have throwing and non-throwing member functions to access the contained elements, so you have a choice whether to favor safety or performance.
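
A sketch of the up-front reservation idea (the count 16 is an arbitrary example):

#include <vector>

int main() {
    std::vector<int> data;
    data.reserve(16);           // one allocation up front; size() is still 0
    for (int i = 0; i < 16; ++i)
        data.push_back(i);      // no reallocation until the reserved capacity is exceeded
    return 0;
}                               // memory is released automatically, no delete[] needed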

Revanchism answered 8/3, 2013 at 12:42 Comment(0)

std::vector can be constructed with a size_type parameter that creates the vector with the specified number of elements in a single dynamic allocation (the same as your array), and you can also use reserve to reduce the number of reallocations over its lifetime.

Teriteria answered 8/3, 2013 at 12:43 Comment(0)

If n is known at compile-time, then you should choose std::array, as:

std::array<int, n> data; //n is compile-time constant

and if n is not known at compile-time, OR the array might grow at runtime, then go for std::vector:

std::vector<int> data(n); //n may be known at runtime 

Or in some cases, you may also prefer std::deque, which is faster than std::vector in some scenarios.
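
For example, one common case where std::deque wins is cheap insertion at the front (a sketch with arbitrary values):

#include <deque>

int main() {
    std::deque<int> d;
    d.push_back(1);   // amortized O(1) at the back, like vector
    d.push_front(0);  // O(1) at the front; a vector would have to shift every element
    return 0;
}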

Hope that helps.

Volny answered 8/3, 2013 at 12:42 Comment(1)
Unless you know that n is very, very small, you probably shouldn't declare local variables as std::array. Unless there is some very specific reason for doing otherwise, I'd just use std::vector---if I know the size, I'll initialize the vector with the correct size. (This also supposes that the type has a default constructor.)Confucius

From the perspective of someone who often works with low-level code in C++, std vectors are really just helper methods with a safety net around a classic C-style array. The only overheads you'd realistically experience are memory allocations and safety checks on boundaries. If you're writing a program which needs performance and you're going to use vectors as a regular array, I'd recommend just using C-style arrays instead of vectors. Realistically, you should vet the data that comes into the application and check the boundaries yourself, to avoid checks on every memory access to the array.
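
A sketch of that "validate once at the boundary, then index without per-access checks" idea (the function and its names are just illustrative):

#include <cstddef>
#include <stdexcept>
#include <vector>

// Hypothetical helper: validate the range once at the boundary...
void double_range(std::vector<int>& data, std::size_t first, std::size_t count) {
    if (first > data.size() || count > data.size() - first)
        throw std::out_of_range("bad range");  // single check up front
    // ...then the hot loop uses unchecked operator[] instead of at()
    for (std::size_t i = 0; i < count; ++i)
        data[first + i] *= 2;
}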

It's good to see that others are checking the differences between the C ways and the C++ ways. More often than not, C++ standard methods have significantly worse performance and uglier syntax than their C counterparts, which is generally the reason people call C++ bloated. I think C++ focuses more on safety and on making the language more like JavaScript/C#, even though the language fundamentally lacks the foundation to be one.

Implant answered 11/7, 2022 at 21:32 Comment(0)
